New ref.: a0MP9000009UvdB.2_1773764114

Freelance Data Engineer

Sweden

  • SEK 800 to SEK 950
  • Role: Consultant
  • Skills: Data Engineering, Data Migration, GCP, Databricks, BigQuery
  • Level: Mid-level

Position description

Job Description: GCP Data Engineer

About the Role

We're looking for a highly skilled GCP Data Engineer to join our growing data function. In this role, you'll design, build, and optimise scalable data pipelines and cloud-based solutions using Google Cloud Platform. You'll work closely with product, analytics, and engineering teams to deliver clean, reliable, high‑quality data that powers decision‑making across the organisation.

This is an exciting opportunity for someone who enjoys end‑to‑end ownership, solving complex data challenges, and shaping modern cloud architectures.

Key Responsibilities

Data Engineering & Pipeline Development

* Design, build, and maintain scalable ETL/ELT pipelines using GCP-native services.
* Develop batch and real‑time data processing solutions with Dataflow and Pub/Sub.
* Implement efficient, reliable workflows via Cloud Composer (Airflow).
* Build and optimise data transformations using Dataform or dbt (optional).

Data Architecture & Modelling

* Design logical and physical data models for analytics and application use‑cases.
* Implement best practices across data warehousing, partitioning, and performance tuning in BigQuery.
* Ensure data quality, observability, and governance across systems.
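On the BigQuery side, partitioning and clustering are the usual levers for the performance tuning mentioned above. A minimal sketch, assuming day-level partitioning on a timestamp column (the table and column names are placeholders, not from the posting):

```python
def partitioned_table_ddl(table: str, schema: dict[str, str],
                          partition_col: str, cluster_cols: list[str]) -> str:
    """Render BigQuery DDL with day partitioning and clustering.

    Names passed in are illustrative; production code would validate them.
    """
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in schema.items())
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )
```

Partitioning on the query's main filter column prunes scanned bytes, which directly controls BigQuery cost.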

Cloud Engineering & Infrastructure

* Deploy scalable and cost‑efficient solutions using GCP services (GCS, Dataproc, Cloud Functions, Cloud Run).
* Use Terraform to manage cloud resources as code.
* Collaborate with DevOps/Platform teams on CI/CD and containerisation (Docker, GKE).
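Managing cloud resources as code, as described above, typically means Terraform definitions along these lines. This is a hedged sketch only; the bucket name, location, and lifecycle policy are placeholder values.

```hcl
# Hypothetical example -- names and values are placeholders.
resource "google_storage_bucket" "raw_events" {
  name          = "example-raw-events"
  location      = "EU"
  force_destroy = false

  # Expire raw landing data after 90 days
  lifecycle_rule {
    condition {
      age = 90
    }
    action {
      type = "Delete"
    }
  }
}
```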

Collaboration & Stakeholder Engagement

* Partner with data analysts, scientists, and cross‑functional teams to understand requirements.
* Translate business needs into robust technical solutions.
* Contribute to ongoing improvements in engineering standards, documentation, and best practices.

Required Skills & Experience

* Strong hands‑on experience with Google Cloud Platform, including:
  * BigQuery
  * Dataflow (Apache Beam)
  * Pub/Sub
  * Cloud Composer
  * Cloud Storage (GCS)
  * Dataproc
* Strong programming skills in Python and SQL.
* Experience building large‑scale ETL/ELT pipelines.
* Solid understanding of data modelling, data warehousing principles, and analytics workflows.
* Experience with Terraform or equivalent IaC tooling.
* Familiarity with CI/CD, Git, Docker, and containerised workloads.
* Strong problem-solving skills and the ability to work with cross-functional stakeholders.