About the Gig:
You'll help design and build cloud-based data pipelines that power critical business decisions. The client uses Google Cloud to create smart, reliable systems that handle large volumes of data. This is a great opportunity to grow your career while working alongside talented people in a collaborative, fast-paced environment.
Contract length: 6-12 months initially
Roles and Responsibilities:
* Build and maintain scalable data pipelines using GCP tools like BigQuery, Dataflow, Pub/Sub, and Cloud Composer
* Develop and optimize data models for reporting and analytics
* Work closely with data scientists, analysts, and engineers to ensure data quality across systems
* Help improve the overall architecture and reliability of the data platform
Skills and Qualifications:
* 5+ years of experience as a data engineer, with hands-on experience using GCP
* Strong Python and SQL skills, with a focus on performance
* Experience with orchestration tools like Airflow (Cloud Composer) and infrastructure-as-code (e.g., Terraform)
* A good understanding of data security and compliance
* A team player who loves to collaborate and solve problems
Nice to Have:
* Experience with CI/CD, version control (Git), and containerization (Docker, Kubernetes)
* A background in financial services or other regulated industries
* Familiarity with real-time data processing
Why work for them?
* Hybrid work from their Stockholm HQ
* Be part of a tech team making a real impact in the finance industry
* Work with the latest cloud-native tools and tech
* A stable and supportive environment where your growth matters
Sounds like your kind of gig? Send me a DM or apply below.
