Cloud Data Engineer, Copenhagen
My client is looking for someone with several years of experience developing highly scalable data solutions from the ground up in Azure, who can thrive in a fast-paced Agile environment!
My client, one of the best and fastest-growing Azure & Data Analytics companies in Denmark, is currently helping numerous end-users embark on digital transformation journeys from fragmented local BI environments to corporate-wide cloud data hubs.
As a Cloud Data Engineer, you will be responsible for developing the infrastructure that provides insight from raw data.
You will, among other things, architect, design, and implement individual cloud data hub components in close collaboration with your colleagues in the team.
Your primary tasks will include the following:
- Engineering individual cloud data hub components as per solution design
- Developing and maintaining cloud pipelines for data processing, management, and migration
- Writing and maintaining secure, robust, scalable, and efficient code that turns business concepts into tangible solutions
- Ensuring the quality of the solution by implementing manual and automated unit and integration tests
- Responding to technical issues in a professional and timely manner
- Driving engineering best practices such as automation, CI/CD, Infrastructure as Code, and DevOps
- Participating in Agile ceremonies, e.g., sprint planning, backlog refinement, retrospectives, and demos
- Mentoring and coaching your colleagues on the best practices of cloud data engineering
- Setting and maintaining high standards of quality in the team
- Accelerating innovation by driving cloud-native design, the PaaS paradigm, a T-shaped mindset, and end-to-end accountability
The ideal candidate is a data engineer with strong skills in Azure data components.
More specifically, we expect you to have a successful track record in most of the following areas:
- Architecting and developing enterprise-level data platforms in Azure
- Experience with the Azure data ecosystem, e.g., Synapse, Data Factory, Data Lake, Databricks
- Designing and implementing data pipelines, ETL/ELT processes
- Knowledge of SQL and Transact-SQL
- Writing unit and integration tests for stored procedures and data transformation pipelines
- Working knowledge of Microsoft data management tools, including Power BI and PowerPivot
Furthermore, one or more of the following skills will be considered an advantage:
- Knowledge of Data Analysis Expressions (DAX)
- Building CI/CD pipelines using Azure DevOps
- Best practices of working with Git including GitFlow and Pull Requests
- Scripting, e.g., PowerShell
- Event streaming knowledge, e.g., Kappa architecture, Azure Event Hubs / IoT Hub
Interested in hearing more?
Call Eddie on 0045 88 74 11 02 or email Eddie at firstname.lastname@example.org