GCP Data Architect
Our Client has asked us to staff a team of experienced GCP professionals: 2 x Data Engineers, 2 x Data Architects and 2 x Migration Architects.
Roles & Responsibilities:
· Experience in technical solutions implementation, architecture design, evaluation, and investigation in a cloud environment.
· Experience in architecting, developing, and deploying scalable enterprise data analytics solutions (Enterprise Data Warehouses, Data Marts, etc.)
· Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume)
· Experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
· Hands-on experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on Google Cloud using GCP or third-party services
· Experience in designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
· Experience in architecting and implementing next-generation data and analytics platforms on GCP
· Experience in designing and implementing data engineering, ingestion, and curation functions on GCP using GCP-native services or custom programming
Hit Apply or email me directly at email@example.com (John)