• Location: Chicago, Illinois
  • Salary: $125,000 - $135,000 per annum
  • Technology: Microsoft Azure
  • Job type: Permanent
  • Date posted: 30th Sep, 2021
  • Reference: NPD DO 9/30/21

Job Title: Senior Big Data Engineer
Job Type: Permanent
Location: Remote
Salary: $125K-$135K + Benefits

MUST HAVE: The Sr. Big Data Engineer must have strong hands-on technical skills and be able to mentor and train other engineers in conventional ETL and SQL as well as programming and data science languages such as Python and R, using big data techniques.


The Sr. Big Data Engineer will be the senior person responsible for big data engineering, data wrangling, data analysis and user support, primarily focused on the Cloudera Hadoop platform but in future extending to the cloud. The role will also help define and implement the Big Data Strategy for the organisation and drive implementation of IT solutions for the business.

Required Qualifications:

  • 7 years of hands-on experience with big data

  • Bachelor's or advanced degree in a technology field or equivalent experience

  • Strong problem-solving skills with an ability to isolate, deconstruct and resolve complex data engineering challenges

  • Strong object-oriented programming skills

  • Ability to take complex data and communicate in a manner that is understandable to all audiences

  • Understands the evolving landscape of technology and its effect on clients and data products

  • Maximises profitable growth by seeking efficiency in systems and processes

  • Delivers optimal solutions to meet client and company needs

  • Ability to communicate effectively with wide range of audiences, both verbally and in writing

  • Demonstrated success in partnering with cross-functional departments and teams to achieve business objectives

  • Ability to effectively network with colleagues to share knowledge and gain new perspectives

  • Strong communication (verbal and written) and client service skills, with effective interpersonal and presentation skills applicable to a wide audience including senior/executive management, clients, and peers

  • Responds well to a bottom-up management approach, supporting a culture of creativity and innovation

  • Demonstrated understanding of the evolving landscape of technology

  • Can effectively communicate business strategy and objectives

  • Able to strike an effective balance between focus on strategic priorities and near-term operational priorities

Job Duties:

  • Proactively analyse the business needs, profile large data sets and build custom data models and applications to drive business decision making and customer experience

  • Build workflows that empower analysts to efficiently use data

  • Develop and extend design patterns, processes, standards, frameworks, and reusable components for various data engineering functional areas

  • Perform requirements analysis, planning and forecasting for Hadoop data engineering/ingestion projects

  • Design optimised Hadoop and big data solutions for data ingestion, data processing, data wrangling, and data delivery

  • Design, develop, and tune data products, streaming applications, and integrations on large-scale data platforms (Hadoop, Kafka Streaming, HANA, SQL Server, data warehousing, big data, etc.) with an emphasis on performance, reliability, scalability, and above all quality

  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.

  • Build the infrastructure required for efficient extraction, transformation, and loading of data from a wide variety of data sources.

  • Build data tools for analytics and data scientist team members that assist them in building and optimising our product into an innovative industry leader

  • Develop custom data models and algorithms

  • Identify opportunities for data acquisition

  • Peer review of the code developed by team members

  • Propose recommendations to streamline processes for efficiency and effectiveness

  • Strong execution of solutions to meet client and company needs

  • Work in multi-functional agile teams to continuously experiment, iterate and execute on data-driven product objectives

  • Identify and resolve day-to-day issues to ensure continuous improvement

  • Network with colleagues to share knowledge and gain new perspectives (Management/IC track)

Technical Competencies (Continued):

  • Knowledge of Programming/Scripting Languages including experience in Python/Java/Scala; Must have experience in object-oriented programming concepts and Shell Scripting

  • Knowledge of Design Patterns and related skills including Statistics, Data Visualization and Microservices preferred; Must have experience in REST APIs, Data Modeling and Performance Tuning

  • Knowledge of RDBMS/Databases including experience in ANSI SQL; Experience in Oracle/MySQL/Sybase; Knowledge of MongoDB (Object Stores) and Snowflake

  • Experience in Job Scheduling including Control-M, crontab, Airflow or Autosys

  • Experience with Libraries including Python Pandas and ETL libraries in Python or Java

  • Must have experience in Hadoop/Big Data including HDFS, Spark, Hive/Spark SQL, Sqoop and Data Warehousing; Knowledge of HBase, Phoenix, Datameer (or other analytical tools), Power BI (or other reporting tools, e.g., Tableau) and SAS/Dataflex preferred

  • Experience in Cloud software including Azure/AWS/Google Cloud; Must have experience in Data Factory or other Cloud ETL tools and Cloud Data Lake (Storage); Knowledge of Databricks; Knowledge of Azure Batch, Delta Lake, Azure HDInsight, Cosmos DB, Azure EventHub and Synapse preferred

  • Knowledge of Backoffice Full Stack including Servlet, Spring Framework, HTML, JavaScript, CSS, React and HighCharts preferred

  • Must have experience in OS software including Linux/UNIX

  • Must have experience in Project Management/Agile including SVN/Git/GitHub; Knowledge of Azure DevOps and CI/CD preferred

  • Knowledge of Tools including experience in IDEs (IntelliJ/Eclipse/PyCharm/Visual Studio) and DBeaver/SQL Developer or other SQL development tools; Knowledge of Hortonworks, Cloudera (Ambari) and Trifacta preferred


Benefits:

  • Health/Dental/Vision

  • PTO

  • Advancement opportunities with structured career paths

  • Work with some of the world's most successful brands and retailers

  • Remote

Please email d.orihuela@nigelfrank.com for immediate consideration!