Derek Scott

Senior Data Engineer - ETL/Hadoop

4/30/2019


Description

A Senior Data Engineer is responsible for developing data warehouses, pipelines, and infrastructure to drive intelligent applications and products in the healthcare industry as part of Humana’s Boston Experience Center. The Senior Data Engineer will work closely with the data science and product teams while working in a big data environment on Cloud Platforms (including Google Cloud Platform, Microsoft Azure, and Hadoop).


The Experience Center is transforming Humana into a human-centered, tech-driven company. We are accomplishing this by developing experience-first healthcare software rapidly, flexibly and iteratively. We are a technology incubator for the company and are expanding our capabilities by integrating Data Engineering and Data Science into the balanced team model. The Senior Data Engineer will work alongside data scientists, software developers, designers, and product managers to identify, validate, and adopt new technologies, tools, and practices, and turn these into human-centered experiences for Humana members.


The Senior Data Engineer will be responsible for:

  • Designing solutions and supporting their installation, customization, and integration.
  • Constructing new data warehouse platforms, primarily on cloud infrastructure, and developing pipelines and ETL processes to support products and solutions.
  • Productionizing data engineering solutions, including models, pipelines, and infrastructure, to deploy them to customers.
  • Developing and documenting strategies, policies, and best practices, and preparing technical and implementation design documents where needed.
  • Working alongside product management to ensure all project requirements are met.
  • Providing project team leadership.

Required Qualifications

  • Hands-on experience implementing large-scale, high-performance data platforms, and the ability to deliver onsite proofs of concept for distributed databases.
  • Experience working with, optimizing, and tuning the performance of common databases such as MySQL, PostgreSQL, and NoSQL stores.
  • Experience working with Cloud environments and databases (GCP, Azure, AWS).
  • Thorough understanding of big data and execution experience with Enterprise Data Warehouses (EDWs) and knowledge of BI/DW solutions.
  • Experience building ETL processes using industry-standard ETL tools. Familiarity with data-oriented scripting languages such as shell, Python, R, Java, Scala, or Perl.
  • Familiarity with big data file formats (Avro, Parquet, ORC, etc.).
  • Experience constructing and using data pipelining and orchestration tools (Apache Airflow, Apache NiFi, Pentaho, etc.).
  • Experience productionizing data models and pipelines.
  • Experience working with multiple operating systems, Linux/Unix in particular.
  • Behavioral competencies:
    • Positive, winning attitude.
    • Passion for continual learning.
    • Excellent track record of customer responsiveness.
    • Presents themselves well in customer settings.
    • Operates with a sense of urgency.
    • Upholds honesty and integrity.
  • Ability to work in a collaborative environment:
    • Data Engineers will be members of and work on a Balanced Team (including Product Managers, Designers, Software Engineers, and Data Scientists).
    • Products are developed in an extreme programming environment.
    • Code and pipelines should adhere to Test-Driven Development (TDD).
    • Engineers typically work in a pair programming environment.
  • Familiarity with data modeling approaches such as ERD and concepts behind relational, normalized and star schema database architectures.
  • Ability to travel as-needed to meet with team members (typically less than 5% travel).


Preferred Qualifications

  • Experience with DevOps tooling (TeamCity, Azure DevOps).
  • Experience working on the Google Cloud Platform (GCP).
  • Experience with message brokers (RabbitMQ, Kafka, Pub/Sub).
  • Experience working in Professional Services / Consulting on high impact, data-oriented projects with smaller, independent project teams.
  • Experience with BI reporting and analytical tools such as Tableau, Business Objects, SAS, or Cognos.
  • VMware or virtualization technology experience.
  • Structured design, coding, and testing experience.


Scheduled Weekly Hours

40 
