HatchWorks is searching for an experienced, data-driven Senior Data Engineer with deep expertise in developing enterprise data solutions that solve mission-critical business needs for our clients. A Senior Data Engineer within HatchWorks will deliver successful projects by providing skilled technical expertise, leveraging strong interpersonal communication skills, and fostering deep collaboration in an Agile software development environment. The Senior Data Engineer will assist in translating business requirements into technical requirements and must have a track record of successful, relevant projects.

We are HatchWorks Technologies

We are innovators, technologists, and builders, all dedicated to creating intelligent, purpose-built software products and solutions that improve the way people work and live. Our solutions drive revenue, market share, operational efficiencies, and, most importantly, delightful user experiences for industry leaders in healthcare, financial services, and communications, to name a few.

Our key differentiator is our product-centric approach that puts the end user first. You will work with user-obsessed experts who always start with “why” before “what” and aspire to build feasible solutions that are viable for our customers' business and valuable for the end user. We focus on outcomes over output and believe in accelerating time to value for our customers in an agile, collaborative manner. The fabric behind all of this is our people, culture, and core values, which hold us all accountable to each other.

Responsibilities:

  • Create and enhance data solutions that enable seamless delivery of data; collect, parse, manage, and analyze large data sets across different domains.
  • Design and develop data pipelines, data ingestion, and ETL processes that are scalable, repeatable, secure, and aligned with stakeholder needs.
  • Build data architecture that supports data management strategies, business intelligence initiatives, and actionable insights for business stakeholders.
  • Develop real-time and batch ETL processes aligned with business needs; manage and augment data pipelines from raw OLTP databases to data solution structures.
  • Support the Agile Scrum team with planning, scoping, and creating technical solutions for new product capabilities, through continuous delivery to production.

Qualifications:

  • 5+ years of experience using Snowflake.
  • 2+ years of Python experience, including handling CSV, JSON, and Parquet files with boto3 and pandas.
  • Experience building data ingestion pipelines.
  • Experience working with PostgreSQL, MySQL, Oracle, and AWS Athena.
  • Knowledge of AWS services such as S3, Lambda, Fargate, Step Functions, SQS, SNS, and CloudWatch.
  • Experience working with Git on the GitLab DevOps platform.
  • Experience with CI/CD pipelines using Jenkins.
  • Strong relational database skills, including the ability to write queries and stored procedures.
  • Data modeling skills, both dimensional and relational.
  • Experience with Agile software delivery methodologies.
  • Ability to read, write, understand, and speak English at B2 level or higher.