As a Data Engineer on the Data Analytics team, you will play a pivotal role in building and maintaining robust, reliable, and scalable data infrastructure.
This position is critical for ensuring our data infrastructure keeps pace with our analytical needs, especially as Human Interest scales to meet the demands of rapid growth.
You will contribute to improving our data foundation and future-proofing our capabilities for advanced analytics and self-service analytics.
The Data Analytics team currently consists of Data Analysts focused on analytics engineering, complex data analysis, and data science.
This role will be embedded directly within the Data Analytics team, with mentorship ties to the decentralized data engineers who built the analytics infrastructure into what it is today.
You will have the opportunity to mentor analysts who want to work further upstream in the data stack, and to learn from their domain expertise, contributing to a collaborative, growth-oriented environment positioned for strong business impact.
Daily responsibilities include:
Building and optimizing data models in dbt Core.
Designing and maintaining scalable data ingestion and orchestration.
Managing data infrastructure in AWS using Terraform.
Collaborating with Data Analysts and Software Engineers.
Identifying improvements in data orchestration.
Developing new data ingestion pipelines.
Implementing efficient testing within dbt, as illustrated in the sketch below.
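For illustration only, the following is a minimal sketch of the kind of dbt-plus-Airflow orchestration described above, assuming a recent Airflow 2.x deployment and a dbt Core project invoked via its CLI; the DAG name, schedule, and project path are hypothetical and not drawn from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical example: build dbt Core models daily, then run their tests.
with DAG(
    dag_id="dbt_daily_build",        # illustrative name, not from this posting
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # placeholder path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # build models first, then validate them with dbt tests

In practice, the team's own orchestration, naming, and deployment conventions would take precedence over this illustration.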
Requirements:
You must have 3+ years of experience as a Data Engineer with a strong focus on data pipeline development and data warehousing, consistently delivering high-quality work on time.
Strong hands-on experience with data modeling is required, along with knowledge of general design patterns and architectural approaches.
You should have hands-on experience with cloud data warehouses.
Strong Python and SQL skills are necessary, including experience with data manipulation and analysis and the ability to quickly absorb and synthesize complex information.
Experience with data ingestion tools and ETL/ELT processes is required.
You must have experience with Airflow.
A proactive mindset to identify areas for improvement in our data infrastructure is essential.
You should be able to independently define projects and clarify requirements, drawing on mentorship to work through solutions for more complex projects.
Excellent problem-solving skills and attention to detail are required, with a high-level understanding of how downstream users leverage data.
Nice to have: experience with Terraform or other infrastructure-as-code tools, an understanding of data security and governance best practices, experience with dbt, Snowflake, and Meltano, and experience curating data and data pipelines for large language models.
Benefits:
You will receive a great 401(k) plan with a dollar-for-dollar employer match up to 4% of compensation, which is immediately vested, and $0 plan fees.
Top-of-the-line health plans, as well as dental and vision insurance, are provided.
Competitive time off and parental leave are included.
Additional wealth benefits include unlimited access to digital tools, financial professionals, and a knowledge center to support your financial wellness.
Enhanced mental health support for employees and dependents is available through Lyra.
Fertility healthcare and family forming benefits are offered through Carrot.
Candidly provides resources to help you and your family plan for, borrow, and repay student debt.
A monthly work-from-home stipend and quarterly lifestyle stipend are included.
Engaging team-building experiences, ranging from virtual social events to team offsites, promote collaboration and camaraderie.