Please let AAPC know you found this job on RemoteYeah. This helps us grow 🌱.
Description:
We are looking for a talented Data Engineer with expertise in AWS to join our data team.
In this role, you will design, build, and maintain robust data pipelines and architectures to support our data-driven decision-making processes.
You will work closely with data analysts and scientists to ensure data integrity and accessibility, optimizing data workflows in a cloud environment.
A strong focus on automated testing is essential to ensure the reliability and performance of our solutions.
Responsibilities:
Understand data architecture and overall business requirements around the data platform.
Collect key data requirements from product/business groups on a regular basis.
Be accountable for the data strategy: how the organization collects, processes, uses, governs, and stores data.
Own the data engineering life cycle including research, proof of concept, architecture, development, testing, deployment, and maintenance of the data platform.
Collect data through pipelines from various sources, including databases, APIs, partners, and external providers.
Standardize data by converting the various source formats into a common format, and cleanse it by removing inconsistencies.
Design and develop ETL or ELT processes that cleanse, aggregate, and enrich data, making it usable for analytics.
Make consumable data available in high-throughput data stores and build standard data consumption patterns (APIs, flat files, database tables, etc.) for users and systems to fetch data at scale.
Add instrumentation across data pipelines for monitoring and alerting on internal problems before they result in user-visible outages.
Build processes and diagnostic tools to troubleshoot, maintain, and optimize data processes and respond to customer and production incidents.
Implement end-to-end automated CI/CD practices for building, scanning, packaging, testing, and deploying on a data platform, with the ability to continuously deliver secure solutions at scale.
Continuously learn modern data engineering practices and maintain industry standards through incremental adoption of new technologies and best practices.
Create and continuously maintain high-quality, up-to-date documentation of data processes, data dictionaries, and transformations to meet audit and compliance standards.
Requirements:
Experience working in AWS/Public Cloud-based infrastructure is required.
A basic understanding and hands-on skills in Terraform/IaC are necessary.
Strong knowledge and experience in data pipelines through modern cloud computing tools such as Databricks, Glue, and Lambda are essential.
Hands-on experience in data storage and warehousing with cloud solutions such as S3, managed warehouses, Postgres/RDS, and Redshift is required.
Hands-on experience in data analysis, coding, and testing using Python, Spark, Scala, and Shell is necessary.
Experience creating data architecture diagrams, dictionaries, documentation, and test plans is required.
Experience with basic data monitoring and alerting patterns and tools is necessary.
Benefits:
Compensation is commensurate with experience.
A comprehensive benefits package including medical, dental, and vision insurance is provided.
A Health Savings Account is available.
Generous PTO and Holiday Pay are offered.
A 401(k) retirement plan is included.
Remote work-from-home arrangements may be considered.
Apply now