We are seeking a highly motivated and detail-oriented Snowflake Data Engineer to join our data engineering team.
The ideal candidate will have a strong background in Snowflake, DBT, ETL processes, and modern Lakehouse architecture, along with hands-on experience in SnapLogic and Python.
You will be responsible for designing, developing, and optimizing data pipelines that support business intelligence, analytics, and reporting needs across the organization.
This role requires a blend of technical expertise, problem-solving skills, and the ability to work collaboratively in a fast-paced environment.
Key Responsibilities:
Design and optimize scalable data models and data warehouses using Snowflake.
Build and maintain ETL/ELT pipelines with SnapLogic, DBT, and Python.
Contribute to the development of a modern Lakehouse architecture.
Monitor and optimize pipeline and query performance.
Develop reusable frameworks for automation.
Implement data quality and governance measures.
Collaborate with data scientists and business stakeholders.
Maintain documentation and best practices.
Requirements:
Three to five years of professional experience in data engineering or data warehousing roles is required.
Strong expertise in Snowflake, including performance tuning, query optimization, and warehouse management, is essential.
Proficiency in DBT (Data Build Tool) for modeling, transformations, and version-controlled data workflows is necessary.
Hands-on experience with ETL/ELT pipeline design using SnapLogic or similar tools is required.
A solid understanding of Lakehouse architecture and modern data platform concepts is needed.
Strong programming skills in Python for data transformation, automation, and API integrations are mandatory.
Knowledge of SQL and advanced query writing for analytics and reporting is required.
Familiarity with data governance, data security, and compliance frameworks is necessary.
Strong problem-solving skills with the ability to analyze complex data challenges and propose effective solutions are essential.
Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams, are required.
Benefits:
This is a full-time position with the flexibility of remote work from India.
The role offers the opportunity to work with cutting-edge technologies in data engineering and analytics.
You will be part of a collaborative team that values innovation and problem-solving.
The position provides a chance to contribute to the development of modern data architectures and practices.
Opportunities for professional growth and development in a fast-paced environment are available.