Description:
The Data Engineer will collaborate with Data & Technical Architects and with integration and engineering teams to capture requirements and develop data pipeline solutions.
They will support the evaluation and implementation of data technologies to meet evolving needs.
Responsibilities include working with IT business engagement, applications engineers, and data partners to identify and define data source requirements and ensure data quality.
The Data Engineer will develop data preparation tools, build data pipelines for integration into Zscaler's Snowflake data warehouse, and establish data management standards in collaboration with Principal Engineers.
Requirements:
Over 3 years of experience in data warehouse design and development.
Proficient in building data pipelines to integrate business applications (e.g., Salesforce, Netsuite, Google Analytics) with Snowflake.
Proficiency in SQL, dimensional data modeling techniques, and writing efficient queries on large datasets.
Hands-on experience with Python for extracting data from APIs and building data pipelines, and with ELT tools such as Matillion, Fivetran, Talend, and IDMC.
Experience with data transformation tools such as DBT; AWS services (EC2, S3, Lambda); CI/CD processes and Git version control; and advanced Snowflake concepts, plus knowledge of data visualization tools such as Tableau and Power BI.
Benefits:
Remote work options available.
Comprehensive benefits program including various health plans, time off plans for vacation and sick time, parental leave options, retirement options, education reimbursement, and in-office perks.
Salary range for the full-time position is $98,000–$140,000 USD, excluding commission/bonus/equity (if applicable) + benefits.
Zscaler is an equal opportunity and affirmative action employer, committed to creating an inclusive environment for all employees.