This job post is closed and the position is probably filled. Please do not apply.
Description:
Design, develop, and maintain scalable data pipelines and ETL processes on Google Cloud Platform (GCP).
Write clean, maintainable Python code for data ingestion and transformation.
Utilize BigQuery for data warehousing solutions and ensure efficient data querying and storage.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
Implement data models and data storage solutions on GCP that ensure data integrity, performance, and reliability.
Optimize and tune data processing systems for performance and scalability on GCP.
Ensure the security and privacy of data through GCP’s data governance practices.
Troubleshoot and resolve data-related issues, ensuring data quality and consistency.
Document data architecture, processes, and systems for cross-functional team use.
Stay current with emerging trends and technologies in data engineering and integrate them into our data infrastructure on GCP.
Requirements:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
7-8 years of proven experience as a Data Engineer in a senior role, specifically with Google Cloud Platform.
Proficiency in SQL and experience with BigQuery.
Experience with GCP services such as Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Cloud Functions.
Proficiency in programming, primarily in Python.
Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills.
Benefits:
Competitive salary based on experience and skills.
Health insurance and a comprehensive benefits package.
Opportunity to work with a dynamic and innovative team.
Professional development opportunities to stay updated with the latest technologies.
Chance to contribute to strategic initiatives using Google Cloud Platform.