Remote Sr Big Data Engineer Airflow and Oozie (GCP)
This job post is closed and the position is probably filled. Please do not apply.
🤖 Automatically closed by a robot after the apply link was detected as broken.
Description:
Develop scalable and robust code for batch processing systems using technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
Manage and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem.
Utilize GCP for scalable big data processing and storage solutions.
Implement automation and DevOps best practices, including CI/CD pipelines and infrastructure as code (IaC).
Work in a remote environment, requiring strong communication skills and the ability to solve complex problems independently.
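As a rough illustration of the batch-processing pattern described above, the classic MapReduce word count can be sketched in pure Python. This is a minimal sketch of the programming model only, not Hadoop's actual API; all function names here are illustrative.

```python
from collections import defaultdict
from itertools import chain

# Illustrative sketch of the MapReduce word-count pattern in pure
# Python; a real job would run the map and reduce phases in parallel
# across a Hadoop or Spark cluster.

def map_phase(line):
    # Emit a (word, 1) pair for every word in the input line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle_phase(pairs):
    # Group values by key, as the Hadoop shuffle/sort step would.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

def word_count(lines):
    pairs = chain.from_iterable(map_phase(line) for line in lines)
    return reduce_phase(shuffle_phase(pairs))

if __name__ == "__main__":
    print(word_count(["big data", "big pipelines"]))
```

The same map/shuffle/reduce structure carries over directly to Spark, where `map_phase` and `reduce_phase` correspond to transformations on a distributed dataset.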
Requirements:
Bachelor's degree in Computer Science, Software Engineering, or a related field.
Experience with GCP managed services and cloud-based batch processing systems.
Proficiency in Oozie, Airflow, MapReduce, Java, Spark (Java), Python, Pig, and SQL.
Expertise in public cloud services, especially GCP.
Strong knowledge of the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
Familiarity with Bigtable and Redis.
Experience with infrastructure and applied DevOps principles, including CI/CD and IaC using tools like Terraform.
Ability to tackle complex challenges, think critically, and propose innovative solutions.
Proven experience in engineering batch processing systems at scale.
Hands-on experience in public cloud platforms, particularly GCP.
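At the core of the workflow orchestration mentioned above (Oozie, Airflow) is dependency ordering over a task graph. A toy sketch of that idea, using a topological sort over a hypothetical pipeline, might look as follows; this is not Airflow's or Oozie's API, and all task names are made up for illustration.

```python
from collections import defaultdict, deque

# Toy illustration of the dependency ordering that workflow engines
# such as Airflow and Oozie perform; task names are hypothetical.

def topo_order(deps):
    """Return tasks in an order where every upstream task runs first.

    deps maps each task to the list of tasks it depends on.
    """
    indegree = {task: len(upstream) for task, upstream in deps.items()}
    downstream = defaultdict(list)
    for task, upstream in deps.items():
        for dep in upstream:
            downstream[dep].append(task)
    # Start with tasks that have no dependencies (Kahn's algorithm).
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(deps):
        raise ValueError("cycle detected in task graph")
    return order

if __name__ == "__main__":
    pipeline = {
        "extract": [],
        "transform": ["extract"],
        "load": ["transform"],
        "report": ["load"],
    }
    print(topo_order(pipeline))
```

Real orchestrators add scheduling, retries, and backfills on top of this ordering, but the underlying structure is the same directed acyclic graph.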
Benefits:
Opportunity to work with a dynamic team in a remote setting.
Competitive salary and benefits package.
Chance to work with cutting-edge technologies in the big data space.
Collaborative work environment with opportunities for growth and development.
Joining a company recognized as a best place to work, committed to diversity and equal employment opportunity.