Remote Sr Big Data Engineer Airflow and Oozie (GCP)- US Remote
Posted
This job post is closed and the position is probably filled. Please do not apply.
Description:
The Senior Big Data Engineer will develop scalable, robust code for batch processing systems using technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
They will manage and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem.
Leveraging GCP for scalable big data processing and storage solutions will be a key part of the role.
Responsibilities also include implementing automation and DevOps best practices for CI/CD and infrastructure as code (IaC).
The role involves working in a remote environment, requiring excellent communication skills and the ability to solve complex problems independently and creatively.
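Both Oozie and Airflow express a batch pipeline as a directed acyclic graph (DAG) of tasks and run each task only after its dependencies complete. As a minimal, framework-free sketch of that idea (the task names below are invented for illustration, not taken from the posting), Python's standard-library `graphlib` can derive a valid execution order from declared dependencies:

```python
from graphlib import TopologicalSorter

# Map each task to the set of tasks it depends on.
# Task names are hypothetical, chosen only to illustrate a typical ETL chain.
pipeline = {
    "extract_raw_events": set(),
    "clean_events": {"extract_raw_events"},
    "aggregate_daily": {"clean_events"},
    "load_to_warehouse": {"aggregate_daily"},
}

# A valid execution order that respects every dependency,
# analogous to the order a scheduler like Airflow would enforce.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

In Airflow or Oozie the same structure would be declared as DAG operators or workflow actions; the scheduling, retries, and monitoring the posting alludes to are what those frameworks add on top of this ordering.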
Requirements:
Experience with GCP managed services and understanding of cloud-based batch processing systems are critical.
Proficiency in Oozie, Airflow, MapReduce, Java, Spark, Python, Pig, and SQL.
Expertise in public cloud services, particularly GCP, and in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
Familiarity with Cloud Bigtable and Redis.
Experience applying infrastructure and DevOps principles, using CI/CD tooling and IaC tools such as Terraform.
Ability to tackle complex challenges, propose innovative solutions, and work effectively in a remote setting with strong communication skills.
Proven experience in engineering batch processing systems at scale and hands-on experience in public cloud platforms, particularly GCP.
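The MapReduce model the requirements reference splits a batch job into a map phase (emit key-value pairs), a shuffle phase (group values by key), and a reduce phase (combine each group). A toy word count in plain Python, with no Hadoop dependency, shows the shape of the model; the function names are illustrative, not a real framework API:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle phase: group values by key, as the framework would
    # before handing each group to a reducer.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce phase: combine the values for one key (here, sum the counts).
    return key, sum(values)

def word_count(lines):
    mapped = chain.from_iterable(mapper(line) for line in lines)
    return dict(reducer(key, values) for key, values in shuffle(mapped).items())
```

For example, `word_count(["big data", "big pipelines"])` returns `{"big": 2, "data": 1, "pipelines": 1}`. At scale, Hadoop or Spark distributes each phase across a cluster, but the per-record logic an engineer writes follows this same pattern.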
Benefits:
Opportunity to work with cutting-edge technologies in big data and cloud computing.
Remote work environment with a dynamic and collaborative team.
Competitive salary and benefits package.
Opportunity for professional growth and development in a leading technology company.