Remote Sr. Big Data Engineer (GCP) - Airflow and Oozie
Description:
We are seeking a highly skilled and experienced Senior Big Data Engineer to join a dynamic team.
The ideal candidate has a strong background in developing and scaling both stream and batch processing systems, along with a solid understanding of public cloud technologies, especially GCP.
The role is fully remote and requires excellent communication and problem-solving skills.
Responsibilities include building reusable and reliable code for stream and batch processing systems using technologies such as Pub/Sub, Kafka, Dataflow, Flink, Hadoop, Pig, Hive, and Spark.
You will also implement automation and DevOps best practices, including CI/CD, infrastructure as code (IaC), and containerization.
Requirements:
Experience with GCP managed services and cloud-based batch processing systems.
Proficiency in Oozie, Airflow, MapReduce, Java, Python, Pig, and SQL.
Expertise in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
Familiarity with Bigtable and Redis.
Experience applying infrastructure and DevOps principles, using tools such as Terraform for CI/CD and IaC.
Ability to tackle complex challenges, think critically, and propose innovative solutions.
Strong written and verbal communication skills for remote collaboration.
Proven experience in engineering batch processing systems at scale.
Hands-on experience in public cloud platforms, particularly GCP.
Benefits:
Opportunity to work with a dynamic team on developing and scaling processing systems.
Remote work environment with a focus on communication and problem-solving.
Exposure to a variety of cutting-edge technologies and cloud services.
Implementation of automation and DevOps best practices for efficient workflows.
Chance to contribute to the development of scalable big data solutions.