Description:
Seeking a highly skilled and experienced Senior Big Data Engineer with a strong background in developing and scaling stream and batch processing systems.
Must have a solid understanding of public cloud technologies, especially GCP.
This is a remote role that requires excellent communication skills and the ability to solve complex problems independently and creatively.
Responsibilities include building reusable and reliable code for stream and batch processing systems at scale, using technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark.
Implement automation and DevOps best practices for CI/CD, IaC, and containerization.
Requirements:
Experience in building reusable and reliable code for stream and batch processing systems at scale.
Expertise in public cloud services, particularly in GCP.
Familiarity with GCP managed services such as Dataproc, Cloud Composer, GCS, and Dataflow.
Proficiency in infrastructure and applied DevOps principles, using tools such as Terraform for IaC and CI/CD.
Knowledge of containerization technologies such as Docker and Kubernetes.
Ability to tackle complex challenges, think critically, and propose innovative solutions.
Strong programming skills in Java and Python.
Hands-on experience in engineering stream/batch processing systems at scale.
Effective communication skills for remote work collaboration.
Benefits:
Opportunity to work with cutting-edge technologies in a dynamic team environment.
Remote work setup with a focus on communication and problem-solving skills.
Chance to contribute to the development and scaling of stream and batch processing systems.
Competitive salary and benefits package.
The chance to join a company recognized for its expertise in multicloud solutions and its commitment to employee development.