Remote Senior Big Data Hadoop ML Engineer (GCP) - Canada

Apply now
Please let Rackspace know you found this job on RemoteYeah. This helps us grow 🌱.

Description:

  • Develop scalable and robust code for large-scale batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
  • Develop, manage, and maintain batch pipelines supporting Machine Learning workloads.
  • Leverage GCP for scalable big data processing and storage solutions.
  • Implement automation/DevOps best practices for CI/CD, IaC, etc.
  • Work remotely, communicating clearly and solving complex problems independently and creatively.
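As an illustration of the MapReduce-style batch processing this role centers on, here is a minimal word-count sketch in plain Python (no Hadoop dependency; the function names and sample data are hypothetical, chosen only to mirror the mapper/shuffle/reducer flow):

```python
from collections import defaultdict

def map_phase(records):
    # Emit (word, 1) pairs, as a Hadoop mapper would for each input line
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Sum counts per key, as a reducer would after the shuffle groups them
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

# Hypothetical sample batch of input records
lines = ["big data batch", "batch processing at scale", "big batch jobs"]
result = reduce_phase(map_phase(lines))
print(result["batch"])  # 3
```

In a real Hadoop or Spark (Java) job the same split into a map step and a keyed aggregation step applies, with the framework handling partitioning, shuffling, and fault tolerance across the cluster.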

Requirements:

  • Proficiency in the Hadoop ecosystem, including MapReduce, Oozie, Hive, Pig, HBase, and Storm.
  • Strong programming skills in Java, Python, and Spark.
  • Knowledge of public cloud services, particularly in GCP.
  • Experience applying infrastructure and DevOps principles, using tools for CI/CD and IaC such as Terraform.
  • Ability to tackle complex challenges, think critically, and propose innovative solutions.
  • Effective remote work experience with strong written and verbal communication skills.
  • Proven experience in engineering batch processing systems at scale.
  • Hands-on experience in public cloud platforms, particularly GCP.

Benefits:

  • Opportunity to work on developing batch processing systems and Machine Learning pipelines.
  • Remote position with the flexibility to work independently and creatively.
  • Utilize GCP for scalable big data processing and storage solutions.
  • Implement automation/DevOps best practices for CI/CD, IaC, etc.
  • Join a dynamic team at a company recognized as a best place to work, offering equal employment opportunities.