Cloud Data Engineer (AWS) (Remote, Canada)

Description:

  • Collective[i] is seeking a Senior Data Engineer with a strong background in AWS DevOps and data engineering to manage and optimize their data infrastructure.
  • The role involves deploying machine learning models to AWS using SageMaker, requiring expertise in AWS and SageMaker.
  • Responsibilities include designing, developing, and maintaining ETL pipelines to ensure reliable data flow and high-quality data for analytics and reporting.
  • The engineer will build and optimize data models in Snowflake, create and maintain complex SQL queries, and handle orchestration and scheduling with Apache Airflow (see the sketch after this list).
  • The engineer will keep technical documentation of data pipelines, architecture, and processes clear and up to date.
  • The engineer will architect, build, and maintain the data and model infrastructure that supports data science on AWS, focusing on scalability, performance, and cost-efficiency.
  • Collaboration with Data Scientists to deploy machine learning models on AWS SageMaker is a key aspect of the role.
  • The position also involves automating deployment and monitoring of ML models using CI/CD pipelines and infrastructure-as-code tools like Terraform or AWS CloudFormation.
  • AWS-specific tasks include managing EC2, S3, RDS, VPC, CloudFormation, Auto Scaling, CodePipeline, CodeBuild, CodeDeploy, and ECS/EKS.
  • Setting up and managing monitoring solutions such as CloudWatch to ensure effective operation of data pipelines and deployed models is required.
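
For context, here is a minimal sketch of the kind of Airflow orchestration described above: a two-step ETL DAG that extracts data and loads it into Snowflake. The DAG id, schedule, and the extract/load callables are illustrative assumptions, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    """Hypothetical extract step: pull raw data from a source system."""


def load_to_snowflake():
    """Hypothetical load step: write transformed rows into Snowflake."""


with DAG(
    dag_id="daily_orders_etl",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load              # extract must finish before the load runs
```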

Requirements:

  • A Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field is required.
  • Candidates must have 5+ years of experience in Data Engineering, including at least 3 years working in AWS environments.
  • Strong knowledge of AWS services, specifically SageMaker, Lambda, Glue, and Redshift, is essential.
  • Hands-on experience deploying machine learning models in AWS SageMaker is required (a minimal deployment sketch follows this list).
  • Proficiency in DevOps practices, including CI/CD pipelines, containerization (Docker, ECS, EKS), and infrastructure-as-code tools like Terraform or CloudFormation is necessary.
  • Advanced SQL skills and experience in building and maintaining complex ETL workflows are required.
  • Proficiency in Python is essential; Java or Scala is a plus.
  • Practical experience with Airflow for DAG management and data orchestration is necessary.
  • Candidates must be proficient in version control (Git) and containerized deployment with Docker and managed services such as AWS Fargate, ECS, or EKS.
  • Effective communication and a results-oriented approach are required.
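
For illustration, the following is a minimal sketch of deploying a trained model to a SageMaker real-time endpoint with the sagemaker Python SDK, the kind of task this role calls for. The S3 artifact path, IAM role ARN, inference script, and endpoint name are placeholder assumptions.

```python
import sagemaker
from sagemaker.sklearn import SKLearnModel

session = sagemaker.Session()

model = SKLearnModel(
    model_data="s3://my-bucket/models/model.tar.gz",      # hypothetical model artifact
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical IAM role
    entry_point="inference.py",                           # hypothetical inference script
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Create a real-time endpoint; instance type and endpoint name are illustrative.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    endpoint_name="demo-model-endpoint",
)

print(predictor.predict([[0.1, 0.2, 0.3]]))  # single test invocation

predictor.delete_endpoint()  # tear down to avoid idle endpoint costs
```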

Benefits:

  • The salary for this position ranges from $100,000 to $170,000 USD a year.
  • Collective[i] offers a fully remote work environment, allowing employees to work from wherever they choose.
  • The company values diversity of experience and backgrounds, fostering a culture of learning and growth alongside a talented team.
  • Employees have the opportunity to work with cutting-edge technology and contribute to a mission-driven organization focused on helping people and companies prosper.