Remote Data Engineer

This job is closed

This job post is closed and the position has likely been filled. Please do not apply. The post was closed automatically after its apply link was detected as broken.

Description:

  • Architect and develop efficient, scalable data pipelines using modern ETL frameworks to transform data from diverse sources into centralized data warehouses or data lakes (a minimal illustrative sketch follows this list).
  • Collaborate with clients to understand and translate their business needs into technical requirements for data infrastructure solutions.
  • Design, build, and maintain cloud-based data platforms (Azure, AWS, and/or GCP) that enable seamless integration of structured, semi-structured, and unstructured data.
  • Implement data governance best practices and ensure data quality, security, and compliance with relevant regulations.
  • Build and optimize data models (dimensional, relational, and NoSQL) that support both batch and real-time analytics needs.
  • Work with BI developers and data scientists to deliver comprehensive data solutions, including dashboards, machine learning models, and AI-driven insights.
  • Utilize modern software development practices such as CI/CD pipelines, source code management, and agile methodologies to deliver robust and reliable data solutions.
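
For illustration only, not part of the original posting: a minimal sketch of the kind of batch ETL pipeline described in the first bullet, assuming PySpark. The source path, field names, and warehouse table name are all hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw, semi-structured events from a landing zone (hypothetical path).
    raw = spark.read.json("s3://landing-zone/orders/")

    # Transform: enforce types, derive a partition key, drop malformed and duplicate rows.
    orders = (
        raw
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
        .dropna(subset=["order_id", "order_ts"])
        .dropDuplicates(["order_id"])
    )

    # Load: append the cleaned batch to a warehouse table (hypothetical name).
    orders.write.mode("append").partitionBy("order_date").saveAsTable("analytics.fact_orders")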

Requirements:

  • 5+ years of experience in data engineering, with hands-on experience in cloud data services, particularly in Azure, AWS, and/or GCP.
  • Expertise in developing scalable ETL pipelines using tools like Azure Data Factory, AWS Glue, Databricks, or Snowflake.
  • Experience designing and implementing data lakes and data warehouses to support diverse analytics workloads.
  • Practical experience with BI tools such as Power BI, Tableau, or Looker, and familiarity with modern data visualization principles.
  • Strong programming skills in Python, plus familiarity with SQL, Spark, or PySpark for handling large datasets.
  • Understanding of data governance and data quality frameworks, and the ability to implement security and privacy controls around sensitive data (an illustrative quality-check sketch follows this list).
  • Experience with DevOps practices such as CI/CD, automation, and infrastructure as code (e.g., Terraform, Ansible).
  • Familiarity with AI/ML technologies and their application in data engineering projects is a plus.
  • Proven ability to work with business stakeholders to define data-related business needs and deliver value-driven solutions.
  • Experience working with SAP HANA is strongly desired.
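
Again for illustration only: a minimal sketch of the kind of data-quality gate the governance bullet mentions, assuming PySpark; the table name, columns, and rules are all hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq_gate").getOrCreate()
    df = spark.table("analytics.fact_orders")  # hypothetical table name

    # Each entry counts rows violating an expectation; any non-zero count blocks publication.
    violations = {
        "order_id must not be null": df.filter(F.col("order_id").isNull()).count(),
        "amount must be non-negative": df.filter(F.col("amount") < 0).count(),
        "order_ts must be populated": df.filter(F.col("order_ts").isNull()).count(),
    }

    failed = {rule: count for rule, count in violations.items() if count > 0}
    if failed:
        raise ValueError(f"Data quality gate failed: {failed}")
    print("All data quality checks passed.")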

Benefits:

  • Professional Development budget and time
  • Career Mentor to help you grow in your career
  • RRSP/401K match program
  • Bonus programs to reward you for your accomplishments
  • Wellness program to keep you healthy
  • Opportunities to connect – book clubs, games nights, Special Interest Groups, “Coffee & Code” for our developer friends, Team Meetings, and much more