Remote Senior Data Engineer (TV Platform)

This job is closed

This job post is closed and the position has likely been filled. Please do not apply. The listing was closed automatically after its apply link was detected as broken.

Description:

  • Sigma Software is seeking an experienced Senior Data Engineer to join its engineering team.
  • The role involves working with a team of data engineers to solve challenging problems using advanced data tools in the cloud.
  • The client provides technology and advisory services to major media and entertainment companies, including Fox, NBCUniversal, and Viacom.
  • The project focuses on building and supporting high-quality data solutions to process large volumes of data on AWS’s cloud-native data platform.
  • Responsibilities include building and maintaining ETL pipelines, modifying application code, analyzing requirements, and supporting design, coding, testing, debugging, deployment, and maintenance of programs.
  • Documentation of work is essential.
  • The role also involves developing databases, data collection systems, and data analytics strategies to optimize efficiency and quality.
  • Conducting code and design reviews to ensure product quality is required.
  • Mentoring colleagues and new team members is part of the job.
  • Collaboration with data engineers and Product Managers to prioritize business needs and translate complex requirements into high-quality cloud-native solutions is expected.
  • Sharing knowledge with wider engineering teams through technical demos is encouraged.
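As an illustration of the pipeline work described above, here is a minimal, stdlib-only Python sketch of the extract–transform–load shape such a pipeline follows. The record layout, function names, and sample events are hypothetical (not from the posting); a real implementation on this stack would use PySpark against data in S3 rather than plain Python lists.

```python
import json
from typing import Iterable

# Hypothetical raw viewership events, standing in for an S3/Kafka source.
RAW_EVENTS = [
    '{"user_id": "u1", "show": "News at 9", "seconds_watched": 540}',
    '{"user_id": "u2", "show": "News at 9", "seconds_watched": 1200}',
    '{"user_id": "u1", "show": "Morning Show", "seconds_watched": 300}',
]

def extract(lines: Iterable[str]) -> list[dict]:
    """Parse raw JSON lines into records, skipping malformed ones."""
    records = []
    for line in lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a production pipeline would route these to a dead-letter sink
    return records

def transform(records: list[dict]) -> dict[str, int]:
    """Aggregate total watch time (in seconds) per show."""
    totals: dict[str, int] = {}
    for rec in records:
        totals[rec["show"]] = totals.get(rec["show"], 0) + rec["seconds_watched"]
    return totals

def load(totals: dict[str, int]) -> list[tuple[str, int]]:
    """Emit sorted rows for a downstream table (stand-in for a warehouse write)."""
    return sorted(totals.items())

rows = load(transform(extract(RAW_EVENTS)))
print(rows)  # [('Morning Show', 300), ('News at 9', 1740)]
```

The same extract/transform/load separation carries over directly when the stages become Spark jobs orchestrated by a scheduler.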

Requirements:

  • Candidates must have 5+ years of hands-on experience in Software Development and/or Big Data.
  • Excellent knowledge of Python is required.
  • Proficiency in PySpark and a strong understanding of Spark is necessary.
  • Understanding of ETL/ELT frameworks such as dbt is essential.
  • Experience with orchestration tools like Airflow is required.
  • Knowledge of Lakehouse architecture on AWS, including Apache Iceberg, Hudi, or Delta Lake, is necessary.
  • A strong understanding of AWS services, including IAM, S3, and Security groups, is required.
  • Proficiency in Infrastructure as Code tools such as Terraform or similar is necessary.
  • Candidates must possess great communication skills to articulate status, blockers, and design clearly.
  • The ability to work independently and collaboratively is essential.
  • An Upper-Intermediate level of English is required.
  • Experience with modern Lakehouse features such as Databricks Unity Catalog and Snowflake's Iceberg integration will be a plus.
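To make the orchestration requirement concrete, the sketch below shows dependency-ordered task execution using only the Python standard library. This is a toy illustration of the DAG concept behind tools like Airflow, not Airflow's actual API; the task names and dependency graph are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

def run_pipeline(dag: dict[str, set[str]]) -> list[str]:
    """Run tasks in a dependency-respecting order and return the run log."""
    log = []
    for task in TopologicalSorter(dag).static_order():
        log.append(task)  # a real orchestrator would invoke the task's work here
    return log

print(run_pipeline(dag))
```

An orchestrator like Airflow adds scheduling, retries, and backfills on top of exactly this kind of topological ordering.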

Benefits:

  • The position offers the opportunity to work remotely.
  • Employees will be part of a tightly knit team solving complex data engineering challenges.
  • The role provides the chance to work with cutting-edge data collection, transformation, analysis, and monitoring tools.
  • There is an opportunity for professional growth through mentoring and collaboration with experienced colleagues.
  • Employees will have the chance to contribute to high-quality data solutions for major media and entertainment companies.