Remote Lead Data Engineer


Description:

  • We are seeking a Lead Data Engineer to join our growing team of experts.
  • In this role, you will be responsible for designing and developing solutions within the Snowflake Data Cloud for international clients across industries.
  • The ideal candidate is a skilled data pipeline developer with data migration experience and a strong interest in optimizing data systems and building them from the ground up.
  • The Lead Data Engineer will focus on developing database architectures and managing data warehouses to ensure the consistent application of optimal data delivery frameworks across customer projects.
  • Your responsibilities include overseeing data ingestion pipelines, designing scalable data architectures, and ensuring strong data governance and security practices.
  • In addition, you will mentor junior engineers, guiding them in the design and implementation of innovative data solutions.
  • We’re looking for someone who is enthusiastic about joining a growing global company and helping shape the future of our customers’ next-generation data initiatives.

Requirements:

  • Bachelor’s degree in engineering, computer science, or a related field.
  • Proven expertise in evaluating, selecting, and integrating data ingestion technologies to tackle complex data challenges.
  • Expertise in making architectural decisions for high-throughput data ingestion frameworks, including real-time data processing and analytics.
  • Experience mentoring junior engineers in best practices for data ingestion, performance optimization, and troubleshooting.
  • At least 5 years in related technical roles spanning data management, database development, ETL, data warehousing, and data pipelines.
  • Hands-on experience designing and developing data warehouses (e.g., Teradata, Oracle Exadata, Netezza, SQL Server, Spark).
  • Proficiency in building ETL/ELT ingestion pipelines with tools such as DataStage, Informatica, or Matillion.
  • Strong SQL scripting skills.
  • Cloud experience with AWS; Azure and GCP experience is a plus.
  • Proficiency in Python scripting.
  • Ability to prepare reports and deliver presentations to internal and external stakeholders.
  • Proven problem-solving skills and a proactive, action-oriented mindset.
  • Strong interpersonal skills, including assertiveness and the ability to build solid client relationships.
  • Experience working in Agile teams.
  • Fluency in written and spoken English.

Benefits:

  • Comprehensive health insurance for all employees.
  • Competitive meal allowance.
  • Annual bonus opportunities.
  • 22 days of paid time off, plus 2 additional days off for your birthday and Christmas Eve.
  • Initial home office budget to help set up your workspace.
  • Work-from-home allowance to support remote work.
  • Technical training and certification programs to encourage professional development.