Remote Data Engineer

This job is closed

This position has likely been filled; please do not apply. The listing was closed automatically after the apply link was detected as broken.

Description:

  • Allata is an IT company focused on strategy, architecture, and enterprise-level application development with offices in the US, India, and Argentina.
  • The company aims to enhance business opportunities, create efficiencies, and automate processes through custom technologies, providing elegant solutions to inefficient processes.
  • They offer services including Data Analytics, Advanced Integrations, Product Launch, Experience Design, Support, Cloud, DevOps, and Software Development.
  • Allata is seeking an experienced Data Engineer with strong proficiency in Databricks and Delta Live Tables (DLT).
  • The successful candidate will elevate the client's data ecosystem and solutions by migrating existing data pipelines, crafting new data products for enterprise analytics, and implementing data quality standards.
  • Responsibilities include designing, building, and maintaining reusable data products; ensuring data accuracy and consistency; developing and improving data infrastructure; collaborating with stakeholders; building scalable data processing solutions; maintaining documentation; ensuring data security; troubleshooting data issues; and staying current with emerging trends in data engineering.

Requirements:

  • Expert-level knowledge of Databricks, Delta Live Tables, SQL, and Python, including writing complex queries across large data volumes, is required.
  • Experience in designing and building dimensional models is necessary.
  • Candidates must have expertise in building and maintaining data pipelines using tools such as Databricks, DLT, dbt, Matillion, AWS Glue, or Azure Data Factory.
  • Experience with the medallion architecture is essential.
  • Candidates should have expertise in cloud data lakes such as the Databricks Lakehouse, Azure Storage, or AWS S3.
  • Knowledge of cloud data warehouses such as Snowflake, Redshift, or BigQuery is required.
  • Understanding of batch and streaming data processing techniques is necessary.
  • Familiarity with the Data Lifecycle Management process is required.
  • Experience applying DevOps principles to data projects, including familiarity with tools such as Git, infrastructure as code, and CI/CD, is essential.
  • Nice-to-have skills include knowledge of architectural best practices for building data lakes, plus experience with dbt, BI tools, big data technologies, and cloud platforms.
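
The medallion architecture named in the requirements refers to layering data into raw ("bronze"), cleaned ("silver"), and business-ready ("gold") tables. A minimal, framework-free sketch of that layering is below; all function and field names are illustrative, and in Databricks each layer would instead be defined as a Delta Live Tables table with data-quality expectations:

```python
# Toy sketch of medallion (bronze/silver/gold) layering using plain Python
# records. Names (bronze_ingest, silver_clean, gold_aggregate) are
# illustrative, not part of any real API.
from collections import defaultdict

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, adding only ingestion metadata."""
    return [{**row, "_source": "orders_feed"} for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: enforce basic data-quality rules (drop rows missing keys,
    normalize types) -- the kind of check DLT expresses as expectations."""
    cleaned = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # drop invalid records
        cleaned.append({
            "order_id": int(row["order_id"]),
            "customer": str(row["customer"]).strip().lower(),
            "amount": float(row["amount"]),
        })
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate -- total spend per customer."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

raw = [
    {"order_id": 1, "customer": " Alice ", "amount": "10.5"},
    {"order_id": 2, "customer": "BOB", "amount": "4.5"},
    {"order_id": None, "customer": "eve", "amount": "99"},  # invalid: no key
]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'alice': 10.5, 'bob': 4.5}
```

Each stage reads only the previous layer's output, so downstream consumers can rely on the silver and gold layers without seeing raw-feed quirks.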

Benefits:

  • The position offers the opportunity to work in a dynamic and innovative IT company.
  • Employees will have the chance to collaborate with distributed teams and clients, enhancing their professional experience.
  • The role provides exposure to cutting-edge technologies and trends in data engineering.
  • Allata promotes a culture of self-motivation, organization, and attention to detail, fostering personal and professional growth.
  • The team environment encourages and develops strong English communication skills.