This job post is closed and the position is probably filled. Please do not apply.
🤖 Automatically closed by a robot after the apply link was detected as broken.
Description:
Evooq is seeking a talented Data Engineer to join their team and contribute to building their data platform.
The position is 100% remote, following Singapore hours, and candidates must be in a similar timezone.
Responsibilities include designing, developing, and maintaining ELT pipelines for data ingestion, processing, and storage.
Tools used include Dagster for orchestration, Starburst Galaxy as the query engine, dbt for transformations, and Amazon S3 with Apache Iceberg for an open data lakehouse.
The Data Engineer will centralize data from diverse sources, collaborate with data science and analytics teams, ensure data quality, and optimize data management systems.
The role also involves automating data processes and building CI/CD for data workflows.
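To give a flavor of the kind of ELT work described above, here is a toy sketch of an extract-load-transform step using only the Python standard library. It is illustrative only: the actual stack (Dagster, dbt, Starburst Galaxy, S3 + Iceberg) replaces every piece of this, and all record fields and table names here are hypothetical.

```python
import sqlite3

# "Extract": pretend these records came from a source system (hypothetical schema).
RAW_RECORDS = [
    {"trade_id": 1, "symbol": "AAPL", "qty": 100},
    {"trade_id": 2, "symbol": "MSFT", "qty": -5},  # invalid row: negative qty
    {"trade_id": 3, "symbol": "AAPL", "qty": 50},
]

def load_and_transform(records):
    """Load raw rows into a staging table, then transform with SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (trade_id INT, symbol TEXT, qty INT)")
    conn.executemany(
        "INSERT INTO staging VALUES (:trade_id, :symbol, :qty)", records
    )
    # "Transform": aggregate per symbol, dropping invalid rows.
    rows = conn.execute(
        """
        SELECT symbol, SUM(qty) AS total_qty
        FROM staging
        WHERE qty > 0
        GROUP BY symbol
        ORDER BY symbol
        """
    ).fetchall()
    conn.close()
    return rows

print(load_and_transform(RAW_RECORDS))  # → [('AAPL', 150)]
```

In a production lakehouse, the staging table would live on S3 as an Iceberg table, the SQL transform would be a dbt model executed through the query engine, and Dagster would schedule and monitor the whole flow.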
Requirements:
Hands-on experience developing data pipelines using the modern data stack and data lakehouse architectures.
Proficiency in SQL and Python.
Knowledge of open table formats like Apache Iceberg and file formats such as Parquet.
Familiarity with relational and non-relational database management systems.
Experience with dbt and cloud-based infrastructure, particularly AWS.
Hands-on experience with object storage systems like S3.
Understanding of data governance, data lineage, access, and security principles.
Nice to have: Experience with Apache Spark, Hadoop, orchestration tools like Airflow, Trino-based query engines, data visualization tools, and DevOps practices.
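As an illustration of the "ensure data quality" responsibility listed above, a minimal sketch of pipeline data-quality checks (completeness, uniqueness, range), using only the standard library; the column names and rules are hypothetical, not Evooq's actual checks.

```python
def check_quality(rows, key="id"):
    """Return a list of human-readable violations found in `rows`."""
    violations = []
    seen = set()
    for i, row in enumerate(rows):
        # Completeness and uniqueness checks on the key column.
        if row.get(key) is None:
            violations.append(f"row {i}: missing key '{key}'")
        elif row[key] in seen:
            violations.append(f"row {i}: duplicate key {row[key]}")
        else:
            seen.add(row[key])
        # Range check on a hypothetical 'amount' column.
        if row.get("amount", 0) < 0:
            violations.append(f"row {i}: negative amount")
    return violations

good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 3.5}]
bad = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": -2.0}]
print(check_quality(good))  # → []
print(check_quality(bad))   # → ['row 1: duplicate key 1', 'row 1: negative amount']
```

In practice, checks like these are usually expressed as dbt tests or orchestrator-level assertions rather than hand-rolled functions.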
Benefits:
Opportunity to work remotely following Singapore hours.
Chance to contribute to building a data platform using cutting-edge technologies.
Collaborate with highly autonomous teams in a flat organizational structure.
Work on solutions that combine data, technology, and investment expertise.
Join a company that values trust, responsibility, and innovation.