
Remote Data Engineer

at Intellectsoft


Description:

  • Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia.
  • This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights—powering decisions across marketing, operations, gaming, and more.
  • You will work closely with Data Architects, Business Analysts, DataOps Engineers, and DevOps Engineers to design and implement scalable data solutions.
  • Design, build, and maintain scalable, reliable data pipelines for ingesting data from various sources.
  • Work with structured, semi-structured, and unstructured data, ensuring data quality, consistency, and integrity.
  • Develop and maintain ETL/ELT processes to support real-time and batch analytics.
  • Collaborate with Data Architects to design optimal data models and storage structures for analytics workloads.
  • Implement data validation, deduplication, and transformation logic.
  • Contribute to the definition of data governance, security, and access policies.
  • Participate in platform scaling and performance optimization initiatives.
  • Work closely with business and analytics teams to understand data needs and translate them into technical solutions.
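To make the validation, deduplication, and transformation responsibilities above concrete, here is a minimal sketch of that kind of pipeline logic in Python. The record shape, field names, and quality rules are invented for illustration and do not come from the posting:

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

# Hypothetical record shape; a real schema would come from the platform's data contracts.
@dataclass(frozen=True)
class Event:
    event_id: str
    user_id: str
    amount: float

def validate(records: Iterable[dict]) -> Iterator[Event]:
    """Drop records that fail basic quality checks (missing identifiers, negative amounts)."""
    for raw in records:
        if not raw.get("event_id") or not raw.get("user_id"):
            continue  # reject records with missing identifiers
        amount = float(raw.get("amount", 0))
        if amount < 0:
            continue  # reject invalid amounts
        yield Event(raw["event_id"], raw["user_id"], amount)

def deduplicate(events: Iterable[Event]) -> Iterator[Event]:
    """Emit each event_id at most once (first occurrence wins)."""
    seen: set[str] = set()
    for ev in events:
        if ev.event_id in seen:
            continue
        seen.add(ev.event_id)
        yield ev

def transform(events: Iterable[Event]) -> Iterator[dict]:
    """Shape validated, deduplicated events for the analytics store."""
    for ev in events:
        yield {
            "event_id": ev.event_id,
            "user_id": ev.user_id,
            "amount_cents": int(ev.amount * 100),
        }

raw = [
    {"event_id": "e1", "user_id": "u1", "amount": 10.5},
    {"event_id": "e1", "user_id": "u1", "amount": 10.5},  # duplicate
    {"event_id": "e2", "user_id": "", "amount": 3.0},     # invalid: missing user_id
    {"event_id": "e3", "user_id": "u2", "amount": 4.25},
]
rows = list(transform(deduplicate(validate(raw))))
```

In a production setting the same stages would typically run inside an orchestrated framework (e.g., Airflow or Spark, both named in the requirements) rather than as plain generators.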

Requirements:

  • 5+ years of experience in data engineering or backend data development.
  • Strong knowledge of data pipeline design, integration frameworks, and ETL tools.
  • Experience with cloud or hybrid data architectures.
  • Proficiency in SQL and at least one programming language (e.g., Python, Scala).
  • Hands-on experience with distributed data processing (e.g., Spark, Flink) is a plus.
  • Familiarity with data lake, data warehouse, or lakehouse architectures.
  • Experience with real-time data streaming and ingestion frameworks is a strong advantage.
  • Understanding of data security, privacy, and compliance best practices.
  • Experience working in Agile/Scrum environments.
  • Nice to have: experience with modern open-source tools (e.g., Airflow, dbt, Delta Lake, Apache Kafka), exposure to machine learning pipelines or working alongside ML teams, familiarity with BI tools and data visualization concepts, and experience in regulated industries (e.g., gaming, finance, hospitality).

Benefits:

  • 35 absence days per year for work-life balance.
  • Udemy courses of your choice.
  • English courses with a native speaker.
  • Regular soft-skills training sessions.
  • Opportunity to participate in Excellence Centers meetups.
  • Online/offline team-building activities.
  • Business trips may be part of the role.