
Remote Data Model Engineer

at Intellectsoft

Posted 8 hours ago

Description:

  • Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia.
  • This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights—powering decisions across marketing, operations, gaming, and more.
  • You will work closely with Data Architects, Data Engineers, Business Analysts, and DevOps Engineers to design and implement scalable data solutions.
  • Design and implement conceptual, logical, and physical data models that support enterprise reporting, analytics, and operational processes.
  • Develop and maintain data dictionaries, entity-relationship (ER) diagrams, and metadata management solutions.
  • Collaborate with data engineers to optimize data structures and ensure proper integration with data pipelines.
  • Translate complex business requirements into scalable and efficient data models.
  • Enforce data modeling best practices, including normalization, denormalization, indexing, and partitioning strategies.
  • Ensure data models are aligned with master data and data governance standards.
  • Conduct model reviews and provide documentation and training to technical and non-technical stakeholders.
  • Participate in performance tuning and troubleshooting of large-scale data models and queries.

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field.
  • 3+ years of experience in data modeling, data architecture, or data engineering roles.
  • Proficiency with data modeling tools such as ER/Studio, ERwin, dbt, or SQL Developer Data Modeler.
  • Strong SQL skills and experience with relational and dimensional database modeling, including star and snowflake schemas.
  • Experience with modern data warehouses (Snowflake, Redshift, or Yellowbrick) and ETL/ELT tools.
  • Understanding of data governance, metadata management, and data cataloging tools.
  • Experience collaborating in Agile/Scrum teams and managing version-controlled data models via Git.
  • Nice to have: experience with data lakehouse architectures or modeling for big data environments; familiarity with business intelligence tools; knowledge of data quality frameworks; experience in regulated industries.
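
For candidates less familiar with the dimensional modeling this role calls for, here is a minimal star-schema sketch. The table and column names (dim_date, dim_property, fact_revenue) are hypothetical examples, and Python's built-in sqlite3 module stands in for a real warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute("""
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,
        full_date TEXT,
        month     INTEGER,
        year      INTEGER
    )""")
cur.execute("""
    CREATE TABLE dim_property (
        property_key  INTEGER PRIMARY KEY,
        property_name TEXT,
        region        TEXT
    )""")

# The central fact table holds measures plus a foreign key per dimension.
cur.execute("""
    CREATE TABLE fact_revenue (
        date_key       INTEGER REFERENCES dim_date(date_key),
        property_key   INTEGER REFERENCES dim_property(property_key),
        revenue_amount REAL
    )""")

# Indexing the fact table's foreign keys speeds up the typical star join.
cur.execute("CREATE INDEX ix_fact_date ON fact_revenue(date_key)")
cur.execute("CREATE INDEX ix_fact_property ON fact_revenue(property_key)")

cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 1, 2024)")
cur.execute("INSERT INTO dim_property VALUES (1, 'Resort A', 'Southeast Asia')")
cur.execute("INSERT INTO fact_revenue VALUES (20240101, 1, 1250.0)")

# A star join: aggregate fact measures by dimension attributes.
row = cur.execute("""
    SELECT p.region, d.year, SUM(f.revenue_amount)
    FROM fact_revenue f
    JOIN dim_date d     ON f.date_key = d.date_key
    JOIN dim_property p ON f.property_key = p.property_key
    GROUP BY p.region, d.year
""").fetchone()
print(row)  # ('Southeast Asia', 2024, 1250.0)
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables (e.g. region split out of dim_property).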

Benefits:

  • 35 absence days per year for work-life balance.
  • Udemy courses of your choice.
  • English courses with a native speaker.
  • Regular soft-skills training sessions.
  • Excellence Centers meetups.
  • Online and offline team-building activities.
  • Business trips as part of the role.