Remote Senior Data Engineer

Description:

  • Netomi AI is seeking a Senior Data Engineer to join its team, which builds artificial intelligence that enhances customer experiences for major global brands.
  • The role is remote and full-time, allowing for collaboration with top-tier clients and the opportunity to build a professional network.
  • The company is backed by prominent investors, providing a chance to be part of a visionary group shaping the future of AI.
  • Responsibilities include architecting and implementing scalable, secure, and reliable data pipelines using modern data platforms such as Spark, Databricks, Airflow, and Snowflake (an illustrative orchestration sketch follows this list).
  • The Senior Data Engineer will develop ETL/ELT processes to ingest data from various structured and unstructured sources and perform Exploratory Data Analysis (EDA) to derive insights for data product development.
  • Collaboration with data scientists, analysts, and software engineers is essential to design data models that support high-quality analytics and real-time insights.
  • The role involves leading data infrastructure projects, managing data on cloud platforms, and ensuring data governance, security, and compliance.
  • The engineer will monitor and optimize data system performance, mentor junior engineers, and stay updated on emerging technologies in data engineering.
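
As an illustration of the kind of orchestration work described above, here is a minimal sketch of an Airflow 2.x DAG with a linear extract → transform → load chain. The DAG id, task names, and helper functions are hypothetical placeholders, not Netomi's actual pipelines; a real implementation would typically hand the transform step off to Spark/Databricks and load into a warehouse such as Snowflake via a provider operator or connector.

```python
# Minimal, illustrative Airflow 2.x DAG: a daily extract -> transform -> load pipeline.
# All names below are hypothetical examples, not Netomi's real jobs or tables.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_raw_events(**context):
    # Pull one day's batch of structured/unstructured source data (API, S3, logs, ...).
    # Placeholder: a real task would land the raw batch in staging storage.
    print("extracting events for", context["ds"])


def transform_events(**context):
    # Clean and model the raw batch (e.g. with Spark or pandas) into fact rows.
    print("transforming events for", context["ds"])


def load_to_warehouse(**context):
    # Load the modeled rows into the warehouse (Snowflake, BigQuery, Redshift, ...).
    print("loading events for", context["ds"])


with DAG(
    dag_id="daily_events_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_raw_events)
    transform = PythonOperator(task_id="transform", python_callable=transform_events)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    # Linear dependency chain: extract runs first, then transform, then load.
    extract >> transform >> load
```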

Requirements:

  • A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field is required.
  • Candidates must have 8+ years of hands-on experience in data engineering or backend software development roles.
  • Proficiency in Python, SQL, and at least one data pipeline orchestration tool such as Apache Airflow, Luigi, or Prefect is necessary.
  • Strong experience with cloud-based data platforms like AWS Redshift, GCP BigQuery, Snowflake, or Databricks is essential.
  • A deep understanding of data modeling, data warehousing, and distributed systems is required.
  • Experience with big data technologies such as Apache Spark, Kafka, and Hadoop is necessary.
  • Familiarity with DevOps practices, including CI/CD, infrastructure as code, and containerization with Docker/Kubernetes, is expected.

Benefits:

  • Working at Netomi AI offers the opportunity to significantly impact the company’s success while developing skills and advancing a career in AI.
  • Employees will be part of a dynamic and fast-growing team that values innovation, creativity, and hard work.
  • The company promotes a culture of diversity and is an equal opportunity employer, ensuring a workplace free from discrimination based on race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.