Remote Data Engineer

Please let Mercari know you found this job on RemoteYeah. This helps us grow 🌱.

Description:

  • The Data Engineer will design, build, and operate ETL pipelines at scale.
  • Responsibilities include automating processes related to data products and machine learning products.
  • The role involves designing data structures for data products and building knowledge graphs, flow charts, and system diagrams for problem analysis.
  • The engineer will develop and operate APIs/tools related to data products and machine learning products.
  • Providing technical solutions using Big Data technologies and creating technical design documents are essential.
  • The position requires designing and developing a Data Platform using Python, Spark, and BigQuery.
  • The engineer will build a DevOps platform providing a Continuous Integration/Continuous Deployment (CI/CD) stack for ETL application teams.
  • Profiling, debugging, and optimizing applications are also part of the job.
  • Remote work is permitted within the U.S. only.
  • The position requires a commitment of 40 hours per week, Monday through Friday.
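The ETL work described above follows the standard extract–transform–load pattern. As a minimal sketch (plain Python with illustrative stand-in functions and sample records, not Mercari's actual pipeline or the Spark/BigQuery stack the role uses in production):

```python
# Minimal ETL sketch. Function names and sample records are
# illustrative stand-ins, not an actual production pipeline.

def extract():
    # In production: read from an upstream source
    # (e.g. a database export or event stream).
    return [
        {"item_id": 1, "price_usd": "10.50"},
        {"item_id": 2, "price_usd": "3.25"},
    ]

def transform(rows):
    # Normalize types and derive an integer cents field.
    return [
        {"item_id": r["item_id"],
         "price_cents": int(float(r["price_usd"]) * 100)}
        for r in rows
    ]

def load(rows, sink):
    # In production: write to a warehouse table (e.g. BigQuery).
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # → 2
```

At scale, each stage would typically become a task in an orchestrator such as Apache Airflow, with the transform running on Spark or Beam rather than in-process.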

Requirements:

  • A Bachelor of Science degree in Computer Science or a closely related field of study is required.
  • Candidates must have five (5) years of experience as a Data Engineer, Data Specialist, or in a related occupation where the required experience was gained.
  • Special skills required include five (5) years of experience with Confluence, Jira, Git, CI/CD pipelines, and Agile methodologies.
  • Proficiency in Java and Python is necessary.
  • Experience in Data Modeling or Data Warehouse is required.
  • Candidates must have experience with ETL tools such as Apache Airflow.
  • Knowledge of container technologies like Docker or Kubernetes is essential.
  • Experience with APIs such as gRPC, TensorFlow Serving, or Flask (REST) is required.
  • Candidates should be familiar with databases including MySQL, Postgres, Oracle, SQL Server, or Google Spanner.
  • Experience in distributed processing using Apache Beam or Apache Spark is necessary.
  • Knowledge of machine learning frameworks such as TensorFlow, Keras, or scikit-learn is required.
  • Familiarity with Google Cloud services, including BigQuery, Google Dataflow, or Google Dataproc, is essential.

Benefits:

  • Employees enjoy the flexibility to work remotely from anywhere in the U.S. and receive flexible time off.
  • The company offers top-notch insurance plans, best-in-class new parent offerings, and access to mind and body wellness apps.
  • As the company grows, career opportunities for employees also expand, with access to new tools, technologies, and learning opportunities.
  • The total rewards package provides a strong financial foundation and benefits that go beyond the paycheck.
  • The company fosters a culture of teamwork, celebrating each other's successes through virtual coffee breaks and recognition programs.
Salary: $201,571 – $240,000 USD per year