Remote Data Engineer

This job is closed

This job post is closed and the position is probably filled. Please do not apply. The listing was closed automatically after the apply link was detected as broken.

Description:

  • As a data engineer, you will be responsible for designing, developing, and maintaining One's data infrastructure, including data pipelines and data warehouses.
  • This role will impact One’s vision by enabling scalable, high-performance data architecture that empowers data-driven decision-making across the organization.
  • You will work closely with the data science, business analytics, and software engineering teams to ensure our data architecture meets the needs of our business.
  • You will design, implement, and maintain robust data pipelines and workflows using Databricks, Spark, and Python.
  • You will build and optimize data warehouses to meet the evolving needs of business stakeholders.
  • You will collaborate with data scientists to streamline data processing for advanced analytics and machine learning.
  • You will work closely with software engineers to align data architecture with the overall system architecture and engineering practices.
  • You will ensure the data infrastructure is reliable, secure, and compliant with relevant industry regulations and data governance standards.
  • You will monitor, troubleshoot, and proactively identify improvements in the data infrastructure to ensure high availability and performance.

Requirements:

  • You must have 5+ years of experience in data engineering or a similar role.
  • Proficiency in Databricks for managing data pipelines and analytics workflows is required.
  • Strong experience with Apache Spark for large-scale data processing is necessary.
  • Advanced knowledge of SQL for data manipulation, querying, and creating ETL processes is essential.
  • Expertise in Python for scripting, automation, and building data workflows is required.
  • Strong problem-solving and analytical skills are necessary for this position.
  • You should have the ability to work collaboratively with cross-functional teams.
  • Familiarity with real-time data streaming technologies such as Kafka and Kinesis is preferred.
  • Experience using Infrastructure as Code (IaC) to provision infrastructure and deploy data pipelines is required.
  • Experience with cloud services, preferably AWS, is necessary.
  • An act-like-an-owner mentality with a bias toward taking action is essential.

Benefits:

  • Competitive cash compensation is offered for this position.
  • Benefits are effective on day one of employment.
  • Employees gain early access to a high-potential, high-growth fintech environment.
  • Generous stock option packages are available in this early-stage startup.
  • The position is remote-friendly (anywhere in the US) and office-friendly, allowing you to choose your schedule.
  • Flexible time off programs are provided, including vacation, sick leave, paid parental leave, and paid caregiver leave.
  • A 401(k) plan with a match is included as part of the benefits package.

Salary:

  • $135,000 - $175,000 USD per year