Remote Software Engineer - Data Platform

Description:

  • One is on a mission to help customers achieve financial progress by creating simple solutions for saving, spending, borrowing, and growing money in one place.
  • The company aims to address the needs of millions of Americans who are unbanked or underbanked and struggle with managing their finances through multiple apps.
  • The Software Engineer for the Data Platform team will design, develop, and maintain data infrastructure, including data pipelines and data warehouses.
  • This role is crucial for driving data strategy, enabling advanced analytics, and supporting machine learning operations at scale.
  • Responsibilities include designing and maintaining robust streaming and batch data pipelines using Databricks, Spark, and Python.
  • The engineer will ensure the reliability, security, and scalability of data infrastructure while adhering to industry regulations and data governance standards.
  • Monitoring and troubleshooting data infrastructure to ensure high availability and performance is essential.
  • The role involves building and optimizing data platforms and warehouses to meet the needs of stakeholders across analytics, machine learning, and backend systems.
  • The engineer will assist in re-architecting batch pipelines into streaming pipelines for real-time data flow.
  • Collaboration with data scientists and analysts is necessary to streamline data processing for advanced analytics and machine learning.
  • Establishing and maintaining MLOps workflows for seamless deployment, monitoring, and serving of ML models and features is required.
  • The engineer will drive feature engineering and create systems to stage and serve features for machine learning.

Requirements:

  • Candidates should have 3-7 years of experience in software engineering for a data platform or a similar role.
  • Expertise in Apache Spark for large-scale data processing is required.
  • Advanced knowledge of production-level Python is necessary.
  • Strong SQL skills for data manipulation and ETL processes are essential.
  • Experience with real-time streaming technologies such as Kafka or Kinesis is required.
  • Familiarity with MLOps practices and workflows, including feature engineering, model training, and serving, is necessary.
  • Proficiency in Databricks for managing data pipelines and analytics workflows, as well as Infrastructure as Code (IaC) using Terraform or AWS, is preferred.
  • Strong problem-solving skills and the ability to work collaboratively in cross-functional teams are essential.
  • An “act-like-an-owner” mentality with a bias toward taking action is expected.

Benefits:

  • The position offers a competitive cash salary.
  • Benefits are effective on day one of employment.
  • Employees gain early access to a high-potential, high-growth fintech environment.
  • Generous stock option packages are available in this early-stage startup.
  • The company is remote-friendly, allowing employees to choose their work schedule.
  • Flexible time off programs include vacation, sick leave, paid parental leave, and paid caregiver leave.
  • A 401(k) plan with a match is provided to employees.

Salary:

  • $175,000 - 220,000 USD per year