Description:
As a Data Platform Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure, including data pipelines and data warehouses.
In this role, you will build a scalable, high-performance data architecture that supports data-driven decision-making across the organization.
You will collaborate closely with the data science, business analytics, and software engineering teams to ensure the data architecture meets business needs.
Responsibilities include designing, implementing, and maintaining robust data pipelines and workflows using Databricks, Spark, and Python.
You will build and optimize data warehouses that adapt to the evolving needs of business stakeholders.
The role involves collaborating with data scientists to streamline data processing for advanced analytics and machine learning.
You will work with software engineers to align data architecture with overall system architecture and engineering practices.
Ensuring the reliability, security, and compliance of the data infrastructure with industry regulations and data governance standards is essential.
You will monitor, troubleshoot, and proactively identify improvements in the data infrastructure to ensure high availability and performance.
Requirements:
You must have 5+ years of experience in data engineering or a similar role.
Proficiency in Databricks for managing data pipelines and analytics workflows is required.
Strong experience with Apache Spark for large-scale data processing is necessary.
Advanced knowledge of SQL for data manipulation, querying, and creating ETL processes is essential.
Expertise in Python for scripting, automation, and building data workflows is required.
Strong problem-solving and analytical skills are necessary for this position.
You should have the ability to work collaboratively with cross-functional teams.
Familiarity with real-time data streaming technologies such as Kafka and Kinesis is preferred.
Experience using Infrastructure as Code (IaC) to implement infrastructure and data pipelines is required.
Experience with cloud services, preferably AWS, is necessary.
An act-like-an-owner mentality and a bias toward action are essential.
Benefits:
The position offers competitive cash compensation.
Benefits are effective on day one of employment.
Employees gain early access to a high-potential, high-growth fintech environment.
Generous stock option packages are available in this early-stage startup.
The role is remote-friendly (anywhere in the US) and office-friendly, allowing you to choose your schedule.
Flexible time off programs are provided, including vacation, sick leave, paid parental leave, and paid caregiver leave.
A 401(k) plan with a match is included as part of the benefits package.