We are seeking an experienced Senior Data Engineer to design, build, and optimize robust data pipelines and solutions for enterprise-scale projects.
In this role, you will work within high-performing, distributed software teams to deliver secure, scalable, and high-quality outcomes.
You will contribute to system design, implement best practices for code quality and deployment, and collaborate closely with both technical teams and clients.
This position offers the opportunity to work with cutting-edge technologies, embrace continuous learning, and play a pivotal role in driving data-driven decision-making for diverse projects.
Key responsibilities include designing, developing, and maintaining data pipelines and solutions using Python, GCP, PySpark, SQL, and BigQuery.
You will support system design, development, and maintenance while upholding high standards of technical quality.
You will define and implement structured practices for source code management, building, and deployment.
You will design and implement secure, efficient, and scalable data storage solutions.
You will execute automated testing in alignment with best practices, troubleshoot issues, and optimize performance.
You will maintain version control and build processes using appropriate tools.
You will collaborate with clients, delivery architects, and team members to ensure timely project delivery.
You will write clear documentation and contribute to internal tooling and open-source projects.
You will stay up to date with emerging technologies and integrate them into solutions where applicable.
Requirements:
Proven experience delivering solutions at a Senior Developer level in an enterprise environment.
Strong background in server-side development using Python and SQL.
Hands-on experience building ELT/ETL pipelines and working with relational and document databases.
Production-level experience with Snowflake or Databricks.
Strong knowledge of modern web services, testing methodologies, and version control tools such as Git.
Familiarity with AWS serverless compute, Apigee, Cloud Run, containerization, and cloud architecture (AWS/GCP/Azure).
Understanding of security, performance considerations, deployment tools, and systems integration.
Excellent analytical, communication, and collaboration skills, with a high degree of empathy.
Nice to have: experience with FastAPI, Flask, Django, JavaScript, Java, multiple cloud technologies, Terraform, or previous consultancy work.
Benefits:
This position offers fully remote work with a strong commitment to work-life balance.
Flexible working arrangements are available to fit around personal commitments.
You will have access to the Wellness Hub for resources and confidential support services.
The company provides annual retreats and fosters a collaborative, inclusive team culture.
A competitive remuneration package will be discussed during the hiring process.