Please let Alternative Payments know you found this job
on RemoteYeah.
This helps us grow 🌱.
Description:
As a Data Platform Engineer, you will design, provision, and operate the foundational data infrastructure that powers analytics and reporting across the organization.
You will be responsible for spinning up the cloud data warehouse and building the initial ETL pipelines to shape raw data for self-service analytics and data-driven decision making.
You will architect and provision production and sandbox data warehouse environments using Infrastructure-as-Code (IaC).
Your role includes defining resource sizing, scaling policies, and cost-optimization strategies for the data warehouse.
You will build and deploy ETL pipelines to ingest transactional, event, and third-party data, implementing transformations, schema migrations, and incremental workflows.
You will embed data quality tests and SLA tracking into every pipeline and set up dashboards and alerts to proactively detect failures, latency spikes, or data drift.
You will establish coding conventions, pipeline templates, and best practices for future data projects and mentor other engineers and analysts on using the data platform.
You will work closely with BI/analytics teams to understand reporting needs and collaborate with application engineers to identify new data sources and automate their onboarding.
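As an illustration only (not part of the role description), the "incremental workflows" and "embedded data quality tests" mentioned above might look like this minimal Python sketch; the function names, fields, and watermark format are all hypothetical.

```python
def quality_check(rows, required_fields=("id", "amount", "created_at")):
    """Embedded data quality test: fail fast if any row is missing a required field."""
    bad = [r for r in rows if any(r.get(f) is None for f in required_fields)]
    if bad:
        raise ValueError(f"{len(bad)} rows failed quality check")
    return rows

def incremental_load(source_rows, watermark):
    """Incremental workflow: ingest only rows newer than the last successful run."""
    new_rows = [r for r in source_rows if r["created_at"] > watermark]
    quality_check(new_rows)
    # A real pipeline would write to the warehouse here, then persist the new watermark.
    new_watermark = max((r["created_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "amount": 10.0, "created_at": "2024-01-01"},
    {"id": 2, "amount": 5.5, "created_at": "2024-01-03"},
]
loaded, wm = incremental_load(rows, watermark="2024-01-02")
# Only the row dated after the watermark is loaded, and the watermark advances.
```

In production the same pattern would typically live inside an orchestrator task (Airflow, Prefect, Dagster), with the quality check failing the task so alerting can fire.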
Requirements:
You must have 5+ years of experience in software or data engineering, with hands-on experience provisioning and maintaining cloud data warehouses such as Snowflake, Amazon Redshift, or Google BigQuery.
Proficiency with Infrastructure-as-Code tools like Terraform, CloudFormation, or Pulumi is required to automate data platform deployments.
Strong SQL skills and experience building ETL pipelines in Python or Java/Scala are necessary.
Familiarity with orchestration frameworks (Airflow, Prefect, Dagster) or transformation tools (dbt) is expected.
A solid understanding of data modeling patterns (star, snowflake schemas) and best practices for partitioning, clustering, and indexing is essential.
You should have experience implementing data quality checks, monitoring (using tools like Datadog, Prometheus, Monte Carlo), and alerting for pipelines.
Knowledge of data governance and security practices, including IAM, encryption at rest/in transit, and PII redaction, is required.
Excellent communication skills are necessary to partner with analytics, product, and engineering teams to translate requirements into scalable solutions.
Strong documentation skills to define data platform standards, runbooks, and onboarding guides are also required.
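For illustration only: the star schema pattern listed in the requirements pairs a central fact table (measures plus foreign keys) with denormalized dimension tables. A minimal sketch using Python's built-in sqlite3, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per merchant.
cur.execute("CREATE TABLE dim_merchant (merchant_key INTEGER PRIMARY KEY, name TEXT, country TEXT)")
# Fact table: measures plus a foreign key into the dimension.
cur.execute("CREATE TABLE fact_payment (payment_id INTEGER, merchant_key INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_merchant VALUES (?, ?, ?)",
                [(1, "Acme", "US"), (2, "Globex", "BR")])
cur.executemany("INSERT INTO fact_payment VALUES (?, ?, ?)",
                [(101, 1, 20.0), (102, 1, 30.0), (103, 2, 15.0)])

# A typical analytics query: aggregate the facts, sliced by a dimension attribute.
cur.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_payment f JOIN dim_merchant d USING (merchant_key)
    GROUP BY d.country ORDER BY d.country
""")
totals = cur.fetchall()
# totals -> [("BR", 15.0), ("US", 50.0)]
```

The same shape carries over to Snowflake, Redshift, or BigQuery, where the fact table is the usual candidate for partitioning and clustering keys.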
Benefits:
You will have the opportunity to work remotely from Brazil, with flexibility in your work environment.
You will be part of a leading payments platform that is dedicated to fostering growth and providing exceptional value to clients.
The company values transparency, dependability, partnership, revolutionary thinking, and diversity, creating a supportive and inclusive workplace.
You will have the chance to mentor and collaborate with other engineers and analysts, enhancing your professional development and skills.
The role offers the opportunity to work on innovative projects that break down barriers and contribute to building a stronger, safer, and simpler payment processing platform.
Apply now