Description:
The GTM Tools Data Infrastructure Engineer is responsible for ensuring that sales, marketing, and customer-facing teams have access to clean, reliable, and timely data.
This role involves designing, building, and maintaining data pipelines that move data from go-to-market (GTM) platforms (such as CRM, marketing automation, subscription billing, and support) into the company’s cloud data warehouse and BI layer.
The engineer's work supports dashboards, forecasting models, and analytics that inform revenue decisions.
Key responsibilities include architecting, developing, and maintaining scalable batch and streaming pipelines that ingest data from GTM systems into platforms like Snowflake, Redshift, or BigQuery.
The engineer will own RevOps data tables, create schemas, write performant SQL, monitor usage, and optimize queries for speed and cost efficiency.
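In practice, much of this work reduces to small, idempotent load steps. Below is a minimal sketch of one such step, assuming a Snowflake warehouse and the snowflake-connector-python package; the account, table, and column names (stg_crm_opportunity, fct_opportunity, and so on) are hypothetical.

```python
import os
import snowflake.connector  # assumes the snowflake-connector-python package

# All connection details and object names below are hypothetical placeholders.
conn = snowflake.connector.connect(
    user="SVC_GTM_PIPELINES",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account="myorg-myaccount",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="REVOPS",
)

# Idempotent upsert: re-running the job never duplicates opportunity rows.
merge_sql = """
MERGE INTO fct_opportunity AS t
USING stg_crm_opportunity AS s
  ON t.opportunity_id = s.opportunity_id
WHEN MATCHED THEN UPDATE SET
  t.stage = s.stage, t.amount = s.amount, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (opportunity_id, stage, amount, updated_at)
  VALUES (s.opportunity_id, s.stage, s.amount, s.updated_at)
"""

cur = conn.cursor()
try:
    cur.execute(merge_sql)
finally:
    cur.close()
    conn.close()
```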
Building and managing REST and GraphQL API integrations that bring third-party or internal data sources into revenue analytics is another key task.
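By way of illustration, an integration of this kind often boils down to paginated pulls with rate-limit handling. The sketch below uses the requests library; the endpoint, token, and response fields are hypothetical stand-ins, not any particular vendor's API.

```python
import time
import requests

def fetch_all(base_url: str, token: str) -> list[dict]:
    """Pull every page of a cursor-paginated REST resource (hypothetical API)."""
    records: list[dict] = []
    url = f"{base_url}/v1/opportunities"
    headers = {"Authorization": f"Bearer {token}"}
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 429:                   # rate limited
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue                                  # retry the same page
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["results"])            # hypothetical field
        url = payload.get("next")                     # None ends the loop
    return records
```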
The role requires developing and scheduling automation scripts to validate, transform, and load data while applying data quality rules.
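Data quality rules of this kind are typically small, testable predicates applied before load. A minimal sketch, with made-up columns and thresholds:

```python
VALID_STAGES = {"prospecting", "negotiation", "closed_won", "closed_lost"}

def validate(row: dict) -> list[str]:
    """Return the rule violations for one record; an empty list means clean."""
    errors = []
    if not row.get("opportunity_id"):
        errors.append("missing opportunity_id")
    amount = row.get("amount")
    if amount is not None and amount < 0:
        errors.append("negative amount")
    if row.get("stage") not in VALID_STAGES:
        errors.append(f"unknown stage: {row.get('stage')!r}")
    return errors

rows = [
    {"opportunity_id": "OP-1", "amount": 1200.0, "stage": "closed_won"},
    {"opportunity_id": None, "amount": -50.0, "stage": "stalled"},
]
clean = [r for r in rows if not validate(r)]
rejected = [r for r in rows if validate(r)]   # route these to a quarantine table
print(f"{len(clean)} clean, {len(rejected)} rejected")
```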
Collaboration with Revenue Operations, Finance, and Product Analytics teams is essential to understand reporting needs and translate them into robust datasets and dataflows.
Maintaining documentation for data models, pipelines, and integration points, as well as promoting data literacy across GTM stakeholders, is part of the job.
Troubleshooting pipeline failures and data discrepancies, and coordinating with DevOps or external vendors for deeper infrastructure issues, is expected.
The engineer will continuously evaluate manual data collection processes and propose automation or tooling improvements to enhance accuracy and reduce effort.
Requirements:
Candidates must have 5+ years of experience in data engineering or analytics engineering, ideally supporting GTM or Revenue Operations functions.
Advanced SQL proficiency is required, including skills in query optimization and performance tuning in large-scale data warehouses.
Hands-on experience in building ETL/ELT workflows with tools like Alteryx, Fivetran, dbt, Airflow, or similar orchestration frameworks is necessary.
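For orientation, a daily extract-transform-load job in one of those frameworks might be wired up as in the sketch below, which assumes Apache Airflow 2.4 or later; the DAG and task names are hypothetical.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull records from the CRM API")        # placeholder task body

def transform():
    print("apply data quality rules and typing")  # placeholder task body

def load():
    print("merge into the warehouse")             # placeholder task body

with DAG(
    dag_id="gtm_crm_daily_sync",                  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                            # `schedule=` requires Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```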
A proven track record of developing API integrations (REST, SOAP, GraphQL) and managing authentication, rate limiting, and error recovery is essential.
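The error-recovery part of that requirement usually means retrying transient failures with exponential backoff, roughly as sketched below with the requests library; the retry budget and delays are illustrative, not prescriptive.

```python
import random
import time
import requests

def get_with_retries(url: str, max_attempts: int = 5, base_delay: float = 1.0) -> requests.Response:
    """Retry 5xx responses and transient network errors with backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            resp = requests.get(url, timeout=30)
            if resp.status_code < 500:            # success, or a non-retryable 4xx
                return resp
        except (requests.ConnectionError, requests.Timeout):
            pass                                  # transient failure: retry
        time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.5))
    raise RuntimeError(f"{url} still failing after {max_attempts} attempts")
```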
Familiarity with BI and visualization platforms such as Power BI, Tableau, or Looker is required, along with the ability to structure tables for self-service analytics.
A strong understanding of data modeling concepts, including star/snowflake schemas, slowly changing dimensions, and change data capture (CDC), is needed.
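As a pocket example of one of those concepts, a Type 2 slowly changing dimension keeps history by closing the current row and opening a new versioned one. A plain-Python sketch with invented columns and an invented change feed:

```python
from datetime import date

dim = [  # current dimension: one open row (valid_to is None) per account
    {"account_id": "A1", "segment": "SMB", "valid_from": date(2024, 1, 1), "valid_to": None},
]
changes = [{"account_id": "A1", "segment": "Mid-Market"}]  # e.g. from a CDC feed
today = date(2025, 1, 1)

for change in changes:
    for row in dim:
        if row["account_id"] == change["account_id"] and row["valid_to"] is None:
            if row["segment"] != change["segment"]:
                row["valid_to"] = today                    # close the old version
                dim.append({**change, "valid_from": today, "valid_to": None})
            break

print(dim)  # two rows for A1: the closed SMB version and the open Mid-Market one
```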
Excellent problem-solving abilities and attention to detail when dealing with complex data sets and ambiguous requirements are crucial.
Clear communication skills for explaining technical topics to non-technical stakeholders are necessary.
The ability to manage multiple priorities, meet deadlines, and thrive in a fast-paced, data-driven culture is important.
Prior exposure to revenue operations metrics (pipeline, bookings, ARR, churn) and SaaS business models is preferred.
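For candidates newer to those metrics, a toy calculation of two of them (all figures invented):

```python
monthly_fees = [500.0, 1200.0, 300.0]           # MRR per active customer, USD
arr = sum(monthly_fees) * 12                    # annual recurring revenue
customers_start, customers_lost = 120, 6
logo_churn = customers_lost / customers_start   # share of customers lost in the period
print(f"ARR: ${arr:,.0f} | churn: {logo_churn:.1%}")  # ARR: $24,000 | churn: 5.0%
```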
Benefits:
The position is fully remote, with flexibility in work location.
Equipment necessary for the job will be provided by the company.
Employees will receive health maintenance organization (HMO) coverage and 13th-month pay.
The work schedule is Monday to Friday, aligning with US business hours.