We are seeking a skilled and proactive Data Integration Engineer to join our remote team.
In this role, you will be responsible for creating, optimizing, maintaining, and decommissioning data integrations across a diverse and complex hybrid IT landscape.
This includes on-premises systems, SaaS platforms, and cloud environments (primarily AWS).
You will play a key role in ensuring accurate, complete, and secure data flows that support both business and security initiatives.
Key responsibilities include designing, building, and managing data workflows and integration pipelines across a hybrid IT environment.
You will ensure data integrity, observability, and resilience across internal systems and third-party platforms.
You will collaborate with internal stakeholders to gather requirements and deliver scalable, future-ready, end-to-end integration solutions.
You will refactor or replace legacy data workflows to support modernized business processes.
You will identify integration inefficiencies and recommend improvements, keeping long-term technical implications in mind.
You will maintain and develop scripts and automation tools using Python, Bash, PowerShell, and other relevant languages.
You will apply DevOps practices to implement CI/CD workflows, maintain source control, and deploy changes reliably.
You will work with cloud services (especially AWS), containers, and both Linux and Windows environments.
You will integrate with and develop solutions for Splunk, including custom Splunk Apps (primarily Python-based).
Occasionally, you will collaborate with external stakeholders to support partner or cross-company integration efforts.
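To give a concrete flavor of the integrity and observability work described above, here is a minimal, purely illustrative Python sketch of the kind of record validation an integration pipeline might perform before loading data downstream. The field names and rules are invented for this example and are not part of any actual system in this role:

```python
# Hypothetical example: validate records flowing between systems before loading.
# Field names and validation rules are invented for illustration only.

REQUIRED_FIELDS = {"id", "source", "timestamp"}


def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one record (empty list = valid)."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "id" in record and not str(record["id"]).strip():
        problems.append("empty id")
    return problems


def partition_records(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into (valid, rejected-with-reasons) so rejects stay observable."""
    valid, rejected = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            rejected.append((rec, problems))
        else:
            valid.append(rec)
    return valid, rejected
```

Keeping rejected records together with the reasons they failed, rather than silently dropping them, is one simple way pipelines stay accurate, complete, and debuggable.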
Requirements:
Experience in a similar Data Integration, DevOps, or Systems Engineering role.
Strong experience with Python, Bash, PowerShell, or similar scripting languages.
Hands-on experience with hybrid IT environments: on-premises, SaaS, and cloud (especially AWS).
Experience with containers (e.g., Docker) and with cloud-based deployment and monitoring practices.
Proficiency with CI/CD tools and with managing code in source control systems (e.g., Git).
A strong understanding of secure, reliable data pipeline design and observability.
Familiarity with integrating and maintaining Splunk, including the development of custom apps.
The ability to gather requirements, communicate tradeoffs, and deliver complete solutions independently.
Strong communication and collaboration skills with both internal and external stakeholders.
Benefits:
The position offers the flexibility of remote work.
You will have the opportunity to work with a diverse and complex hybrid IT landscape.
The role allows for collaboration with both internal and external stakeholders, enhancing professional growth.
You will gain experience with cutting-edge technologies and practices in data integration and DevOps.
The position supports the development of skills in cloud services, scripting, and automation tools.