Description:
We are seeking a detail-oriented and technically skilled Data Pipeline Operations Engineer to manage and execute our weekly scanning process.
This critical role ensures the timely flow of customer data through our research, scanning, and UI ingest pipeline.
The ideal candidate has a mix of programming, database, and Linux system administration skills to handle the various steps in the scanning workflow.
Responsibilities:
Managing the weekly scanning process.
Preparing input files for each scan run.
Monitoring and troubleshooting jobs.
Performing data ingest.
Clearing data artifacts.
Executing post-ingest data refresh routines.
Performing quality checks.
Identifying process bottlenecks.
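For a concrete sense of how these steps might hang together in the orchestration tooling mentioned under Requirements, here is a minimal Airflow sketch of a weekly pipeline of this shape. It is an illustration only: the DAG name, task names, and callables are hypothetical, not our actual workflow.

```python
# Hypothetical sketch of a weekly scanning DAG; all names are illustrative only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def prepare_input_files(**context):
    """Build the input files for this week's scan run."""
    ...


def run_scans(**context):
    """Kick off and monitor the scanning jobs."""
    ...


def ingest_results(**context):
    """Load scan output into the UI ingest pipeline."""
    ...


def refresh_and_validate(**context):
    """Run post-ingest data refresh routines and quality checks."""
    ...


with DAG(
    dag_id="weekly_scan_pipeline",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@weekly",         # the post describes a weekly cadence;
                                         # newer Airflow versions use "schedule"
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=15)},
) as dag:
    prepare = PythonOperator(task_id="prepare_input_files",
                             python_callable=prepare_input_files)
    scan = PythonOperator(task_id="run_scans",
                          python_callable=run_scans)
    ingest = PythonOperator(task_id="ingest_results",
                            python_callable=ingest_results)
    refresh = PythonOperator(task_id="refresh_and_validate",
                             python_callable=refresh_and_validate)

    # Linear flow: prep -> scan -> ingest -> refresh/QA
    prepare >> scan >> ingest >> refresh
```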
Requirements:
Strong Linux command line skills are required.
Experience with Airflow or similar workflow orchestration tools is necessary.
Proficiency in Python programming is essential.
Advanced SQL knowledge for data ingest, refresh, and validation is required (see the validation sketch after this list).
The ability to diagnose and resolve issues with long-running batch processes is necessary.
Excellent attention to detail and problem-solving skills are required.
Good communication skills to coordinate with other teams are essential.
Flexibility to handle off-hours work when needed to meet SLAs is required.
Familiarity with network scanning tools and methodologies is preferred.
Experience optimizing database performance is a plus.
Scripting skills to automate routine tasks are beneficial.
Understanding of common network protocols and services is advantageous.
Knowledge of AWS services like EC2 is preferred.
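As a rough illustration of the SQL-driven validation mentioned above, the sketch below runs two post-ingest sanity checks. It assumes a PostgreSQL backend reachable via psycopg2, which the post does not specify, and the scan_results and customers tables and their columns are hypothetical names, not our schema.

```python
# Hypothetical post-ingest sanity checks; table and column names are illustrative only.
import psycopg2  # assuming a PostgreSQL backend, which the post does not specify

CHECKS = {
    # Every scan row should reference a known customer.
    "orphaned_rows": """
        SELECT COUNT(*)
        FROM scan_results sr
        LEFT JOIN customers c ON c.id = sr.customer_id
        WHERE c.id IS NULL
    """,
    # Duplicate (customer, target, week) rows usually indicate a double ingest.
    "duplicate_rows": """
        SELECT COUNT(*) FROM (
            SELECT customer_id, target, scan_week
            FROM scan_results
            GROUP BY customer_id, target, scan_week
            HAVING COUNT(*) > 1
        ) d
    """,
}


def run_checks(dsn: str) -> dict[str, int]:
    """Return the offending-row count for each named check (0 means pass)."""
    results = {}
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            results[name] = cur.fetchone()[0]
    return results


if __name__ == "__main__":
    for name, count in run_checks("dbname=scans").items():
        status = "OK" if count == 0 else f"FAILED ({count} rows)"
        print(f"{name}: {status}")
```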
Benefits:
Competitive compensation packages, including equity, are offered.
Employer-paid medical, dental, vision, disability, and life insurance is provided.
401(k) plans are available.
Flexible Spending Accounts for health care and dependent care expenses are offered.