Remote Senior Data Engineer (Snowflake)

at 3Pillar


Description:

  • Join 3Pillar as a Senior Data Engineer and embark on an exciting journey in software development.
  • Contribute to projects that reshape data analytics for clients, providing them with a competitive advantage.
  • Lead the design and implementation of scalable ETL/data pipelines using Python and Luigi for data processing (see the pipeline sketch after this list).
  • Ensure efficient data processing for high-volume clickstream, demographics, and business data.
  • Guide the team in adopting best practices for data pipeline development, code quality, and performance optimization.
  • Configure, deploy, and maintain AWS infrastructure, primarily EC2, S3, RDS, and EMR, to ensure scalability, availability, and security.
  • Support data storage and retrieval workflows using S3 and SQL-based storage solutions.
  • Provide architectural guidance for cloud-native data solutions and infrastructure design.
  • Oversee legacy framework maintenance, identify improvement areas, and propose comprehensive cloud migration or modernization plans.
  • Lead the strategic planning, execution, and optimization of large-scale data migration initiatives to Snowflake, ensuring data integrity, security, and minimal business disruption.
  • Coordinate infrastructure changes with stakeholders to align with business needs and budget constraints.
  • Develop and implement robust monitoring solutions to track system health, performance, and data pipeline accuracy.
  • Set up alerts and dashboards for proactive issue detection and collaborate with cross-functional teams to resolve critical issues.
  • Lead efforts in incident response, root cause analysis, and post-mortem processes for complex data engineering challenges.
  • Document workflows, troubleshooting procedures, and code for system transparency and continuity.
  • Provide mentoring and training to team members on best practices and technical skills.
  • Foster a culture of continuous learning, knowledge sharing, and technical excellence within the data engineering team.
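
For illustration only: a minimal sketch of the kind of Python/Luigi pipeline the role describes, extracting and aggregating daily clickstream data. The task names, file paths, and aggregation logic are hypothetical placeholders, not part of 3Pillar's actual codebase.

```python
import datetime
import json

import luigi


class ExtractClickstream(luigi.Task):
    """Pull one day of raw clickstream events into a local staging file."""

    date = luigi.DateParameter()

    def output(self):
        return luigi.LocalTarget(f"staging/clickstream_{self.date}.json")

    def run(self):
        # Placeholder extraction; a real task would read from S3 or an upstream API.
        events = [{"user_id": 1, "page": "/home", "date": str(self.date)}]
        with self.output().open("w") as fh:
            json.dump(events, fh)


class TransformClickstream(luigi.Task):
    """Aggregate raw events into per-page counts ready for loading."""

    date = luigi.DateParameter()

    def requires(self):
        return ExtractClickstream(date=self.date)

    def output(self):
        return luigi.LocalTarget(f"staging/pageviews_{self.date}.csv")

    def run(self):
        with self.input().open("r") as fh:
            events = json.load(fh)
        counts = {}
        for event in events:
            counts[event["page"]] = counts.get(event["page"], 0) + 1
        with self.output().open("w") as fh:
            for page, total in counts.items():
                fh.write(f"{page},{total}\n")


if __name__ == "__main__":
    # Run the pipeline for a single day using the local scheduler.
    luigi.build(
        [TransformClickstream(date=datetime.date(2024, 1, 1))],
        local_scheduler=True,
    )
```

In a production setting the same task graph would typically read from and write to S3 rather than local files, with the aggregated output feeding the Snowflake load described under Requirements.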

Requirements:

  • Candidates must have at least 6 years of experience in data engineering or a related technical field, including at least 4 years on Snowflake projects.
  • Proven expertise with Snowflake data warehousing is required, including schema design, efficient data loading (e.g., Snowpipe, COPY INTO), performance tuning, and robust access control mechanisms (a brief loading sketch follows this list).
  • Strong programming skills in Python for automation and data workflows are essential.
  • Expertise in managing SQL databases for data storage and query optimization is necessary.
  • Familiarity with monitoring solutions for real-time tracking and troubleshooting of data pipelines is required.
  • Experience with the AWS Cloud and data engineering services is a must.
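
As a rough illustration of the Snowflake loading expertise described above, the sketch below stages a local CSV and bulk-loads it with COPY INTO via the snowflake-connector-python package. The connection settings, warehouse, database, table, and file path are placeholder assumptions, not actual project objects.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Credentials come from the environment; all object names below are placeholders.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Upload the local file to the table's internal stage.
    cur.execute("PUT file:///tmp/pageviews_2024-01-01.csv @%PAGEVIEWS")
    # Bulk-load the staged file into the target table.
    cur.execute(
        """
        COPY INTO PAGEVIEWS
        FROM @%PAGEVIEWS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 0)
        ON_ERROR = 'ABORT_STATEMENT'
        """
    )
finally:
    conn.close()
```

For continuous ingestion the same COPY INTO definition would typically be wrapped in a Snowpipe rather than run ad hoc.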

Benefits:

  • Join a dynamic team focused on leveraging cutting-edge technologies to revolutionize industries.
  • Engage in thrilling projects that make a real-world impact in data analytics.
  • Opportunity for continuous learning and professional development within a culture of knowledge sharing and technical excellence.
  • Work in an environment that values innovation and data-driven decision making.