Remote Data Engineer

at Air Apps

Description:

  • As a Data Engineer at Air Apps, you will design, build, and optimize data pipelines, data warehouses, and data lakes to ensure efficient data processing and analytics.
  • You will work closely with data analysts, data scientists, and software engineers to create scalable, reliable data infrastructure that supports business intelligence and machine learning initiatives.
  • The role requires expertise in data architecture, ETL processes, and cloud-based data solutions for handling large volumes of structured and unstructured data.
  • Design, build, and maintain scalable data pipelines and ETL workflows to support analytics and reporting.
  • Develop and optimize data warehouses and data lakes on cloud platforms such as AWS, Google Cloud, or Azure.
  • Implement real-time and batch data processing solutions for a range of business needs.
  • Work with structured and unstructured data, ensuring sound data modeling and storage strategies.
  • Ensure data reliability, consistency, and scalability through architecture and engineering best practices.
  • Collaborate with data analysts, data scientists, and software engineers to enable efficient data access and analysis.
  • Automate data ingestion, transformation, and validation processes to improve data quality.
  • Monitor and optimize query performance and data processing efficiency.
  • Implement security, compliance, and governance standards for data storage and access control.
  • Stay current with emerging data engineering trends, tools, and technologies.

Requirements:

  • 4+ years of experience in data engineering, software engineering, or database management.
  • Proficiency in SQL, Python, or Scala for data processing and automation.
  • Hands-on experience with cloud-based data solutions such as AWS Redshift, Google BigQuery, Azure Synapse, or Snowflake.
  • Experience building ETL pipelines with tools such as Apache Airflow, dbt, Talend, or Fivetran.
  • A strong understanding of data modeling, schema design, and database optimization.
  • Experience with big data frameworks such as Apache Spark, Hadoop, Kafka, or Flink is a plus.
  • Familiarity with orchestration tools, containerization (Docker, Kubernetes), and CI/CD workflows.
  • Knowledge of data security, governance, and compliance (GDPR, CCPA, SOC 2).
  • Strong problem-solving and debugging skills, with the ability to handle large-scale data challenges.
  • Experience working with cross-functional teams in fast-paced, data-driven environments.

Benefits:

  • Remote-first approach with flexible working hours.
  • Apple hardware as part of the work ecosystem.
  • Annual bonus.
  • Medical insurance, including vision and dental coverage.
  • Short- and long-term disability insurance.
  • 401(k) plan with up to a 4% contribution.
  • Air Stipend of $3,120 per year, paid in 12 monthly installments, for home office, learning, wellness, and other expenses.
  • The opportunity to attend Air Conference 2025 in Las Vegas to meet the team, collaborate, and grow together.