Remote Senior Data Engineer

This job is closed

This job post is closed and the position has probably been filled. Please do not apply. It was automatically closed by a robot after the apply link was detected as broken.

Description:

  • Paddle offers a unique payment infrastructure for digital product companies, acting as a Merchant of Record to alleviate payment fragmentation.
  • The Senior Data Engineer will be responsible for delivering technical solutions to implement Paddle’s data strategy.
  • The role involves working with cross-functional teams to build practical solutions, create secure data pipelines, and prepare data for analysis.
  • Key responsibilities include designing, developing, and maintaining the company's data infrastructure to support data-driven decision-making and business operations.
  • The engineer will assist internal teams with operational data support requests and build efficient and secure data pipelines for smooth data flow.
  • The engineer will apply data transformation techniques to prepare data for analysis, using quality checks to ensure data accuracy, completeness, and consistency.
  • The role requires implementing DevOps practices for code deployment and maintenance in production environments.
  • Continuous evaluation and optimization of the data infrastructure's performance and scalability are expected.
  • Staying updated with the latest industry trends, tools, and technologies related to data engineering is essential.

Requirements:

  • Candidates should have good knowledge of database management systems, including relational (SQL), NoSQL, and columnar databases.
  • A solid understanding of data modeling, data warehousing, and ETL concepts is required.
  • Hands-on experience with cloud-based platforms such as AWS, GCP, or Azure is necessary.
  • Experience with data integration tools like Fivetran, Meltano, Airbyte, or Stitch is expected.
  • Familiarity with BI tools such as Looker, Tableau, or Sisense is important.
  • Knowledge of DBT, Apache Airflow, or similar tools is desirable.
  • Strong attention to detail is needed to identify and resolve data quality issues.
  • Excellent communication skills and the ability to collaborate effectively with cross-functional teams are essential.
  • Nice to have: Knowledge about infrastructure as code (IaC), experience implementing CI/CD for data pipelines, and a background in machine learning pipelines or MLOps.

Benefits:

  • Paddle offers a diverse and collaborative work culture that values transparency and respect.
  • Employees receive attractive salaries, stock options, retirement plans, and private healthcare.
  • Well-being initiatives are part of the benefits package.
  • The company is ‘digital-first’, allowing remote work, work from stylish hubs, or a combination of both.
  • Team members enjoy unlimited holidays and enhanced parental leave.
  • Paddle invests in learning and personal development through exposure to new challenges, an annual learning fund, and regular training opportunities.