Remote Data Engineer

at redbee


Description:

  • Join a curious, collaborative team that is passionate about its work and always strives to deliver its best.
  • At redbee, you will find room to grow and develop your skills in a collaborative environment that shares knowledge and focuses on the impact and value delivered to the business.
  • As a Senior Data Engineer at redbee, you will extract, clean, transform, and properly store the data required for the business units.
  • You will design and manage the data integration pipelines required for the business units.
  • You will manage database integrations, data pipelines, and data catalogs.
  • You will create new pipelines as needs arise in the business units.
  • You will develop high-quality tools/services for Advanced Analytics.
  • You will develop scalable, maintainable, testable, and future-proof software.
  • You will develop ETLs to transform data into useful and actionable information.
  • You will follow the CI/CD guidelines and policies currently in place in Advanced Analytics.
  • You will monitor the stability of the data pipelines and run quality controls on them.
  • You will continuously improve pipeline quality.
  • You will create, test, and maintain data pipeline architectures.
  • You will strive for cost and performance optimization, as well as information security.

Requirements:

  • Advanced knowledge of SQL and experience with relational databases are required.
  • Experience in developing and optimizing big data pipelines is necessary.
  • Strong analytical skills for working with unstructured data are expected.
  • Numerical/statistical skills for data interpretation are required.
  • Knowledge of message queuing, stream processing, and highly scalable big data stores is essential.
  • Experience working in and supporting cross-functional teams is required.
  • Experience with big data tools such as Hadoop, Spark, and Kafka is necessary.
  • Experience with SQL and NoSQL databases, such as Postgres and Cassandra, is required.
  • Experience with data pipelines and workflow management tools like Azkaban, Luigi, or Airflow is necessary.
  • Experience with AWS cloud services such as EC2, EMR, RDS, and Redshift (or their GCP equivalents) is required.
  • Experience with stream-processing systems such as Storm or Spark Streaming is necessary.
  • Proficiency in programming languages such as Python, Java, C++, or Scala is required.

Benefits:

  • You will be part of a team that innovates with technology and transforms clients, industries, methodologies, and people.
  • You will have the opportunity to think, build, and launch high-quality software solutions.
  • You will work in an environment that encourages continuous learning and skill development.
  • You will be part of a company that values collaboration and knowledge sharing.
  • You will have the chance to contribute to the development of technological products that have a significant impact.