Remote Big Data Software Engineer

This job is closed and the position has likely been filled. Please do not apply.

Description:

  • Design, develop, and maintain scalable and efficient big data infrastructure, including data storage, processing, and retrieval systems.
  • Develop algorithms, scripts, and pipelines for processing, cleaning, and analyzing large volumes of data from various sources.
  • Implement distributed computing frameworks and technologies (e.g., Hadoop, Apache Sqoop, Kafka, Apache Spark, Airflow) to process and analyze data in parallel across clusters of machines.
  • Develop data visualization tools and dashboards to present insights and findings in a clear and actionable manner for stakeholders.
  • Monitor the health and performance of big data systems, troubleshoot issues, and perform routine maintenance tasks to ensure system reliability and availability.
  • Collaborate with data scientists, analysts, and business stakeholders to understand requirements, gather feedback, and deliver solutions that meet business needs.
  • Stay informed about emerging technologies and trends in big data and contribute to research efforts to explore new techniques and tools for data processing and analysis.
  • Prepare comprehensive technical documentation for developed systems and provide ongoing technical support and guidance to team members as needed.

Requirements:

  • Bachelor's Degree in Computer Engineering/Science, or equivalent practical experience
  • Minimum of 2 years of Big Data engineering experience
  • In-depth knowledge of Hadoop, Apache Sqoop, Kafka, Apache Spark, Airflow, and similar frameworks
  • Good knowledge of Big Data querying tools such as Hive and HBase
  • Minimum 1 year of experience with Java
  • Minimum 1 year of experience with Python
  • Knowledge of scripting languages, including shell scripting and Python
  • Experience with Cloudera CDH/CDP installation, configuration, monitoring, cluster security, cluster resource management, maintenance, and performance tuning
  • Experience designing the architecture of a Big Data platform and monitoring and maintaining the environment using best practices
  • Good knowledge of relational databases, industry practices, techniques, and standards
  • Passion for learning about Big Data, new technologies, and open-source technologies
  • Creative and innovative problem-solving skills
  • Good team player with a results-oriented attitude and an analytical mind
  • Strong multitasking, time management, and stress management skills
  • Advanced level of English

Benefits:

  • Remote working and flexible time off
  • Opportunity to obtain company-paid professional certifications (Google Cloud Platform, Confluent Kafka, etc.)
  • Access to Online Training Platforms (Udemy, Pluralsight, A Cloud Guru, Coursera, etc.)
  • Opportunity to work on international projects
  • Private Health Insurance
  • Birthday Leave Policy
  • Dynamic work ecosystem where you can take initiative and responsibility
  • Open communication, flexibility and start-up spirit
  • Learning & Development opportunities for both personal and professional growth
Company: Oredata Yazılım Limited Şirketi (oredata.com)