Remote Data Scientist (Machine Learning, PySpark, Databricks, Long-Range Forecasting Focus)

Description:

  • Hakkoda, an IBM Company, is a modern data consultancy that empowers organizations to realize the full value of the Snowflake Data Cloud.
  • The company provides consulting and managed services in data architecture, data engineering, analytics, and data science.
  • Hakkoda is seeking a skilled Data Scientist with 2 to 5 years of experience, specializing in Machine Learning, PySpark, and Databricks, with a focus on long-range demand and sales forecasting.
  • This role is crucial for developing and implementing an automotive OEM’s next-generation Intelligent Forecast Application.
  • Responsibilities include designing, developing, and implementing scalable machine learning models for long-range forecasting challenges.
  • The position involves applying advanced time series analysis techniques and integrating them with machine learning models using PySpark on the Databricks platform.
  • The candidate will implement probabilistic forecasting methods and develop robust solutions for hierarchical and grouped long-range forecasting (a minimal illustrative sketch follows this list).
  • Building and optimizing large-scale data pipelines for processing automotive datasets is a key task.
  • The role requires collaboration with Data Engineering and IT Operations for seamless deployment and operational efficiency.
  • The candidate will evaluate model performance using relevant metrics and optimize models for improved accuracy and efficiency.
  • Clear communication of technical details and forecasting results within the team is essential.
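
For illustration only, here is a minimal sketch of the kind of grouped long-range forecasting described above, using PySpark's applyInPandas on Databricks. The table name (demand_history), the column names (model, region, month, units), the 36-month horizon, the seasonal-naive baseline, and the output table (long_range_forecast) are assumptions made for the sketch, not details taken from the posting.

    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType, StructField, StringType, DateType, DoubleType

    spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

    out_schema = StructType([
        StructField("model", StringType()),
        StructField("region", StringType()),
        StructField("month", DateType()),
        StructField("forecast_units", DoubleType()),
    ])

    def forecast_series(pdf: pd.DataFrame) -> pd.DataFrame:
        # Fit a seasonal-naive baseline to one (model, region) series and
        # project a 36-month horizon by repeating the last observed year.
        pdf = pdf.sort_values("month")
        history = pdf["units"].to_numpy()
        season = history[-12:] if len(history) >= 12 else history
        horizon = 36
        future = pd.date_range(pdf["month"].max(), periods=horizon + 1, freq="MS")[1:]
        return pd.DataFrame({
            "model": pdf["model"].iloc[0],
            "region": pdf["region"].iloc[0],
            "month": future.date,
            "forecast_units": [float(season[i % len(season)]) for i in range(horizon)],
        })

    sales = spark.table("demand_history")  # hypothetical Delta table of monthly unit sales
    forecasts = sales.groupBy("model", "region").applyInPandas(forecast_series, schema=out_schema)
    forecasts.write.mode("overwrite").saveAsTable("long_range_forecast")

In practice, the per-series baseline would be swapped for the probabilistic and hierarchical models the role calls for; applyInPandas is shown only as the mechanism for training one model per series in parallel on Databricks.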

Requirements:

  • A Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Applied Mathematics, or a closely related quantitative field is required.
  • The candidate must have 2 to 5 years of hands-on experience in a Data Scientist or Machine Learning Engineer role.
  • Proven experience in developing and deploying machine learning models in a production environment is necessary.
  • Demonstrated experience in long-range demand and sales forecasting is required.
  • Expert, hands-on proficiency in PySpark for large-scale data processing and machine learning is essential.
  • Expert-level practical experience with the Databricks platform, including notebooks, jobs, and ML capabilities, is required.
  • Strong proficiency in Python and SQL is necessary.
  • Experience with machine learning libraries compatible with PySpark is required.
  • Familiarity with advanced time series forecasting techniques and distributed computing concepts is essential.
  • Hands-on experience running Databricks on a major cloud provider (Azure, AWS, or GCP) is required.
  • Familiarity with MLOps concepts and tools used in a Databricks environment is necessary.
  • Experience with data visualization tools and strong analytical skills are required.
  • The candidate must be able to troubleshoot and solve complex technical problems related to big data and machine learning workflows.

Benefits:

  • The position offers medical, dental, and vision insurance.
  • Life insurance and paid parental leave are included in the benefits package.
  • Flexible PTO options are available for employees.
  • A company bonus program is part of the compensation structure.
  • Work-from-home benefits are provided.
  • Technical training and certifications are offered to enhance employee skills.
  • Robust learning and development opportunities are available.
  • Employees may have the opportunity for a trip to Costa Rica.
Apply now
Please let Hakkoda know you found this job on RemoteYeah. This helps us grow 🌱.