Description:
ARFA Solutions, LLC is seeking a skilled DevOps (MLOps) Engineer to enhance machine learning operations.
The MLOps Engineer will be responsible for deploying machine learning models into production and automating workflows.
The role involves implementing best practices for continuous integration and continuous delivery (CI/CD) in MLOps.
Close collaboration with data scientists, developers, and IT operations teams is essential to keeping these workflows running smoothly.
Responsibilities include designing, implementing, and managing MLOps pipelines for deploying and monitoring machine learning models.
The engineer will collaborate with data scientists to understand model requirements and operationalize models into production environments.
Automating model training, testing, and deployment through CI/CD practices is a key part of the role.
The engineer will monitor and evaluate model performance in production and implement improvements where needed.
The role includes implementing data versioning, model versioning, and tracking systems to ensure reproducibility.
Ensuring security and compliance in the deployment of machine learning models is crucial.
The engineer is expected to document processes and improve collaboration between teams for efficient workflows.
Staying up to date with industry trends and technologies in MLOps and machine learning is also expected.
Requirements:
A Bachelor's degree in Computer Science, Software Engineering, Data Science, or a related field is required.
Candidates must have 5+ years of experience in DevOps or MLOps roles.
Hands-on experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn is essential.
Proficiency in programming languages such as Python, R, or Java is required.
A strong understanding of cloud platforms (AWS, Azure, or GCP) and their machine learning services is necessary.
Experience with containerization technologies, particularly Docker, and orchestration tools like Kubernetes is required.
Knowledge of CI/CD tools like Jenkins, GitLab CI/CD, or Azure DevOps is essential.
Familiarity with data pipeline tools and frameworks, such as Apache Airflow or Kubeflow, is required.
Strong problem-solving and analytical skills are necessary.
Excellent communication and teamwork abilities are essential for this role.
Benefits:
The position offers a remote work environment.
Candidates can choose between a 1099 and a corp-to-corp (C2C) employment arrangement.