Posted on: 05/11/2025
Responsibilities:
- Manage Data: Extract, clean, and organize both structured and unstructured data.
- Coordinate Pipelines: Use tools such as Airflow, Step Functions, or Azure Data Factory to orchestrate data workflows.
- Deploy Models: Develop, fine-tune, and deploy models using platforms like SageMaker, Azure ML, or Vertex AI.
- Scale Solutions: Leverage Spark or Databricks to handle large-scale data processing tasks.
- Automate Processes: Implement automation using tools like Docker, Kubernetes, CI/CD pipelines, MLflow, Seldon, and Kubeflow.
- Collaborate Effectively: Work alongside engineers, architects, and business stakeholders to solve real-world problems efficiently.
Requirements:
- 3+ years of hands-on experience in MLOps (4-5 years of overall software development experience).
- Extensive experience with at least one major cloud provider (AWS, Azure, or GCP).
- Proficiency with Databricks, Spark, Python, SQL, TensorFlow, PyTorch, and scikit-learn.
- Expertise in debugging Kubernetes workloads and writing efficient Dockerfiles.
- Experience in prototyping with open-source tools and scaling solutions effectively.
- Strong analytical skills, humility, and a proactive approach to problem-solving.
- Experience with SageMaker, Azure ML, or Vertex AI in a production environment.
- Commitment to writing clean code, creating clear documentation, and keeping pull requests concise.