Posted on: 02/02/2026
Description:
Key Responsibilities:
- Develop, optimize, and maintain high-quality Python applications and scripts.
- Build integrations with cloud services (AWS/Azure/GCP) and implement cloud-based solutions.
- Work with relational and NoSQL databases for data modeling, ETL processes, and performance tuning.
- Develop and maintain pipelines in Databricks (PySpark/Notebook-based workflows); a minimal PySpark sketch follows this list.
- Implement workflow orchestration using Airflow or similar tools.
- Collaborate with cross-functional teams to understand requirements and deliver reliable solutions.
- Ensure code quality, scalability, and security through best practices and code reviews.
- Monitor, troubleshoot, and enhance existing pipelines and applications.
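To give a concrete flavor of the Databricks responsibility above, here is a minimal PySpark ETL sketch. It is illustrative only: the paths, input layout, and column names (event_ts, user_id) are assumptions, not part of this role's actual stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Extract: read raw JSON events (path is a placeholder).
raw = spark.read.json("/mnt/raw/events/")

# Transform: derive an event date, drop incomplete rows,
# and count events per user per day.
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
    .dropna(subset=["user_id", "event_date"])
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Load: write the aggregate back out, partitioned by day
# (Delta tables are the more common target on Databricks).
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "/mnt/curated/daily_events/"
)
```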
Required Skills:
- Strong proficiency in Python and its standard library.
- Experience with cloud platforms (AWS/Azure/GCP).
- Solid understanding of databases (SQL, NoSQL, query optimization).
- Hands-on experience with Databricks and PySpark.
- Experience with Airflow or other orchestration tools (Prefect, Luigi, etc.); a minimal DAG sketch follows this list.
- Familiarity with CI/CD pipelines and version control tools (Git).
- Strong analytical, debugging, and problem-solving skills.
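For the orchestration skills above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ (for the schedule argument); the DAG id, schedule, and task callables are placeholders, not a prescribed design.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for a real extraction step (e.g. pulling from an API or bucket).
    print("extracting source data")


def transform():
    # Placeholder for a real transformation step (e.g. a Databricks/PySpark job).
    print("transforming data")


with DAG(
    dag_id="daily_events_etl",  # illustrative name, not a real pipeline
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # the 'schedule' parameter assumes Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # transform runs only after extract succeeds
    extract_task >> transform_task
```

In practice, each PythonOperator would trigger a real extract or transform job, for example a Databricks job run, rather than printing.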
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1608664