Posted on: 17/09/2025
The Role:
We are seeking a seasoned Senior Python Data Engineer with substantial experience in cloud technologies.
As a key member of our data engineering team, you will design, implement, and optimize data pipelines and ensure their seamless integration with cloud platforms.
The ideal candidate will have a strong command of Python and data engineering principles, along with a proven track record of delivering scalable solutions in cloud environments.
Responsibilities:
Data Pipeline Development:
- Implement Extract, Transform, Load (ETL) processes to move data from diverse sources into our cloud-based data warehouse (an illustrative sketch follows the responsibilities list).
Cloud Integration:
- Leverage cloud-native services for storage, processing, and analysis of large datasets.
Data Modeling and Architecture:
- Ensure the scalability, reliability, and performance of the overall data infrastructure on cloud platforms.
Optimization and Performance:
- Monitor and troubleshoot issues, ensuring timely resolution and minimal impact on data availability.
Quality Assurance:
- Collaborate with cross-functional teams to identify and address data quality issues.
Collaboration and Communication:
- Work with other engineering teams to integrate data engineering solutions into larger cloud-based systems.
Documentation:
- Create and maintain comprehensive documentation for data engineering processes, cloud architecture, and pipelines.
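For illustration only, here is a minimal sketch of the kind of ETL pipeline these responsibilities describe, written with Apache Airflow's TaskFlow API (Airflow 2.4+ assumed). The DAG name, schedule, and record fields are hypothetical placeholders, not details of our actual stack.

```python
# A minimal, illustrative ETL DAG using Apache Airflow's TaskFlow API
# (Airflow 2.4+ assumed). The DAG name, schedule, and record fields are
# hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_etl_pipeline():
    @task
    def extract():
        # Pull raw records from a hypothetical source system.
        return [{"id": 1, "amount": "19.99"}, {"id": 2, "amount": "5.00"}]

    @task
    def transform(records):
        # Cast fields to the types expected by the warehouse schema.
        return [{"id": r["id"], "amount": float(r["amount"])} for r in records]

    @task
    def load(records):
        # A real pipeline would write to the cloud data warehouse here,
        # e.g. via a provider hook; this sketch just logs the row count.
        print(f"Loaded {len(records)} rows")

    load(transform(extract()))


example_etl_pipeline()
```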
Technical Skills:
Programming Languages: Proficiency in Python for data engineering tasks, scripting, and automation.
Data Engineering Technologies:
- Understanding and hands-on experience with workflow management tools such as Apache Airflow.
Cloud Platforms:
- Familiarity with cloud-native services for data processing, storage, and analytics.
ETL Processes: Proven expertise in designing and implementing Extract, Transform, Load (ETL) processes.
SQL and Databases: Proficient in SQL, with experience working with relational databases (e.g., PostgreSQL, MySQL) and cloud-based database services.
Data Modeling: Strong understanding of data modeling principles and experience in designing effective data models.
Version Control: Familiarity with version control systems, such as Git, for tracking changes in code and configurations.
Collaboration Tools: Experience using collaboration and project management tools for effective communication and project tracking.
Containerization and Orchestration: Familiarity with containerization technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes).
Monitoring and Troubleshooting: Ability to implement monitoring solutions and troubleshoot issues in data pipelines.
Data Quality Assurance: Experience in implementing data quality checks and validation processes (a brief example follows this list).
Agile Methodologies: Familiarity with agile development methodologies and practices.
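The following is a brief, hypothetical example of the kind of data quality check referred to under "Data Quality Assurance" above: validating a batch of records before loading and rejecting rows with missing keys or invalid amounts. The field names and rules are illustrative only, not part of our actual pipeline.

```python
# Hypothetical data quality check: split a batch of records into valid and
# rejected rows before loading. Field names and rules are illustrative only.

def validate_records(records):
    """Split records into (valid, rejected) lists based on simple quality rules."""
    valid, rejected = [], []
    for row in records:
        if row.get("id") is None:
            rejected.append({**row, "_reason": "missing id"})
        elif not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            rejected.append({**row, "_reason": "invalid amount"})
        else:
            valid.append(row)
    return valid, rejected


if __name__ == "__main__":
    good, bad = validate_records(
        [{"id": 1, "amount": 10.0}, {"id": None, "amount": 3.5}, {"id": 2, "amount": -1}]
    )
    print(f"{len(good)} valid rows, {len(bad)} rejected rows")
```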
Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills, both written and verbal.
- Ability to work collaboratively in a cross-functional team environment.
- Attention to detail and commitment to delivering high-quality solutions.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1548182