Posted on: 14/09/2025
Key Responsibilities:
- Design, develop, test, and maintain robust data architectures, pipelines, and ETL processes.
- Ensure data quality, integrity, and security across systems and workflows.
- Optimize data systems for performance, scalability, and cost-efficiency.
- Collaborate with cross-functional teams to gather requirements and enable data-driven analytics and insights.
- Monitor, troubleshoot, and resolve issues within data workflows and systems.
- Maintain documentation for data processes, standards, and best practices.
Required Skills & Qualifications:
- Proven experience with data pipeline tools and ETL frameworks (e.g., Apache Airflow, Talend, NiFi).
- Strong proficiency in SQL and experience with relational/non-relational databases (e.g., MySQL, PostgreSQL, MongoDB).
- Hands-on experience with big data tools (e.g., Hadoop, Spark, Kafka).
- Proficiency in Python/Java/Scala for data processing.
- Knowledge of cloud platforms (AWS, Azure, GCP) and their data services.
- Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
- Strong problem-solving, troubleshooting, and communication skills.
Preferred Skills (Good to Have):
- Knowledge of machine learning pipelines and integration.
- Exposure to data governance and compliance practices.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1545708