Posted on: 10/09/2025
Key Responsibilities:
- Design, develop, and maintain robust, scalable, and efficient data pipelines.
- Work with structured and unstructured data to enable analytics and reporting.
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver data-driven solutions.
- Optimize and automate ETL processes to ensure high data quality and integrity (a minimal sketch follows this list).
- Develop and manage data models, schemas, and integrations with data warehouses/lakes.
- Implement data governance, security, and compliance best practices.
- Monitor, troubleshoot, and improve performance of data systems.
- Stay updated on emerging trends and technologies in the data engineering space.
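For concreteness, here is a minimal ETL sketch in Python of the extract/transform/load pattern this role centers on. It is purely illustrative: the source file (orders.csv), its columns, and the local SQLite target standing in for a real warehouse are all assumptions, not this team's actual stack.

    import csv
    import sqlite3

    def extract(path):
        # Read raw rows from a CSV source (path and schema are assumed).
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Enforce basic data quality: coerce types, drop incomplete rows.
        clean = []
        for row in rows:
            if row.get("order_id") and row.get("amount"):
                clean.append((int(row["order_id"]), float(row["amount"])))
        return clean

    def load(rows, db_path="warehouse.db"):
        # Idempotent load into SQLite, standing in for a warehouse table.
        con = sqlite3.connect(db_path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
        )
        con.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("orders.csv")))

The INSERT OR REPLACE keyed on order_id makes reruns safe, which is the property that matters when a pipeline is retried.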
Required Skills & Qualifications :
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- 7+ years of experience in data engineering roles.
- Strong expertise in SQL, Python, and ETL development.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Proficiency in big data technologies (Hadoop, Spark, Kafka, etc.); see the Spark sketch after this list.
- Experience with data warehousing solutions (Snowflake, Redshift, BigQuery, etc.).
- Strong understanding of data modeling, data lakes, and data pipelines.
- Familiarity with Docker/Kubernetes for deployment and containerization.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
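As a rough illustration of the SQL/Python/Spark combination listed above, this PySpark sketch computes a per-customer revenue aggregate and writes it back to a lake path. The bucket, event schema, and column names are invented for the example.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

    # Read raw events (path and schema are assumptions).
    events = spark.read.json("s3a://example-bucket/events/2025-09-10/")

    # SQL-style aggregation: revenue per customer from purchase events.
    daily = (
        events
        .filter(F.col("event_type") == "purchase")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("revenue"))
    )

    # Write Parquet back to the lake for downstream warehouse loads.
    daily.write.mode("overwrite").parquet("s3a://example-bucket/marts/daily_revenue/")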
Good to Have:
- Experience with workflow orchestration tools (Airflow, Luigi, Prefect); a sample DAG follows this list.
- Knowledge of machine learning pipelines and integration.
- Exposure to NoSQL databases (MongoDB, Cassandra).
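For the orchestration tools above, a minimal Airflow DAG might look like the sketch below (assuming Airflow 2.4+, where the schedule argument replaced schedule_interval; the task bodies are stubs):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        ...  # pull from the source system

    def load():
        ...  # push transformed data to the warehouse

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2025, 9, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_load  # extract must succeed before load runs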
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1544198