Posted on: 05/11/2025
We are seeking a skilled Data Engineer to design, build, and maintain efficient data pipelines and architectures that support our analytics and business intelligence initiatives. The ideal candidate should have hands-on experience in data integration, ETL development, and cloud-based data platforms.
Key Responsibilities:
- Design and implement data pipelines for ingestion, transformation, and storage of large datasets (an illustrative sketch follows this list).
- Develop and maintain ETL processes to ensure data quality, consistency, and availability.
- Collaborate with data analysts, scientists, and business teams to support data-driven initiatives.
- Optimize data workflows for scalability and performance.
- Manage and monitor data warehouses and data lakes on cloud platforms.
- Ensure compliance with data governance, security, and privacy standards.
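To give a flavor of the pipeline work described above, below is a minimal PySpark batch-transformation sketch: ingest raw events from object storage, aggregate them by day, and write the result back as partitioned Parquet. The bucket paths, column names, and the "daily purchases" dataset are hypothetical and used only for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal illustrative batch job: ingest raw events, transform, store aggregates.
# All paths and column names below are hypothetical placeholders.
spark = SparkSession.builder.appName("daily_purchase_aggregates").getOrCreate()

# Ingest: read raw event data from object storage (hypothetical path).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: keep purchase events and aggregate count and revenue per day.
daily = (
    events
    .filter(F.col("event_type") == "purchase")
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(
        F.count("*").alias("purchase_count"),
        F.sum("amount").alias("revenue"),
    )
)

# Store: write the aggregates back as partitioned Parquet (hypothetical path).
(
    daily.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/daily_purchases/")
)
```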
Technical Skills:
- Proficiency in Python, SQL, and ETL tools (e.g., Informatica, Talend, or Apache Airflow); a minimal Airflow sketch follows this list.
- Experience with big data technologies such as Spark, Hadoop, or Kafka.
- Working knowledge of cloud platforms (AWS, Azure, or GCP).
- Familiarity with data warehousing concepts and tools like Redshift, Snowflake, or BigQuery.
- Experience with data modeling, schema design, and API integrations.
- Understanding of version control (Git) and CI/CD pipelines.
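As a small illustration of the orchestration tooling listed above, the following is a minimal sketch of an Apache Airflow DAG wiring extract, transform, and load steps together. The DAG id, task names, and callables are hypothetical placeholders; operator choice and scheduling would depend on the actual platform.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical placeholder callables for each ETL stage.
def extract():
    pass  # e.g., pull data from a source system or API

def transform():
    pass  # e.g., clean, validate, and reshape the extracted data

def load():
    pass  # e.g., write the transformed data to the warehouse

# Minimal daily ETL DAG: extract -> transform -> load.
with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # "schedule_interval" on older Airflow 2.x releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```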
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1570294