Posted on: 06/08/2025
We are seeking a skilled Data Engineer to design, build, and maintain robust and scalable data pipelines. The ideal candidate will have a solid understanding of data modeling, data warehousing, and both batch and streaming data processing. You should be comfortable working with SQL, ETL tools, and cloud-based environments, and possess a keen attention to detail in managing data quality and integrity.
Key Responsibilities:
- Work on data modeling to support analytics and business intelligence use cases.
- Build and manage data warehouses and data lakes.
- Develop and maintain ETL/ELT workflows to move and transform data across systems.
- Collaborate with analytics and product teams to understand data needs and deliver solutions.
- Ensure data quality, consistency, and governance across systems.
- Optimize data storage and query performance.
- Work with structured and semi-structured data from various sources (APIs, GSheets, etc.).
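As a minimal illustration of the ETL/ELT work described above (the API endpoint, warehouse connection string, and staging table name are hypothetical placeholders, not specifics of this role), a batch job might pull semi-structured JSON from an API, flatten it with pandas, and load it into a warehouse staging table:

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

# Hypothetical source API and warehouse target -- placeholders for illustration only.
API_URL = "https://api.example.com/v1/orders"
WAREHOUSE_URI = "postgresql://user:password@warehouse-host:5432/analytics"
TARGET_TABLE = "stg_orders"

def extract() -> list[dict]:
    """Pull semi-structured JSON records from the source API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> pd.DataFrame:
    """Flatten nested JSON and enforce basic types before loading."""
    df = pd.json_normalize(records)
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    return df.drop_duplicates(subset=["order_id"])

def load(df: pd.DataFrame) -> None:
    """Append the cleaned batch into a staging table in the warehouse."""
    engine = create_engine(WAREHOUSE_URI)
    df.to_sql(TARGET_TABLE, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```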
Required Skills & Experience:
- Strong experience in building and maintaining ETL/ELT processes.
- Knowledge of data modeling and data warehousing best practices.
- Experience with batch and streaming data processing (e.g., using Spark or Kafka); see the streaming sketch after this list.
- Proficiency in SQL and working knowledge of Google Sheets for data handling and automation.
- Familiarity with cloud platforms (AWS, GCP, or Azure) is a plus.
- Excellent problem-solving skills and attention to detail.
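For the streaming side of the role, one common pattern (shown only as a sketch; the Kafka topic, event schema, and lake paths below are hypothetical) is Spark Structured Streaming consuming JSON events from Kafka and writing them to a data lake with checkpointing:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical event schema and Kafka topic -- placeholders for illustration only.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers the payload as bytes; parse the JSON value into typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write micro-batches to a partitioned lake location, checkpointing for recovery.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://data-lake/events/")
    .option("checkpointLocation", "s3a://data-lake/_checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```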
Good to Have:
- Knowledge of Python or other scripting languages for data manipulation.
- Exposure to data governance and data quality frameworks.
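On the data quality side, even without a dedicated framework, lightweight checks can be scripted in Python; a minimal sketch (the column names are hypothetical) of the kind of validation a pipeline step might run before publishing a batch:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    # Completeness: key columns must not contain nulls (hypothetical column names).
    for column in ("order_id", "order_date"):
        null_count = int(df[column].isna().sum())
        if null_count:
            failures.append(f"{column}: {null_count} null values")
    # Uniqueness: the business key must not be duplicated.
    duplicates = int(df.duplicated(subset=["order_id"]).sum())
    if duplicates:
        failures.append(f"order_id: {duplicates} duplicate rows")
    return failures

if __name__ == "__main__":
    batch = pd.DataFrame(
        {"order_id": [1, 2, 2], "order_date": ["2025-01-01", None, "2025-01-03"]}
    )
    problems = run_quality_checks(batch)
    if problems:
        raise ValueError("Data quality checks failed: " + "; ".join(problems))
```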
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1525547