Posted on: 30/10/2025
Key Responsibilities & Technical Deliverables:
1. Pipeline Development: Architect, build, and optimize complex data pipelines and ETL/ELT workflows to ingest, process, and transform massive datasets from diverse sources (a minimal Python sketch follows this list).
2. Automation & Transformation: Develop automation scripts and complex data transformations, primarily using Python or Groovy.
3. Data Modeling: Design and implement optimized data warehouse models to support high-performance enterprise analytics and reporting.
4. Infrastructure Management: Manage and maintain data infrastructure across AWS cloud environments and existing on-premise setups, ensuring seamless hybrid operations.
5. Data Quality: Ensure the highest standards of data accuracy, performance, and reliability at scale, implementing rigorous testing and validation protocols.
6. Technology & Collaboration: Work on high-impact data infrastructure projects in a fast-paced, collaborative environment.
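As a rough illustration of the pipeline work described in item 1, here is a minimal extract-transform-load sketch in Python. The bucket, key, output path, and `event_time` column are hypothetical placeholders, not details from this posting, and writing Parquet to S3 assumes `s3fs` and a Parquet engine are installed.

```python
import io

import boto3          # AWS SDK for Python
import pandas as pd

# Hypothetical source and destination; replace with real locations.
SOURCE_BUCKET = "example-raw-events"
SOURCE_KEY = "2025/10/30/events.csv"
OUTPUT_PATH = "s3://example-warehouse/events/2025-10-30.parquet"


def extract() -> pd.DataFrame:
    """Pull a raw CSV file from S3 into a DataFrame."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Simple cleaning: drop duplicates and normalize a (hypothetical) timestamp column."""
    df = df.drop_duplicates()
    df["event_time"] = pd.to_datetime(df["event_time"], utc=True)
    return df


def load(df: pd.DataFrame) -> None:
    """Write the cleaned data back to S3 as Parquet (requires s3fs)."""
    df.to_parquet(OUTPUT_PATH, index=False)


if __name__ == "__main__":
    load(transform(extract()))
```

In a production pipeline, each stage would typically be a separate, independently retryable task rather than a single script.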
Required Skills & Technical Expertise:
- 6-10 years of experience in Data Engineering or a closely related quantitative role.
- Strong proficiency in SQL for data manipulation and optimization, and in Python for development and automation (see the sketch after this list).
- Hands-on experience with AWS cloud services and proven experience with data warehousing solutions (e.g., Redshift, Trino, or advanced PostgreSQL).
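To illustrate the combined SQL-and-Python expectation, here is a minimal sketch of running a parameterized aggregate query against PostgreSQL from Python with psycopg2. The connection string, table, and column names are hypothetical.

```python
import psycopg2

# Hypothetical connection details; replace with real credentials.
DSN = "host=localhost dbname=warehouse user=analyst password=secret"

QUERY = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM fact_orders                -- hypothetical fact table
    WHERE order_date >= %s
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT %s;
"""


def top_customers(since: str, limit: int = 10) -> list[tuple]:
    """Run a parameterized aggregate query and return the result rows."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY, (since, limit))
            return cur.fetchall()


if __name__ == "__main__":
    for customer_id, total in top_customers("2025-01-01"):
        print(customer_id, total)
```

Passing parameters via `%s` placeholders (rather than string formatting) lets the driver handle quoting and avoids SQL injection.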
Nice-to-Have Qualifications:
- Orchestration: Familiarity with workflow orchestration tools like Airflow or NiFi (a minimal Airflow sketch follows this list).
- Real-Time Processing: Experience with streaming platforms like Kafka, Spark, or Flink.
- Containerization: Working knowledge of Docker and Kubernetes for deployment and container management.
- BI & Analytics Tools: Experience with platforms such as Athena or Metabase.
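As a rough sketch of the orchestration work mentioned above, the following is a minimal Airflow 2.x DAG wiring two Python tasks in sequence. The DAG id, schedule, and task bodies are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw data from a source system."""
    print("extracting...")


def transform():
    """Placeholder: clean and reshape the extracted data."""
    print("transforming...")


with DAG(
    dag_id="example_daily_etl",    # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run transform only after extract succeeds
```

The `>>` operator declares the task dependency, so the scheduler handles ordering, retries, and backfills rather than the tasks themselves.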
Functional Area: Data Engineering
Job Code: 1567689