Posted on: 22/10/2025
Job Description:
Key Responsibilities:
- Develop and maintain scalable Python-based data pipelines and ETL workflows on AWS (S3, Glue, Lambda, Redshift, Athena); a brief sketch of such a pipeline step follows the lists below.
Required Skills:
- Strong Python programming skills (Pandas, PySpark, GeoPandas, Airflow).
- Hands-on experience with AWS cloud data services and architecture.
- Experience processing geospatial and insurance data.
- Knowledge of risk-modeling data workflows is preferred.
- Strong SQL, data modeling, and distributed-systems skills.
Nice to Have:
- Experience with Kafka, Spark, Docker, Kubernetes.
- Understanding of data governance and security in insurance/geospatial sectors.
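For context, here is a minimal sketch of the kind of Python ETL step the role describes, assuming the Pandas + S3 stack named in the posting. The bucket, object keys, and function name are hypothetical placeholders, not details from the role:

    import boto3
    import pandas as pd

    def run_etl_step(bucket: str, raw_key: str, curated_key: str) -> None:
        # Extract: stream the raw CSV object from S3 into a DataFrame.
        s3 = boto3.client("s3")
        obj = s3.get_object(Bucket=bucket, Key=raw_key)
        df = pd.read_csv(obj["Body"])
        # Transform: drop duplicate rows and normalize column names.
        df = df.drop_duplicates()
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        # Load: write Parquet back to S3 for downstream Athena/Redshift queries
        # (writing to an s3:// path requires the s3fs and pyarrow packages).
        df.to_parquet(f"s3://{bucket}/{curated_key}", index=False)

    if __name__ == "__main__":
        run_etl_step("example-bucket", "raw/policies.csv", "curated/policies.parquet")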
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1563737