Posted on: 29/10/2025
Key Responsibilities:
- Design, develop, and maintain efficient ETL pipelines using Python, SQL, AWS Glue, and Redshift.
- Automate data flows and integrations using AWS Lambda and other serverless services (a minimal sketch follows this list).
- Propose and implement improvements to existing pipelines for better performance, scalability, and maintainability.
- Collaborate on designing scalable, resilient data architectures in AWS.
- Manage infrastructure as code using Terraform.
- Participate in code reviews and use Git for collaborative version control.
- Document technical solutions and promote data engineering best practices.
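For illustration only (not part of the role description), a minimal sketch of the kind of Lambda-based automation listed above, assuming an S3 put event triggers a Redshift COPY through the Redshift Data API; the bucket, table, cluster, role, and secret names are placeholders, not details from this posting.

    # Hypothetical Lambda handler: load newly landed S3 objects into Redshift.
    import json
    import logging
    import boto3

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    redshift_data = boto3.client("redshift-data")

    def handler(event, context):
        # S3 put events arrive as a list of records; submit one COPY per object.
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            copy_sql = (
                "COPY analytics.raw_events "                              # placeholder table
                f"FROM 's3://{bucket}/{key}' "
                "IAM_ROLE 'arn:aws:iam::123456789012:role/copy-role' "    # placeholder role
                "FORMAT AS PARQUET;"
            )
            try:
                resp = redshift_data.execute_statement(
                    ClusterIdentifier="analytics-cluster",                # placeholder cluster
                    Database="analytics",                                 # placeholder database
                    SecretArn="arn:aws:secretsmanager:region:acct:secret:redshift",  # placeholder credentials
                    Sql=copy_sql,
                )
                logger.info("COPY submitted, statement id %s", resp["Id"])
            except Exception:
                logger.exception("COPY failed for s3://%s/%s", bucket, key)
                raise
        return {"statusCode": 200, "body": json.dumps("ok")}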
Technical Requirements:
- Python: 3+ years (intermediate to advanced level).
- SQL: 3+ years, strong with complex data models.
- Big Data / PySpark: 1-2+ years (preferred).
- AWS (Lambda, Glue, Redshift, S3): 2-3 years hands-on.
- Terraform: 1-2 years (intermediate).
- Git: Daily use in collaborative environments.
- Strong focus on automation, error handling, logging, and testing in data pipelines (see the sketch after this list).
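As an illustration of the error handling, logging, and testing emphasis above, a minimal sketch of a pipeline transform with a pytest-style test; the function, field, and test names are hypothetical, not taken from this posting.

    # Hypothetical transform step with logging plus a unit test.
    import logging

    logger = logging.getLogger("pipeline")

    def normalise_amounts(rows):
        """Convert raw amount strings to floats, logging and skipping bad records."""
        cleaned = []
        for row in rows:
            try:
                cleaned.append({**row, "amount": float(row["amount"])})
            except (KeyError, ValueError):
                logger.warning("Dropping malformed row: %r", row)
        return cleaned

    def test_normalise_amounts_skips_bad_rows():
        rows = [{"amount": "10.5"}, {"amount": "oops"}, {}]
        assert normalise_amounts(rows) == [{"amount": 10.5}]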
Professional Profile:
- Proactive and improvement-driven mindset.
- Analytical thinker with strong problem-solving skills.
- Clear communicator with both technical and non-technical teams.
- Strong documentation habits.
Nice to Have:
- Experience with CloudWatch or other monitoring tools.
- Knowledge of event-driven architectures.
- Familiarity with Apache Airflow or other orchestration tools.
- Background in FinTech or handling financial data.
- AWS certifications are a plus.
Location: Coimbatore.