hirist

AWS Data Architect - Python/Apache Airflow

Vision Excel Career Solutions
Hyderabad
10 - 15 Years

Posted on: 26/11/2025

Job Description

Role: AWS Data Architect

The candidate should possess a deep understanding of big data technologies, cloud services, and data architecture, with a proven track record of leading data-driven projects to successful completion.

Key Responsibilities:

- Lead a team of data engineers, providing technical guidance and mentorship.

- Develop and execute a strategic roadmap for data processing, storage, and analytics in alignment with organizational goals.

- Design, implement, and maintain robust data pipelines using Python and Airflow, ensuring efficient data flow and transformation for analytical and operational purposes.

- Utilize AWS services, including S3 for data storage and Glue and EMR for data processing, and orchestrate data workflows that are scalable, reliable, and secure.

- Implement real-time data processing solutions using Kafka, SQS, and EventBridge, addressing high-volume data ingestion and streaming needs.

- Oversee the integration of diverse systems and data sources through AppFlow, APIs, and other integration tools, ensuring seamless data exchange and connectivity.

- Lead the development of data warehousing solutions, applying best practices in data modelling to support efficient data storage, retrieval, and analysis.

- Continuously monitor, optimize, and troubleshoot data pipelines and infrastructure, ensuring optimal performance and scalability.

- Ensure adherence to data governance, privacy, and security policies, implementing measures to protect sensitive data and comply with regulatory requirements.
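To illustrate the pipeline responsibilities above, here is a minimal, standard-library-only sketch of the kind of task an Airflow DAG might schedule: building a date-partitioned S3 key (a common data-lake layout) and applying a small transform step. The function names, key layout, and record shape are illustrative assumptions, not part of the role.

```python
from datetime import date

def s3_partition_key(prefix: str, run_date: date) -> str:
    # Hive-style partitioning (year=/month=/day=) is a common S3 data-lake
    # convention that Glue and EMR can both read efficiently.
    return (f"{prefix}/year={run_date:%Y}/month={run_date:%m}/"
            f"day={run_date:%d}/events.json")

def transform(records: list[dict]) -> list[dict]:
    # Example transform step: drop incomplete records and normalise the
    # amount field, as one task in a larger pipeline might do.
    return [
        {"id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in records
        if "id" in r and "amount" in r
    ]
```

In a real deployment, each function would typically be wrapped in an Airflow operator (for example a `PythonOperator`) so the scheduler handles retries, backfills, and dependencies between tasks.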

Qualifications:

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

- 8-10 years of experience in data engineering, with at least 3 years in a leadership role.

- Proficiency in Python programming and experience with Airflow for workflow management.

- Strong expertise in AWS cloud services, particularly in data storage, processing, and analytics (S3, Glue, EMR, etc.).

- Experience with real-time streaming technologies like Kafka, SQS, and EventBridge.

- Solid understanding of API-based integrations and familiarity with integration tools such as AppFlow.

- Deep knowledge of data warehousing concepts, data modelling techniques, and experience in implementing large-scale data warehousing solutions.

- Demonstrated ability to lead and mentor technical teams, with excellent communication and project management skills.
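The streaming qualification above implies familiarity with at-least-once delivery: Kafka and SQS can redeliver the same message, so consumers are usually written to be idempotent. A minimal sketch of one common approach, deduplicating by event id (the event shape is an assumption for illustration):

```python
def dedupe_events(events: list[dict]) -> list[dict]:
    # At-least-once delivery means the same event can arrive twice;
    # track ids already seen and skip redeliveries, preserving order.
    seen: set[str] = set()
    unique = []
    for event in events:
        if event["event_id"] in seen:
            continue
        seen.add(event["event_id"])
        unique.append(event)
    return unique
```

In production, the "seen" set would live in a durable store (for example DynamoDB or a database unique constraint) rather than in memory, so deduplication survives consumer restarts.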
