Posted on: 17/07/2025
Job Summary:
We are seeking a highly skilled and experienced Data Architect to join the growing data engineering team of a global consulting firm.
The ideal candidate will have a strong background in cloud data architecture, particularly on AWS, and be proficient in building scalable data pipelines, real-time processing systems, and enterprise-grade data lakes and warehouses.
You will be responsible for leading the design, development, and deployment of robust, secure, and efficient data solutions across large-scale environments.
Key Responsibilities:
- Design and implement end-to-end cloud-native data architectures on AWS to support large-scale data ingestion, transformation, and analytics.
- Build scalable and maintainable data pipelines using tools such as AWS Glue, Apache Airflow, Lambda, and Talend.
- Architect and manage data warehouses and data lakes using technologies like Amazon Redshift, S3, and Lake Formation.
- Develop and maintain real-time and batch processing pipelines using frameworks such as PySpark and, optionally, Kafka (see the PySpark sketch after this list).
- Collaborate with data engineers, analysts, and business stakeholders to translate business needs into technical requirements and scalable data models.
- Optimize and monitor performance, reliability, and security of data solutions.
- Establish data governance, quality standards, and best practices.
- Drive automation and CI/CD practices for data infrastructure using Git and other DevOps tools.
- Mentor junior engineers and participate in architecture reviews and code walkthroughs.
- Ensure compliance with data security, privacy, and regulatory requirements.
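As a rough illustration of the kind of batch pipeline described in the responsibilities above, the minimal PySpark sketch below reads raw JSON events from S3, de-duplicates them, and writes partitioned Parquet into a curated lake zone. The bucket paths and column names (event_id, load_date) are hypothetical placeholders, not part of any actual client codebase.

```python
from datetime import date
from typing import Optional

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical S3 locations; real bucket names and prefixes would come from configuration.
RAW_PATH = "s3://example-raw-zone/events/"
CURATED_PATH = "s3://example-curated-zone/events/"


def run_batch(load_date: Optional[str] = None) -> None:
    load_date = load_date or date.today().isoformat()
    spark = SparkSession.builder.appName("events-batch-transform").getOrCreate()

    # Read raw JSON events landed by an upstream ingestion job.
    raw = spark.read.json(RAW_PATH)

    # Light transformation: drop duplicate events (assumes an event_id column)
    # and tag each row with the load date used for partitioning.
    curated = (
        raw.dropDuplicates(["event_id"])
           .withColumn("load_date", F.lit(load_date))
    )

    # Write partitioned Parquet into the curated zone of the data lake.
    curated.write.mode("append").partitionBy("load_date").parquet(CURATED_PATH)

    spark.stop()


if __name__ == "__main__":
    run_batch()
```

In practice a job like this would typically be packaged as an AWS Glue or EMR job and scheduled from Apache Airflow, with paths and partition keys driven by job arguments rather than hard-coded constants.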
Technical Skills & Tools:
- Cloud Platforms: Deep hands-on experience with AWS (EC2, S3, Lambda, Redshift, Glue, IAM, CloudWatch).
- Data Engineering Tools: Glue, Talend, Apache Airflow, AWS Data Pipeline.
- Programming & Scripting: Expert-level SQL, Python (Pandas, Boto3, etc.), and PySpark (see the Boto3 sketch after this list).
- Data Storage & Warehousing: Amazon Redshift, S3, RDS, Aurora, Lake Formation.
- Data Processing: Real-time (Kinesis; Kafka preferred) and batch (Glue, EMR, Spark).
- Version Control / DevOps: Git, GitHub/Bitbucket, and CI/CD pipelines (CodePipeline; Jenkins is nice to have).
- Visualization & BI (good to have): QuickSight, Power BI, Tableau.
- Security & Compliance: IAM, encryption, data masking, and secure access policies.
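As a small illustration of the Python/Boto3 skills listed above, the sketch below starts an AWS Glue job run and polls it until it reaches a terminal state. The job name and region are hypothetical and would depend on the actual environment.

```python
import time

import boto3


def run_glue_job(job_name: str = "curate-events-job", region: str = "eu-west-1") -> str:
    """Start a Glue job run and block until it finishes, returning its final state."""
    glue = boto3.client("glue", region_name=region)

    # Kick off the job run; Glue returns an identifier for polling.
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]

    # Poll until the run reaches a terminal state.
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(30)


if __name__ == "__main__":
    print(run_glue_job())
```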
Qualifications:
- Bachelor's degree or higher in Computer Science, Data Engineering, Information Technology, or a related field.
- 5+ years of proven experience in cloud data engineering, data architecture, and data platform design.
- AWS Certified Data Analytics Specialty or AWS Certified Solutions Architect (Associate or Professional) certification is highly desirable.
- Strong understanding of data architecture principles, data lifecycle, and metadata management.
- Excellent communication, problem-solving, and collaboration skills.
Preferred Qualifications:
- Experience in consulting or global service delivery environments.
- Exposure to multi-cloud environments such as Azure and GCP.
- Experience with data governance frameworks and enterprise data catalog tools.
- Familiarity with Agile/Scrum methodologies.
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1514953