Posted on: 05/08/2025
Job Role : Data Engineer
Experience : 8+ Years
Mode : Hybrid
Key Responsibilities :
- Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation).
- Define data architecture patterns, best practices, and frameworks for large-scale data ingestion, storage, compute, and processing.
- Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
- Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms, and manage CI/CD pipelines within a DevOps environment (a minimal Glue job sketch follows this list).
- Create and manage robust Data Warehousing solutions using technologies such as Redshift.
- Ensure high data quality and integrity across all pipelines.
- Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik.
- Collaborate with business stakeholders to define key metrics and deliver actionable insights.
- Implement best practices for data encryption, secure data transfer, and role-based access control.
- Lead audits and compliance certifications to maintain organizational standards.
- Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps engineers.
- Mentor junior team members and provide technical guidance for complex projects.
- Partner with stakeholders to define and align data strategies that meet business objectives.
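To illustrate the kind of ETL work described above, here is a minimal sketch of an AWS Glue PySpark job. The catalog database (sales_db), table (raw_events), and output bucket are hypothetical placeholders, not details from this posting.

```python
# Minimal AWS Glue ETL sketch: read from the Glue Data Catalog,
# normalize columns, write curated Parquet to S3.
# All resource names below are hypothetical placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw data registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",        # hypothetical database
    table_name="raw_events",    # hypothetical table
)

# Normalize column names and types before loading into the lake.
mapped = ApplyMapping.apply(
    frame=raw,
    mappings=[
        ("event_id", "string", "event_id", "string"),
        ("event_ts", "string", "event_ts", "timestamp"),
        ("amount", "double", "amount", "double"),
    ],
)

# Write curated output to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/events/"},  # hypothetical bucket
    format="parquet",
)

job.commit()
```

In practice a job like this would be parameterized through Glue job arguments and scheduled via Airflow or Step Functions, as noted in the qualifications below.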
Qualifications & Skills :
- Strong experience in building Data Lakes on the AWS cloud technology stack.
- Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), and IAM.
- Expertise in AWS tooling for data lake storage, compute, security, and data governance.
- Advanced skills in ETL processes, SQL databases (e.g., Cloud SQL, Aurora, Postgres), NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra), and programming languages (e.g., Python, Spark, or Scala). Experience with real-time streaming applications, preferably in Spark, Kafka, or other streaming platforms.
- AWS Data Security : Good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager (see the sketch after this list).
- Hands-on experience with Data Warehousing solutions and modern architectures like Lakehouses or Delta Lake.
- Proficiency in visualization tools such as Tableau, Power BI, or Qlik.
- Strong problem-solving skills and the ability to debug and optimize applications for performance.
- Strong understanding of Database/SQL for database operations and data management.
- Familiarity with CI/CD pipelines and version control systems like Git.
- Strong understanding of Agile methodologies and working within scrum teams.
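As an illustration of the security concepts called out above, here is a minimal boto3 sketch combining Secrets Manager (no hardcoded credentials) and Lake Formation (role-based access control). The role ARN, secret ID, and catalog names are hypothetical placeholders.

```python
# Sketch of two security patterns: credential retrieval via Secrets Manager
# and role-based access grants via Lake Formation.
# All identifiers below are hypothetical placeholders.
import boto3

# Retrieve database credentials from Secrets Manager instead of hardcoding them.
secrets = boto3.client("secretsmanager")
secret = secrets.get_secret_value(SecretId="prod/redshift/etl-user")  # hypothetical secret
credentials = secret["SecretString"]

# Grant table-level read access on a catalog table via Lake Formation,
# implementing role-based access control for an analyst role.
lake_formation = boto3.client("lakeformation")
lake_formation.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/AnalystRole"},
    Resource={
        "Table": {
            "DatabaseName": "sales_db",  # hypothetical database
            "Name": "curated_events",    # hypothetical table
        }
    },
    Permissions=["SELECT"],
)
```

Centralizing grants in Lake Formation rather than in per-bucket S3 policies keeps data governance auditable, which supports the audit and compliance responsibilities listed above.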
Preferred Qualifications :
- Bachelor of Engineering degree in Computer Science, Information Technology, or a related field.
- AWS Certified Solutions Architect – Associate (required).
- Experience with Agile/Scrum methodologies and design sprints.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1524991