Posted on: 08/09/2025
Role: Data Engineer - AWS
Experience Required: 6+ years overall, with 5+ years relevant
Location: Remote / Work From Home
Job Type: Contract to Hire (1-year renewable)
Notice Period: Immediate to 30 days
Mode of Interview: Virtual
- Must-have experience: Python, SQL, AWS (as the primary cloud, rather than Azure/GCP), Snowflake, Airflow, EMR, PySpark, CI/CD, Jenkins, Bitbucket, CSV, Excel
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proficiency in Python, PySpark, and SQL for data processing and manipulation.
- Minimum of 5 years of relevant data engineering experience, specifically with Apache Airflow and AWS technologies (see the illustrative sketch after this list).
- Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
- Understanding of Snowflake as a data lake is preferred.
- Experience with optimizing and scaling data pipelines for performance and efficiency.
- Good understanding of data modeling, ETL processes, and data warehousing concepts.
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.
- Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.
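For candidates gauging day-to-day fit, here is a minimal sketch of the kind of workflow the Airflow/AWS requirement above describes: a daily Airflow DAG that runs a PySpark transform reading a CSV from S3 and writing Parquet. It is illustrative only, assuming Airflow 2.4+ and a Spark environment with S3 access (the s3:// scheme assumes EMR/EMRFS); every DAG name, bucket, and path below is a hypothetical placeholder, not part of this role's actual pipelines.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def transform_daily_csv():
        # Hypothetical transform: read a raw CSV from S3, write Parquet back.
        # Importing pyspark inside the task keeps DAG parsing lightweight.
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("daily_csv_load").getOrCreate()
        df = spark.read.csv("s3://example-bucket/raw/events.csv", header=True)  # placeholder path
        df.write.mode("overwrite").parquet("s3://example-bucket/curated/events/")  # placeholder path
        spark.stop()

    with DAG(
        dag_id="example_daily_pipeline",  # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # the "schedule" argument assumes Airflow 2.4+
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="transform_daily_csv",
            python_callable=transform_daily_csv,
        )

In a production setup on this stack, the Spark step would more typically be submitted to an EMR cluster through Airflow's Amazon provider EMR operators, rather than run inside a PythonOperator on the scheduler's workers.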
Preferred Qualifications:
- AWS certification(s) related to data engineering or big data.
- Experience working with big data technologies like Snowflake, Spark, Hadoop, or related frameworks.
- Familiarity with other data orchestration tools in addition to Apache Airflow.
- Knowledge of version control systems such as Git and Bitbucket.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1541830