hirist

GSPANN - AWS Data Engineer

GSPANN Technologies (India) Pvt. Ltd
6 - 10 Years
Bangalore

Posted on: 19/03/2026

Job Description


Technical Skills: AWS, Big Data, Spark/PySpark, Airflow, SQL

Required Skills & Responsibilities:

- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.

- Solve complex business problems by utilizing a disciplined development methodology.

- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.

- Analyse source and target system data, and map the transformations that meet the requirements.

- Interact with the client and onsite coordinators during different phases of a project.

- Design and implement product features in collaboration with business and technology stakeholders.

- Anticipate, identify, and solve issues concerning data management to improve data quality.

- Clean, prepare, and optimize data at scale for ingestion and consumption.

- Support the implementation of new data management projects and restructure the current data architecture.

- Implement automated workflows and routines using workflow scheduling tools.

- Understand and use continuous integration, test-driven development, and production deployment frameworks.

- Participate in design, code, test plans, and dataset implementation performed by other data engineers in support of maintaining data engineering standards.

- Analyze and profile data for the purpose of designing scalable solutions.

- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.
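The "automated workflows using workflow scheduling tools" responsibility above is the core idea behind Airflow: model a pipeline as a DAG of tasks and run each task only after its dependencies finish. A minimal, self-contained sketch of that dependency-ordering idea in plain Python (the task names are hypothetical, and stdlib `graphlib` stands in for a real scheduler):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical daily-pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "transform": {"clean"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields each task only after all of its dependencies,
# which is the scheduling guarantee a tool like Airflow provides.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)  # ['extract', 'clean', 'transform', 'load', 'report']
```

A real Airflow DAG adds scheduling intervals, retries, and operators on top of this ordering, but the dependency graph is the same concept.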

Required Skills:

- 6+ years of experience developing data and analytics solutions.

- Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive, and PySpark.

- Experience with relational SQL.

- Experience with scripting languages such as Python.

- Experience with source control tools such as GitHub and the related development process.

- Experience with workflow scheduling tools such as Airflow.

- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).

- A passion for data solutions.

- A strong problem-solving and analytical mindset.

- Working experience in the design, development, and testing of data pipelines.

- Experience working with Agile teams.

- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.

- Able to quickly pick up new programming languages, technologies, and frameworks.

- Bachelor's degree in Computer Science.
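The relational SQL skill listed above typically means the aggregate-and-filter patterns used in pipeline transformation steps. A self-contained sketch using Python's built-in `sqlite3` (the `orders` table and its columns are hypothetical; SQLite stands in for a warehouse or Spark SQL engine to keep the example runnable):

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("south", 120.0), ("north", 80.0), ("south", 50.0)],
)

# GROUP BY / HAVING: total sales per region, keeping only regions over 100.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region "
    "HAVING total > 100 ORDER BY region"
).fetchall()
print(rows)  # [('south', 170.0)]
```

The same query shape runs essentially unchanged in Hive or Spark SQL over data-lake tables, which is why generic relational SQL experience transfers directly to the stack described here.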

