hirist

Senior Data Engineer - ETL/PySpark

Renovision Automation Services Pvt. Ltd.
Multiple Locations
7 - 10 Years

Posted on: 23/07/2025

Job Description

Essential Duties and Responsibilities :



- This section lists five to eight primary responsibilities of the role, each accounting for 5% or more of the work. The incumbent will also perform other duties as assigned.


- Development of new ETL/data transformation jobs, using PySpark or Python in AWS.


- Enhancement and support on existing ETL/data transformation jobs.


- Can explain technical solutions and resolutions to internal customers and communicate their feedback to the ETL team.


- Perform technical code reviews for peers moving code into production.


- Perform and review integration testing before production migrations.


- Provide a high level of technical support and perform root cause analysis for problems within the area of functional responsibility.


- Can document technical specs from business communications.
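The ETL development duties above (the posting allows plain Python as well as PySpark) can be sketched with a minimal, self-contained Python transformation. The field names, rules, and sample records below are hypothetical illustrations, not part of the role:

```python
# Minimal sketch of an ETL-style transformation in core Python.
# Field names, validation rules, and data are hypothetical.

def transform(records):
    """Apply simple cleansing rules and separate rejects for reload."""
    loaded, rejects = [], []
    for rec in records:
        # Rule: id must be present and amount must be non-negative.
        if rec.get("id") is None or rec.get("amount", -1) < 0:
            rejects.append(rec)  # reject handling: keep bad rows for ad-hoc reloads
            continue
        loaded.append({
            "id": rec["id"],
            "amount": round(float(rec["amount"]), 2),
            "region": rec.get("region", "UNKNOWN").upper(),
        })
    return loaded, rejects

source = [
    {"id": 1, "amount": 10.5, "region": "emea"},
    {"id": None, "amount": 5.0},                # rejected: missing id
    {"id": 2, "amount": -3.0, "region": "na"},  # rejected: negative amount
]
loaded, rejects = transform(source)
print(len(loaded), len(rejects))  # 1 2
```

In a PySpark job the same rules would typically become DataFrame filters and column expressions, with rejects written to a separate path for reprocessing.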


Qualifications :


- To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.


- The following lists the knowledge, skills, and/or abilities required. Reasonable accommodations may be made to enable individuals with disabilities to perform essential functions.


- 6+ years of ETL experience.


- Experience with core Python programming for data transformation.


- Intermediate-level Python and PySpark skills: can read, understand, and debug existing Python and PySpark code, and write new code from scratch.


- Strong knowledge of SQL fundamentals and Snowflake experience; understands subqueries and can tune queries with execution hints to improve performance.


- Able to write SQL sufficient for most business requirements: pulling data from sources, applying rules to the data, and loading target data.


- Proven track record in troubleshooting ETL jobs and addressing production issues like performance tuning, reject handling, and ad-hoc reloads.


- Proficient in developing optimization strategies for ETL processes.


- Basic AWS technical support skills: able to log in, find existing jobs, and check run status and logs.


- Will run and monitor jobs via Control-M.


- Can create clear and concise documentation and communications.


- Can document technical specs from business communications.


- Ability to coordinate and aggressively follow up on incidents and problems, perform diagnosis, and provide resolution to minimize service interruption.


- Ability to prioritize and work on multiple tasks simultaneously.


- Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.


- A self-starter who can work well independently and on team projects.


- Experienced in analyzing business requirements, defining granularity, source-to-target mapping of data elements, and full technical specifications.


- Understands data dependencies and how to schedule jobs in Control-M.


- Experienced working at the command line in various flavors of UNIX, with a basic understanding of shell scripting in bash and Korn shell.
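The source-to-target SQL pattern named above (pull from a source, apply rules, load a target) can be illustrated with Python's built-in sqlite3 module. The table and column names are hypothetical, chosen only to make the sketch self-contained and runnable:

```python
import sqlite3

# Self-contained sketch: pull rows from a source table, apply a filter
# rule, and load the result into a target table. Names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, status TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES
        (1, 25.0, 'OK'),
        (2, -1.0, 'OK'),
        (3, 40.0, 'VOID');
""")
# Rule: keep only valid, non-negative orders.
conn.execute("""
    INSERT INTO tgt_orders (order_id, amount)
    SELECT order_id, amount
    FROM src_orders
    WHERE status = 'OK' AND amount >= 0
""")
rows = conn.execute(
    "SELECT order_id, amount FROM tgt_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 25.0)]
conn.close()
```

In the role itself the same INSERT ... SELECT shape would run against Snowflake rather than SQLite, with tuning done via query profiling and hints.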


Education and/or Experience :


- The following lists the education and experience necessary to perform the job satisfactorily.



- Bachelor of Science in Computer Science or equivalent.


- 7+ years of ETL and SQL experience


- 3+ years of Python and PySpark experience.


- 3+ years of Snowflake experience.


- 3+ years of AWS and UNIX experience.

Preferred certifications :

- AWS Certified Cloud Practitioner

- Python and PySpark certifications

