Posted on: 04/11/2025
Responsibilities:
- Responsible for the design, development, and implementation of data exchange solutions using the required ETL tools: Azure Data Factory, Synapse, and Databricks.
- Experience in data cleansing and data migration using Python.
- At least 2 years of hands-on experience with Azure Data Factory (ADF).
- Coding in PySpark for Databricks will be an added advantage.
- Overall knowledge of the modern Azure ecosystem.
- 4-8 years of experience delivering end-to-end ETL projects.
- Strong verbal and written communication skills; able to work directly with clients and the business to understand the domain and requirements.
- Must be able to engage clients directly and propose the best approach independently.
- Strong in data warehousing (DWH) concepts, with a good understanding of modern analytics tools and processes.
- Strong in SQL/T-SQL/PL/SQL, with the ability to write effective SQL code and stored procedures.
- Experience with DBMSs such as SQL Server, including practical know-how and troubleshooting experience.
- Develop new code and/or support operations, maintenance, and enhancements of existing code.
- Conduct multiple levels of testing, including unit, system, integration, and performance testing.
- Estimate and plan releases.
- Understand and apply best practices and coding standards.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Ability to work in a fast-paced, agile environment.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1569363