hirist

ETL Developer - Power BI/Big Data

Xped Pvt Ltd
Multiple Locations
2 - 10 Years

Posted on: 27/11/2025

Job Description

Job Responsibilities:

- Develop and optimize scalable data pipelines for efficient extraction, transformation, and loading (ETL) of data from diverse sources, leveraging technologies such as Spark, Dask, and other modern ETL frameworks.

- Develop and implement data quality controls, including monitoring and remediation processes.

- Provide technical guidance and mentorship to junior ETL developers.

- Collaborate with the infrastructure team to ensure the availability, reliability, and scalability of ETL solutions.

- Participate in code reviews and contribute to the development of coding standards and best practices.

- Stay up to date with emerging trends and technologies in ETL development and data management.

- Architect and implement advanced data models for both NoSQL (e.g., HBase) and relational databases, ensuring high performance, scalability, and cost efficiency in data storage and retrieval processes.

- Establish robust systems to ensure data quality, implementing monitoring tools and validation frameworks to guarantee accuracy and availability of production data for key business stakeholders.

- Design, build, and maintain high-performance APIs to provide seamless access to curated data sets for both internal teams and external clients.

- Oversee the assembly of large, complex data sets that meet functional, non-functional, and business requirements, enabling data-driven decision-making across the organization.

- Conduct root cause analysis on internal and external data and processes to resolve business inquiries, support client requests, and proactively identify areas for system and process improvement.

- Collaborate closely with cross-functional teams, providing leadership and technical guidance to ensure alignment on data architecture and delivery goals.

Technical Requirements:

- 2+ years of experience in developing, testing, and maintaining applications using Python, Bash, and SQL, with a focus on delivering scalable solutions.

- Strong expertise in writing clean, efficient, and maintainable code, with a deep understanding of design patterns and best practices.

- In-depth knowledge of SQL and NoSQL databases, including database design and optimization techniques.

- Extensive experience with parallel processing and distributed computing frameworks like Hadoop, Spark, or Dask.

- Proven track record in custom ETL design, implementation, and operational maintenance in complex environments.

- Familiarity with ETL orchestration frameworks such as Luigi, Apache Airflow, or Apache NiFi is an advantage.

- Proficiency in backend frameworks like Django or Flask, with demonstrated experience in building and maintaining web services and APIs.

- Strong experience with cloud platforms (AWS, GCP, Microsoft Azure), including cloud-based data storage and processing solutions.

- Exposure to data analytics, data science, or machine learning workflows is a plus.
