
Job Description

Description:


- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.

- Constructing infrastructure for efficient ETL processes from various sources and storage systems.

- Collaborating closely with Product Managers and Business Managers to design technical solutions aligned with business requirements.

- Leading the implementation of algorithms and prototypes to transform raw data into useful information.

- Architecting, designing, and maintaining data pipeline architectures, ensuring readiness for AI/ML transformations.

- Creating innovative data validation methods and data analysis tools.

- Ensuring compliance with data governance and security policies.

- Interpreting data trends and patterns to establish operational alerts.

- Developing analytical tools, programs, and reporting mechanisms.

- Conducting complex data analysis and presenting results effectively.

- Preparing data for prescriptive and predictive modeling.

- Continuously exploring opportunities to enhance data quality and reliability.

- Applying strong programming and problem-solving skills to develop scalable solutions.

- Passion for testing strategy, problem-solving, and continuous learning.

- Willingness to acquire new skills and knowledge.

- Possessing a product/engineering mindset to drive impactful data solutions.

- Experience working in distributed environments with global teams.

Technical Skills and Experience Requirements:


- 8+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.

- High proficiency in Python, Scala, and Spark for large-scale data processing.

- Expertise with big data technologies, including Spark, Data Lake, Delta Lake, and Hive.

- Solid understanding of batch and streaming data processing techniques.

- Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion.

- Expert-level ability to write complex, optimized SQL queries across extensive data volumes.

- Experience with RDBMS and OLAP databases such as MySQL and Snowflake.

- Familiarity with Agile methodologies.

- Obsession with service observability, instrumentation, monitoring, and alerting.

- Knowledge of or experience with architectural best practices for building data lakes.

Qualification: Bachelor's/Master's degree.

Relevant Experience: 8+ years (as a Python and Data Engineer).

Overall IT Experience: 8 to 12 years.

