hirist

Job Description

Description:

We are looking for a Senior Data Engineer who is passionate about turning data into valuable insights, thrives in a collaborative environment, and has hands-on experience with Python, SQL, Spark, ETL tools, and data warehousing. In this role, you will design, develop, and maintain scalable data pipelines and data warehouse solutions. You will collaborate closely with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure high data quality, performance, and reliability across all data systems. You will also play a key role in optimizing data workflows, implementing modern data engineering practices, and contributing to the organization's overall data strategy.

Responsibilities:

- Design, develop, and maintain data pipelines for batch and real-time data processing.
- Build and manage data warehouse solutions to support analytics and reporting needs.
- Work with large datasets using Python, SQL, and Apache Spark for data transformation and processing.
- Implement and manage ETL/ELT workflows using modern data integration tools.
- Optimize data models, queries, and pipelines for scalability, performance, and cost efficiency.
- Ensure data quality, governance, and reliability through validation, testing, and monitoring.
- Collaborate with Data Architects and Analysts to translate business requirements into technical data solutions.
- Participate in code reviews, performance tuning, and continuous improvement initiatives.

Requirements:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-8 years of hands-on experience in data engineering.
- Strong programming skills in Python and SQL.
- Expertise in Apache Spark for distributed data processing.
- Proven experience with ETL tools (e.g., Airflow, dbt, Informatica, Talend, or similar).
- Strong knowledge of data warehousing concepts, including star schema, snowflake schema, and fact/dimension modeling.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) and data storage services (S3, Azure Data Lake, GCS).
- Understanding of data modeling, performance optimization, and data governance principles.
- Excellent problem-solving, analytical, and communication skills.

