
Data Engineer - ETL/Data Warehousing

HiringEye
Multiple Locations
3 - 5 Years
4.9 | 6+ Reviews

Posted on: 22/10/2025

Job Description




At Modern, we're looking for a driven and detail-oriented Data Engineer with a strong consulting background to join our Client Technology Team. In this role, you will serve as a subject matter expert (SME) for our Enterprise Data Product Platform, DataOS, and play a critical part in designing and implementing scalable, high-performance data solutions. You'll bridge the gap between business requirements and technical execution, ensuring that data systems are robust, efficient, and aligned with organizational goals.



Responsibilities:



- Collaborate with business stakeholders to deeply understand data needs and translate them into conceptual, logical, and physical data models.



- Partner with data stewards, analysts, and other technical teams to define and deliver data product roadmaps.



- Architect, design, and develop scalable, distributed data processing solutions leveraging DataOS.



- Enhance and optimize Spark-based workflows for efficiency, scalability, and cost-effectiveness (see the illustrative sketch after this list).



- Evaluate and apply the most suitable technologies (e.g., cloud platforms, Hadoop, NoSQL, and traditional data warehousing) to meet business challenges.



- Contribute to solution design discussions and provide strategic insights during planning and problem-solving sessions.



- Work in tandem with Business Analysts and Solution Architects to create data models that fulfill business objectives while maintaining alignment with Enterprise Architecture.



- Coordinate with Data Architects and Program Managers, participating in project meetings and progress discussions.
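
To ground the Spark optimization bullet in the list above, here is a minimal, hypothetical PySpark sketch of two common techniques: filtering on a partition column so only the relevant partitions are read, and broadcasting a small dimension table to avoid a shuffle join. The paths, table, and column names are illustrative assumptions, not details from this posting or from DataOS.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# Partition pruning: filtering on the partition column (order_date)
# lets Spark read only the matching partitions, not the whole table.
orders = (
    spark.read.parquet("s3://example-bucket/orders/")  # hypothetical path
    .filter(F.col("order_date") == "2025-10-22")
)

# The dimension table is small, so broadcasting it to every executor
# replaces an expensive shuffle join with a local map-side join.
customers = spark.read.parquet("s3://example-bucket/dim_customers/")

daily_revenue = (
    orders.join(F.broadcast(customers), on="customer_id", how="left")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")
```

Whether a broadcast join pays off depends on the dimension table fitting in executor memory; Spark's adaptive query execution can also make this choice automatically.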



Requirements:



- Bachelor's degree in Computer Science, Business Analytics, or a related field.



- Minimum 5 years of experience in data engineering, data architecture, or analytics.



- Proficiency in Spark, Python, SQL, and API integration frameworks.



- Strong understanding of modern data architecture, including cloud-based data lakes, data warehouses, and data marts.



- Experience in dimensional modeling, star schemas, ETL pipelines, and data streaming (Kafka).



- Proven ability to ensure data quality, integrity, and reliability using validation and monitoring tools.



- Hands-on experience with data transformation and workflow orchestration tools such as DBT and Airflow (or similar); an illustrative DAG sketch follows this list.



- Practical knowledge of relational databases (PostgreSQL, MySQL), cloud data warehouses (Redshift, BigQuery), and NoSQL stores (Cassandra).



- Familiarity with Docker, Kubernetes, and cloud storage technologies (Azure Data Lake, AWS S3, Google Cloud Storage).
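
As a rough sketch of how the orchestration and data-quality requirements above fit together, below is a minimal Airflow DAG that runs a dbt build and then a simple row-count validation. The DAG id, file paths, table name, and the SQLite-backed check are hypothetical stand-ins (a real pipeline would use an Airflow connection and a proper warehouse hook), and the `schedule` argument assumes Airflow 2.4+.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_count(**_):
    """Fail the run if the target table is empty (illustrative check only)."""
    import sqlite3

    conn = sqlite3.connect("/tmp/example.db")  # hypothetical local database
    (count,) = conn.execute("SELECT COUNT(*) FROM daily_revenue").fetchone()
    conn.close()
    if count == 0:
        raise ValueError("daily_revenue is empty: upstream load likely failed")


with DAG(
    dag_id="daily_revenue_pipeline",  # hypothetical DAG id
    start_date=datetime(2025, 10, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Transform step: run the dbt project (path is illustrative).
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt/project && dbt build",
    )

    # Data-quality gate: block downstream consumers if validation fails.
    validate = PythonOperator(
        task_id="validate_daily_revenue",
        python_callable=check_row_count,
    )

    run_dbt >> validate
```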

