Posted on: 24/07/2025
Job Description:
Key Responsibilities:
- Data Architecture & Strategy: Lead the design and implementation of scalable, reliable, and high-performance data architectures, data lakes, and data warehouses.
- ETL/ELT Pipeline Development: Architect, build, and optimize complex ETL/ELT pipelines to ingest, transform, and load data from diverse sources into various data platforms.
- Big Data Technologies: Work extensively with big data technologies and distributed computing frameworks to process and analyze large volumes of data.
- Database Management: Design, implement, and manage various database systems, including relational (PostgreSQL, MySQL), NoSQL (e.g., MongoDB, Cassandra), and data warehouses (Snowflake, BigQuery).
- Cloud Data Solutions: Leverage expertise in cloud data services (AWS, Azure, GCP) to build, deploy, and manage data infrastructure and services.
- Data Governance & Quality: Implement robust data governance, data security, and data quality frameworks to ensure data integrity and compliance.
- Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and databases, ensuring efficient data processing and retrieval.
- Mentorship & Leadership: Provide technical leadership and mentorship to junior and mid-level data engineers, fostering a culture of excellence and continuous learning within the team.
- Cross-functional Collaboration: Partner closely with data scientists, analysts, software engineers, and product managers to understand data requirements and deliver impactful data solutions.
Required Skills & Qualifications:
- 10+ years of professional experience in data engineering, with a strong focus on large-scale data systems.
- Expertise in designing and building ETL/ELT pipelines and data warehousing solutions.
- Strong proficiency in at least one major programming language for data engineering (Python, Java).
- In-depth experience with big data technologies (Spark, Hadoop).
- Extensive experience with cloud data platforms (AWS Glue, S3; Azure Data Factory, Data Lake, Synapse; Google Cloud Dataflow, BigQuery).
- Proficiency with various database systems (relational, NoSQL, columnar databases).
- Strong knowledge of SQL for complex data manipulation and analysis.
- Experience with data governance, data security, and data quality practices.
- Familiarity with containerization (Docker, Kubernetes) and CI/CD practices for data pipelines.
- Excellent problem-solving, analytical, and architectural design skills.
- Strong communication (verbal and written) and leadership abilities.
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related quantitative field.
Posted By
Deepthi Anupula
Talent Acquisition Specialist at Apps Associates (I) Pvt. Ltd
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1518785