hirist

Data Engineer - Big Data Technologies

Career Soft Solutions Pvt. Ltd.
Anywhere in India/Multiple Locations
4 - 5 Years

Posted on: 11/08/2025

Job Description

About the Role:

We are seeking a talented and motivated Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that enable data-driven decision-making.

The ideal candidate will work closely with data scientists, analysts, and business stakeholders to ensure reliable data flow and integration across multiple sources and systems.


Key Responsibilities:


- Design, develop, and maintain robust and scalable data pipelines and ETL processes to ingest, transform, and integrate data from various sources.


- Build and optimize data architectures, including data warehouses, lakes, and marts.

- Work closely with data scientists and analysts to support their data requirements and enable advanced analytics and machine learning.

- Monitor and troubleshoot data workflows to ensure data quality, availability, and performance.

- Implement best practices for data governance, security, and compliance.

- Collaborate with cross-functional teams to understand business needs and translate them into technical data solutions.

- Automate repetitive data tasks and improve data processing efficiency.

- Document data workflows, architectures, and processes.

- Stay current with emerging data technologies and tools to drive continuous improvement.
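The pipeline work described above can be sketched as a minimal extract-transform-load flow. This is an illustrative example only, using the Python standard library; the CSV sample, table name, and normalization rules are invented for the sketch and are not part of this role's actual stack:

```python
import csv
import io
import sqlite3

# Sample raw source data (in a real pipeline this would come from a file,
# API, or message queue).
RAW_CSV = """order_id,amount,currency
1,100.50,usd
2,250.00,USD
3,75.25,usd
"""

def extract(source: str) -> list[dict]:
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize types and currency casing before loading."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write transformed rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 425.75
```

Production pipelines add what the responsibilities above call for on top of this skeleton: monitoring, data-quality checks, and scheduling.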


Required Skills and Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Proven experience in data engineering or related roles.

- Strong proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).

- Experience with big data technologies such as Hadoop, Spark, Kafka, or similar.

- Proficiency in one or more programming languages such as Python, Java, or Scala.

- Experience with cloud data platforms (AWS, GCP, Azure) and associated services (Redshift, BigQuery, Azure Data Factory).

- Knowledge of ETL tools (Informatica, Talend, Apache NiFi) and workflow orchestration tools (Airflow, Luigi).

- Familiarity with data modeling, schema design, and performance tuning.

- Experience with containerization and orchestration tools (Docker, Kubernetes) is a plus.

- Strong problem-solving and analytical skills.
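The workflow-orchestration tools listed above (Airflow, Luigi) all model a pipeline as a directed acyclic graph of dependent tasks. A toy sketch of that idea, using only the standard library (the task names and dependencies are invented for illustration, not a real Airflow DAG definition):

```python
from graphlib import TopologicalSorter

# Toy workflow: each task maps to the set of tasks it depends on,
# mirroring how orchestrators like Airflow model a DAG.
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_pipeline(dag: dict[str, set[str]]) -> list[str]:
    """Walk the tasks in dependency order and return the run log."""
    log = []
    for task in TopologicalSorter(dag).static_order():
        log.append(task)  # a real orchestrator would invoke the task here
    return log

print(run_pipeline(DAG))  # ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add scheduling, retries, and backfills on top of this ordering, but the dependency-graph core is the same.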

