Posted on: 21/08/2025
About the job :
PGC Digital is seeking a skilled Data Engineer with a strong background in cloud-native data applications, distributed systems, and modern data infrastructure.
If you're passionate about building robust data solutions and want to work with cutting-edge technologies in a fast-growing digital consulting firm, we'd love to hear from you.
About PGC Digital :
PGC Digital (PradeepIT Global Consulting Pvt Ltd) is a fast-growing global technology services firm with operations across the USA, UK, Europe, Singapore, UAE, and India.
We are trusted by Tier-1 system integrators and Fortune 500 clients for delivering scalable, future-ready SAP and Digital Supply Chain solutions.
Our teams are driven by innovation, collaboration, and a focus on delivering measurable business outcomes.
Key Responsibilities :
- Design and develop data-intensive applications and pipelines in a cloud-native environment.
- Build and maintain scalable Data Warehousing solutions (Databricks, dbt, etc.).
- Create and manage robust data workflows using Apache Airflow.
- Implement and maintain Spark applications with a strong understanding of Parquet, Delta Lake, and other file formats.
- Utilize Infrastructure as Code (IaC) tools like Terraform, CDK, or CloudFormation for infrastructure management.
- Work closely with DevOps to develop CI/CD pipelines and use observability tools like Datadog, Prometheus, or Grafana.
- Write clean, testable, well-documented code; participate actively in code reviews and provide constructive feedback.
Required Skills & Experience :
- 5+ years of hands-on experience in Python, JVM-based languages, and Shell Scripting in production.
- 3+ years working on cloud-native data applications with AWS (GCP experience is a plus).
- Experience designing and developing Data Warehouses with tools like Databricks and dbt.
- Strong expertise in Apache Airflow for workflow orchestration.
- Solid experience with containerization (Docker) and orchestration (Kubernetes).
- Deep understanding of data formats like Parquet and Delta Lake.
- Familiar with IaC tools like Terraform, AWS CDK, or CloudFormation.
- Experience in monitoring and observability tools like Datadog, Prometheus, or Grafana.
- Background in building CI/CD pipelines (GitHub Actions, Jenkins, Argo CD).
- Experience with unit testing and test automation frameworks.
- Strong communication skills and a collaborative, service-oriented mindset.
What We Value :
- A proactive, ownership-driven mindset.
- Focus on delivering business value over engineering novelty.
- Clear, concise technical and business documentation skills.
- Ability to thrive in a fast-paced, globally distributed team environment.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1533235