Posted on: 12/09/2025
Job Summary:
The role requires deep expertise in cloud-based data platforms, data warehousing, data lakehouse design, workflow automation, and data integration to support business intelligence and advanced analytics.
The ideal candidate will have a strong background in data engineering, cloud technologies, and full-stack software development, with a focus on performance optimization, security (especially data segregation), and automation.
Key Responsibilities:
- Develop and optimize data warehouse and data lakehouse architectures.
- Implement ETL/ELT processes, data modeling, and API integrations.
- Automate workflows and orchestrations using Airflow, Dagster, or Prefect.
- Ensure data quality, validation, governance, and compliance (GDPR, CCPA).
- Collaborate with cross-functional teams to support data-driven decision-making.
- Manage infrastructure with cloud services (AWS/Azure/GCP) and IaC tools like Terraform/CloudFormation.
- Contribute to CI/CD pipelines, DevOps practices, and containerization (Docker, Kubernetes).
Technical Skills Required:
- Data Engineering & Warehousing: Snowflake (must have), DBT (must have), SnapLogic, ETL/ELT, APIs, Data Lakehouse architecture.
- Programming & Scripting: Advanced SQL, Python, DBT, Bash/Shell scripting.
- Cloud & Infrastructure: AWS, Azure, or GCP; Terraform; CloudFormation; Security (IAM, VPN, Encryption).
- Data Processing & Orchestration: Kafka, Kinesis, Apache Airflow, Dagster, Prefect.
- DevOps & CI/CD: Git, GitHub Actions, Jenkins, Docker, Kubernetes.
- Data Governance & Quality: Data validation, metadata management, compliance with GDPR/CCPA.
Candidate Profile:
- 5+ years of experience in data engineering and cloud platforms.
- Strong problem-solving, analytical, and communication skills.
- Ability to work independently in a remote-first environment.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1545299