Posted on: 03/11/2025
Description :
About the Role :
We are looking for a Data Architect with a strong background in data engineering and cloud data platforms. The ideal candidate will design and implement scalable data architectures that power enterprise analytics, AI/ML, and GenAI solutions, ensuring data availability, quality, and governance across the organization.
Key Responsibilities :
Data Architecture & Strategy :
- Design & Architecture : Design and implement robust, scalable, and optimized data engineering solutions on the Databricks platform, architecting data pipelines that scale efficiently and reliably.
- Data Pipeline Development : Develop ETL/ELT pipelines leveraging Databricks notebooks, Delta Lake, the Snowflake tech stack, Azure Data Factory, etc. (a minimal pipeline sketch follows this list).
- Cloud Integration : Work closely with cloud platforms like Azure, AWS, or GCP to integrate Databricks or Snowflake with data storage (e.g., ADLS, S3, etc.), databases, and other services.
- Performance Optimization : Optimize the performance of data workflows by tuning Databricks clusters, improving query performance, and identifying bottlenecks in data processing.
- Collaboration : Collaborate with data scientists, analysts, and business stakeholders to understand business requirements and translate them into scalable data solutions.
- Data Governance & Security : Ensure best practices for data security, governance, and compliance when working with sensitive or large datasets.
- Automation & Monitoring : Automate data pipeline deployments and create monitoring dashboards for ongoing performance checks.
- Continuous Improvement : Stay up to date with the latest Databricks features and Snowflake ecosystem best practices to continuously improve existing systems and processes.
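By way of illustration only, here is a minimal sketch of the kind of ETL/ELT step the Data Pipeline Development point describes, assuming PySpark with Delta Lake on Databricks. The landing path, column names, and target table are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided as `spark`;
# this builder is only needed for local runs.
spark = SparkSession.builder.appName("orders_elt_sketch").getOrCreate()

# Extract: read raw CSV files from a hypothetical landing zone.
raw = spark.read.option("header", "true").csv("/mnt/landing/orders/")

# Transform: light cleansing and typing (hypothetical columns).
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Load: write the result as a Delta table (hypothetical target name).
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")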
Required Skills & Experience :
- 14+ years of experience in Data Architecture / Data Engineering roles.
- Proven expertise in data modeling, ETL/ELT design, and cloud-based data solutions (Amazon Redshift, Snowflake, BigQuery, or Synapse).
- Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Azure Data Factory, etc.); a minimal orchestration sketch follows this list.
- Proficiency in Python, SQL, and Spark for data processing and integration.
- Experience with API integrations and data APIs for AI systems.
- Excellent communication and stakeholder management skills.
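As a rough illustration of the orchestration experience listed above, here is a minimal sketch of a scheduled pipeline, assuming Airflow 2.x; the DAG name, schedule, and task body are hypothetical.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_elt_step():
    # Placeholder step; in practice this might trigger a Databricks job or a dbt run.
    print("running ELT step")

with DAG(
    dag_id="daily_orders_elt",       # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    PythonOperator(task_id="elt_step", python_callable=run_elt_step)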
Posted in : Data Engineering
Functional Area : Technical / Solution Architect
Job Code : 1568657