Posted on: 28/07/2025
Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, S3) and Snowflake. Ensure efficient data ingestion, transformation, and extraction processes tailored to business needs (a minimal illustrative sketch follows this list).
- Data Integration & APIs: Lead efforts to integrate data from multiple and diverse sources, including databases, APIs, and third-party services. Develop and manage APIs for data access and distribution, ensuring secure and efficient data exchange.
- Snowflake Management: Manage and tune Snowflake environments for data warehousing, ensuring strong performance, scalability, and cost control. Implement data modeling strategies and best practices within Snowflake.
- Cloud & Infrastructure Management: Architect and optimize cloud infrastructure for data solutions using AWS services, ensuring robust security, high availability, and disaster recovery.
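As a concrete illustration of the pipeline work above, here is a minimal sketch of a Lambda-style load step into Snowflake, assuming the snowflake-connector-python package and a pre-configured external stage; all names (the SF_* environment variables, LOAD_WH, ANALYTICS.RAW, EVENTS, @S3_EVENTS) are hypothetical placeholders, not anything specified in this role.

```python
# Minimal sketch (not from the posting): an AWS Lambda handler that loads
# files already landed in S3 into Snowflake via COPY INTO. All names below
# are hypothetical placeholders.
import os

import snowflake.connector


def handler(event, context):
    # Credentials come from environment variables here; a production
    # deployment would typically use AWS Secrets Manager instead.
    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="LOAD_WH",      # hypothetical warehouse
        database="ANALYTICS",     # hypothetical database
        schema="RAW",             # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # @S3_EVENTS is assumed to be an external stage pointing at the
        # landing bucket; COPY INTO skips files it has already loaded.
        cur.execute(
            "COPY INTO EVENTS FROM @S3_EVENTS FILE_FORMAT = (TYPE = JSON)"
        )
        return {"rows_loaded": cur.rowcount}
    finally:
        conn.close()
```

Keeping the load step as a single idempotent COPY INTO makes retries safe, which matters given Lambda's at-least-once invocation semantics.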
Qualifications:
- Expertise in AWS services (e.g., S3, Glue, Lambda).
- Extensive experience with Snowflake, including data warehousing, performance tuning, and cost optimization.
- Proficiency in ETL tools, data integration, and API development.
- Experience with data visualization tools such as Tableau, Power BI, or Looker.
- Strong programming skills in Python, SQL, and one or more data pipeline orchestration tools (e.g., Apache Airflow; a minimal DAG sketch follows this list).
- Exceptional problem-solving skills and ability to handle complex data challenges.
- Excellent communication and collaboration skills, with experience working in cross-functional teams.
- Demonstrated experience working within Agile Scrum frameworks to deliver iterative and incremental value.
- Proficient in collaborating with cross-functional Scrum teams, actively participating in key ceremonies such as sprint planning, daily stand-ups, sprint reviews, and retrospectives to ensure continuous delivery and improvement.
- Strong understanding of Agile principles and Scrum practices, with a proven track record of effectively breaking down complex projects into manageable sprints and iterations.
- Capable of adapting workflows and processes to enhance efficiency and meet evolving project requirements.
- Experience in utilizing Agile project management tools such as Jira or Trello for tracking progress, managing backlogs, and maintaining transparency with stakeholders.
- Adept at fostering a culture of collaboration, accountability, and continuous feedback within Agile teams.
- Strong understanding of data security, privacy regulations, and compliance.
- Certifications in AWS or Snowflake are a plus.
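To illustrate the orchestration skills listed above, here is a minimal Apache Airflow DAG sketch, assuming Airflow 2.x; the DAG id, task ids, and empty task bodies are hypothetical placeholders rather than anything prescribed by this role.

```python
# Minimal sketch (not from the posting): a daily extract-load-transform
# DAG in Apache Airflow 2.x. DAG id, task ids, and task bodies are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_source():
    ...  # e.g., pull from an API or database and land files in S3


def load_to_snowflake():
    ...  # e.g., run COPY INTO against a raw Snowflake table


def transform_models():
    ...  # e.g., execute SQL transformations over the raw layer


with DAG(
    dag_id="daily_events_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_source)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    transform = PythonOperator(task_id="transform", python_callable=transform_models)

    # Linear dependency chain: extract, then load, then transform.
    extract >> load >> transform
```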
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1520561