Posted on: 11/09/2025
Job Description:
We are looking for a highly skilled Data Engineer with a strong background in designing, building, and maintaining scalable data pipelines and business intelligence solutions. The ideal candidate should have a solid mix of hands-on experience in Azure cloud services, SQL development, data integration, and reporting tools to support data-driven decision-making across the organization.
Key Responsibilities:
- Design, develop, and manage end-to-end data pipelines using Azure Data Factory (ADF) for efficient ETL/ELT processes.
- Develop, deploy, and monitor scalable data solutions using Azure Synapse Analytics or Databricks to support large-scale data processing and analytics.
- Write optimized SQL Server scripts, stored procedures, and views to support data transformation and integration requirements.
- Build and maintain insightful dashboards and reports using Power BI, with a focus on usability, performance, and clear data communication.
- Develop automation scripts using PowerShell for operational and monitoring tasks across Azure services.
- Create automated workflows using Power Automate to streamline manual tasks and notifications.
- Leverage Python or PySpark for data manipulation, transformation, and orchestration within Synapse or Databricks environments (see the illustrative sketch after this list).
- Collaborate with business stakeholders, data analysts, and application developers to gather requirements and deliver high-quality solutions.
- Perform root cause analysis and data validation, and ensure data integrity across systems and processes.
- Participate in code reviews, design discussions, and documentation efforts to ensure best practices and knowledge sharing.
- Ensure security, compliance, and governance in all data engineering activities.
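
To illustrate the kind of day-to-day work these responsibilities involve, below is a minimal PySpark sketch of a cleanse-and-aggregate transformation of the sort that might run in Synapse or Databricks. The table and column names (raw_sales, curated.sales_daily, store_id, sale_date, amount) are hypothetical and only serve as an example, not a description of this role's actual environment.

```python
# Minimal, illustrative PySpark transformation; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-transform").getOrCreate()

# Read a raw landing table (e.g., one loaded by an ADF copy activity).
raw = spark.read.table("raw_sales")

# Basic cleansing and aggregation: drop rows with missing keys, cast the
# amount column, and roll up to one row per store per day.
daily = (
    raw.dropna(subset=["store_id", "sale_date"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .groupBy("store_id", "sale_date")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("transaction_count"),
       )
)

# Persist to a curated table for downstream Power BI reporting.
daily.write.mode("overwrite").saveAsTable("curated.sales_daily")
```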
Required Skills and Qualifications:
- Strong proficiency in Azure Data Factory, SQL Server, and Power BI.
- 2+ years of hands-on experience with Azure Synapse Analytics or Databricks and Python for data transformation and analysis (a minimal validation sketch follows this list).
- Experience with PowerShell scripting for automation and Power Automate for workflow integration.
- Strong understanding of data warehousing concepts, data modelling, and performance tuning.
- Excellent analytical and problem-solving abilities.
- Strong communication skills and ability to effectively engage with both technical and non-technical stakeholders.
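
As a rough illustration of the Python-based validation and data-integrity work mentioned above, here is a minimal sketch that reconciles the hypothetical raw_sales and curated.sales_daily tables from the earlier example. It is a sketch under assumed table names, not a prescribed process.

```python
# Minimal, illustrative validation check; reuses the hypothetical raw_sales
# and curated.sales_daily tables from the sketch above.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-validation").getOrCreate()

source = spark.read.table("raw_sales")
target = spark.read.table("curated.sales_daily")

# Check 1: curated totals should reconcile with the raw amounts for rows
# that carry valid keys (the same filter the transformation applies).
raw_total = (
    source.dropna(subset=["store_id", "sale_date"])
          .agg(F.sum(F.col("amount").cast("decimal(18,2)")).alias("t"))
          .first()["t"]
)
curated_total = target.agg(F.sum("total_amount").alias("t")).first()["t"]
assert raw_total == curated_total, (
    f"Amount mismatch: raw={raw_total}, curated={curated_total}"
)

# Check 2: key columns in the curated table should never be null.
null_keys = target.filter(
    F.col("store_id").isNull() | F.col("sale_date").isNull()
).count()
assert null_keys == 0, f"{null_keys} curated rows have null key columns"
```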
Preferred Qualifications (Nice to Have):
- Knowledge of DevOps, CI/CD pipelines, and version control systems (e.g., Git).
- Familiarity with data governance, security best practices, and compliance standards (e.g., GDPR, HIPAA).
- Exposure to Agile methodologies and working in cross-functional teams.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1544716