Posted on: 09/09/2025
Role Overview:
- Design, implement, and optimize ETL pipelines using SQL, Python, and workflow automation tools.
- Manage structured and semi-structured datasets including API-driven logs, transactional data, and operational event streams.
- Apply principles of data modeling, indexing, partitioning, and query optimization to ensure high performance.
- Establish error handling, data validation, and monitoring frameworks for robust data pipelines.
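For illustration, here is a minimal sketch of the kind of validation and error-handling step described above, assuming a pandas-based batch pipeline; the file paths, column names (order_id, amount), and checks are placeholders rather than details of this role.

```python
# Minimal sketch: extract -> validate -> load with logging and error handling.
# Column names (order_id, amount) and file paths are hypothetical placeholders.
import logging

import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows that fail basic quality checks and log what was rejected."""
    checks = {
        "missing order_id": df["order_id"].isna(),
        "negative amount": df["amount"] < 0,
    }
    bad = pd.Series(False, index=df.index)
    for name, mask in checks.items():
        if mask.any():
            log.warning("%s: %d rows rejected", name, int(mask.sum()))
        bad |= mask
    return df[~bad]


def run_pipeline(source_csv: str, target_csv: str) -> None:
    """One batch run; failures are logged and re-raised so a scheduler can retry."""
    try:
        raw = pd.read_csv(source_csv)            # extract
        clean = validate(raw)                    # validate
        clean.to_csv(target_csv, index=False)    # load
        log.info("loaded %d of %d rows", len(clean), len(raw))
    except Exception:
        log.exception("pipeline run failed")
        raise
```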
Analytics & Business Intelligence:
- Develop advanced Power BI dashboards with role-based security, optimized DAX queries, and high performance on large datasets.
- Automate data refresh cycles and implement parameterized dashboards for scalability across categories.
- Apply statistical and algorithmic techniques in Python for anomaly detection, forecasting, and trend analysis.
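As a hedged illustration of the anomaly-detection work above, the sketch below flags outliers in a daily metric using a rolling z-score in pandas/NumPy; the window size, threshold, and synthetic data are assumptions made only for the example.

```python
# Sketch: rolling z-score anomaly flagging on a daily metric (assumed approach).
import numpy as np
import pandas as pd


def flag_anomalies(series: pd.Series, window: int = 28, threshold: float = 3.0) -> pd.Series:
    """Mark points more than `threshold` standard deviations from the rolling mean."""
    mean = series.rolling(window, min_periods=window // 2).mean()
    std = series.rolling(window, min_periods=window // 2).std()
    z = (series - mean) / std.replace(0, np.nan)
    return z.abs() > threshold


# Usage on synthetic data with one injected spike.
daily = pd.Series(np.random.default_rng(0).poisson(100, 90).astype(float))
daily.iloc[60] = 400.0
print(flag_anomalies(daily)[60])  # True: the spike is flagged
```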
Operational & API Analytics:
- Work on real-time API logs to analyze order lifecycle behavior, SLA compliance, and system bottlenecks.
- Build scoring frameworks (e.g., NP scorecards, SLA compliance indices) to evaluate partner and category performance.
- Integrate external APIs and develop data ingestion pipelines to enrich operational analytics.
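A hedged sketch of SLA analysis over JSON-formatted order-lifecycle logs follows; the event fields (created_at, delivered_at) and the 45-minute target are hypothetical, chosen only to make the calculation concrete.

```python
# Sketch: share of orders delivered within an assumed 45-minute SLA,
# computed from newline-delimited JSON log records (field names hypothetical).
import json
from datetime import datetime

SLA_MINUTES = 45  # placeholder target, not from the posting


def sla_compliance(log_lines):
    """Return the fraction of orders whose delivery time met the SLA."""
    met = total = 0
    for line in log_lines:
        event = json.loads(line)
        created = datetime.fromisoformat(event["created_at"])
        delivered = datetime.fromisoformat(event["delivered_at"])
        total += 1
        if (delivered - created).total_seconds() <= SLA_MINUTES * 60:
            met += 1
    return met / total if total else 0.0


sample = [
    '{"order_id": 1, "created_at": "2025-09-09T10:00:00", "delivered_at": "2025-09-09T10:30:00"}',
    '{"order_id": 2, "created_at": "2025-09-09T11:00:00", "delivered_at": "2025-09-09T12:10:00"}',
]
print(sla_compliance(sample))  # 0.5
```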
Collaboration & Technical Leadership:
- Partner with engineering and product teams to ensure data availability, consistency, and scalability across systems.
- Define best practices for query performance, BI reporting, and Python-driven automation.
- Mentor junior analysts in writing efficient SQL, structuring pipelines, and building production-ready dashboards.
Candidate Profile:
Education:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related quantitative field.
- Strong grounding in data structures, algorithms, and database design principles.
Technical Skills:
- SQL / PostgreSQL: Advanced queries, query optimization, window functions, indexing, schema design (see the sketch after this list).
- Python: Data manipulation (Pandas, NumPy), automation scripts, API integrations, basic statistical modeling.
- Power BI: DAX queries, row-level security, drill-through reports, performance tuning, visualization best practices.
- ETL / Data Pipelines: Workflow orchestration, incremental loads, error handling, data versioning.
- APIs & Logs: Parsing, transforming, and analyzing event-driven or JSON-based datasets.
- Excel (Advanced): Complex formulas, pivot tables, and modeling for ad-hoc analysis.
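To illustrate the SQL and Python items above together, here is a minimal sketch that runs a PostgreSQL window-function query (ROW_NUMBER over a partition) from Python via psycopg2; the connection string, orders table, and column names are placeholders, not details of this role's schema.

```python
# Sketch: PostgreSQL window function executed from Python (table/columns assumed).
import psycopg2

QUERY = """
    SELECT category,
           order_id,
           amount,
           ROW_NUMBER() OVER (PARTITION BY category ORDER BY amount DESC) AS rank_in_category
    FROM orders
    WHERE order_date >= %s
"""


def top_orders_per_category(dsn: str, since: str, top_n: int = 5):
    """Fetch orders ranked within each category and keep the top N per category."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY, (since,))
            rows = cur.fetchall()
    # The rank is computed server-side; filtering to the top N happens client-side here.
    return [row for row in rows if row[3] <= top_n]


# Example call (the DSN is a placeholder):
# top_orders_per_category("dbname=analytics user=analyst", "2025-09-01")
```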
Posted in: Data Analytics & BI
Functional Area: Data Analysis / Business Analysis
Job Code: 1543231