
Data Analyst - Python/Java

Silicon Valley Construction Services
Multiple Locations
3 - 6 Years

Posted on: 22/12/2025

Job Description

Description : Data Analyst - Implementation & Analytics

Location : Bangalore / Navi Mumbai / Pune / Client Locations

Experience : 3+ Years

Role Summary :

As a Data Analyst within the Implementation & Analytics team, you will serve as the technical bridge between raw data infrastructure and strategic business decision-making. This role requires a high degree of technical autonomy to design, optimize, and maintain the data pipelines that power our reporting ecosystem. You will be responsible for not only extracting insights but also ensuring the underlying database architecture is performant, scalable, and capable of handling high-volume transaction data with precision.

Responsibilities :

- Architect, develop, and optimize high-performance SQL queries and complex stored procedures to facilitate seamless data extraction and transformation.

- Design and deploy advanced analytical reports and interactive dashboards that translate multi-dimensional datasets into actionable business intelligence for stakeholders.

- Build and manage robust database structures, including the implementation of custom functions, triggers, and views to support evolving product requirements.

- Conduct rigorous database performance tuning by analyzing execution plans, optimizing indexing strategies, and refactoring legacy code to minimize latency.

- Collaborate directly with business owners and external customers to gather complex analytics requirements and translate them into technical documentation.

- Engineer automated data pipelines and scheduled reporting workflows to eliminate manual intervention and increase the velocity of data delivery.

- Implement data validation frameworks and error-handling logic within database objects to ensure the highest standards of data integrity and accuracy.

- Provide technical support during the implementation phase of new client projects, ensuring data migration and integration tasks meet strict quality benchmarks.

- Monitor and audit data usage patterns to identify opportunities for schema optimization and structural improvements in the production environment.

- Take full ownership of the end-to-end data lifecycle, from initial requirement discovery to final delivery and post-deployment maintenance.

Technical Requirements :

- Advanced mastery of SQL, including extensive experience with window functions, recursive CTEs, and complex multi-table joins across various RDBMS platforms.

- Expert-level proficiency in Microsoft Excel, specifically utilizing Power Query, advanced Pivot modeling, and complex logical/lookup functions for data synthesis.

- Proven track record of at least 3 years in a technical data role, with a strong emphasis on database development and performance optimization.

- Deep understanding of database design principles, including normalization, denormalization, and the impact of primary/foreign key constraints on query speed.

- Demonstrated ability to work independently in a fast-paced environment, managing multiple projects simultaneously with minimal supervision.

- Exceptional communication skills, with the ability to articulate technical findings and data constraints to non-technical client stakeholders.

- Experience in troubleshooting complex data discrepancies and performing root-cause analysis in distributed data environments.

Preferred Skills :

- Practical experience with Python (Pandas, NumPy) or Java for building custom data processing scripts and expanding automation capabilities.

- Familiarity with enterprise BI tools such as Power BI, Tableau, or QuickSight for creating sophisticated data visualizations and storytelling.

- Knowledge of ETL/ELT frameworks and tools like SSIS, Talend, or Apache Airflow to manage enterprise-grade data movement.

- Experience working with cloud data warehouses such as Snowflake, Amazon Redshift, or Google BigQuery.

- Familiarity with version control systems like Git to manage and track changes in SQL scripts and database migration files.

- Understanding of Agile methodologies and the ability to operate within a Sprint-based delivery model.

- Prior experience in the Fintech or Payment processing domain, focusing on transaction reconciliation and financial data modeling.

