Posted on: 28/07/2025
Primary skills: Python, Spark/PySpark, Databricks, SparkSQL
Secondary skills: Cloud (AWS, Azure, GCP, PCF, or OCI)
Work Model: Hybrid (twice weekly in office)
Cab Facility: Yes
Work Timings: Rotational shifts, 8 am to 5 pm or 1 pm to 10 pm (rotational on-call support on weekends)
Interview Process: 3 rounds (looking for local Bangalore candidates who can attend the 2nd round face-to-face at the office)
Why We Are Looking for You
- The Senior Software Engineer is responsible for providing support for Epsilon's home-grown products.
- This includes analyzing, triaging, replicating, testing, and solving issues, and providing technical workarounds and suggestions to clients for Epsilon's core products.
- The role calls for mid-level expertise in RHEL, Oracle DB, SQL, AWS, and Shell/Perl scripting, combined with experience reviewing Java/.NET code, a very good understanding of networking concepts and the ITIL framework, and excellent stakeholder communication and management experience.
What You Will Enjoy in This Role
- Ability to analyze, troubleshoot and resolve customer issues.
- Coordinates with various functions within the company to ensure customer requests are handled and routed appropriately and in a timely manner, owning issues until closure.
- Good ability to establish relationships and a strong customer focus
- Serves as the customer contact for technical and service-related problems
- A great teammate, ready to contribute to or lead elements of troubleshooting and problem resolution
- Ability to deal with stressful situations with colleagues and customers
- Must have the ability to communicate effectively in English, both verbally and in writing
- Document troubleshooting and problem-resolution steps, determining the best course of action.
- Should be able to understand development code for debugging purposes when needed.
- Ability to learn and adapt to new technologies based on organization needs
- Ensure all tickets meet the targets for resolution, escalation, documentation & completion.
- Train and mentor team members and perform other duties as assigned.
- Track and report issues within the CPI (Continuous Product Improvement) process to ensure proper resolution of ongoing issues.
- Create and report product improvement ideas including functional enhancements and supportability improvements.
Qualifications:
- Bachelor's degree in a computer- or engineering-related field (or equivalent related-field experience).
- Minimum of 5-8 years of related product support engineering experience preferred, working directly with end-user customers.
- Mandatory skills: Python, Spark/PySpark, Databricks, SparkSQL
- Lead and provide advanced support for Databricks clusters and environments
- Design, develop, and troubleshoot SQL queries, Python/Java scripts, and APIs for data integration and analytics.
- Collaborate with data engineers, developers, and other stakeholders to address technical challenges.
- Automate workflows and improve processes to enhance operational efficiency.
- Experience with the RHEL operating system (certification preferred)
- Good knowledge of AWS, networking, and communication protocols
- Experience with Databricks and writing simple to moderately complex SQL queries (see the sketch after this list)
- Shell/Perl scripting experience, with good knowledge of APIs, webhooks, and HTML.
- Above-average knowledge of all phases of systems analysis, the software development process, and/or functional engineering principles
- Should be a self-driven individual, able to handle assignments independently.
- Development background in .NET would be a plus.
- Document support processes, standard methodologies, and resolution steps for recurring issues.
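For illustration only, here is a minimal sketch of the kind of PySpark/SparkSQL work referenced above, runnable in a Databricks notebook or any Spark environment. The table and column names (orders, order_date, customer_id, amount) are hypothetical placeholders, not part of this posting; the sketch simply shows one simple-to-medium aggregation expressed once in SparkSQL and once with the DataFrame API.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook a `spark` session already exists; building one here
# keeps the sketch self-contained for a plain Spark environment.
spark = SparkSession.builder.appName("support-triage-sketch").getOrCreate()

# Hypothetical table registered in the metastore.
orders = spark.table("orders")
orders.createOrReplaceTempView("orders_v")

# 1) SparkSQL: daily totals per customer for the last 7 days.
daily_totals_sql = spark.sql("""
    SELECT order_date,
           customer_id,
           SUM(amount) AS total_amount,
           COUNT(*)    AS order_count
    FROM orders_v
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), 7)
    GROUP BY order_date, customer_id
""")

# 2) The same aggregation written with the PySpark DataFrame API.
daily_totals_df = (
    orders
    .filter(F.col("order_date") >= F.date_sub(F.current_date(), 7))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

daily_totals_df.show(10, truncate=False)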
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1520773