Posted on: 23/07/2025
Analyst - Big Data Developer
We are seeking a highly skilled Analyst - Big Data Developer to join our dynamic team.
The ideal candidate will have extensive experience with big data technologies and a strong background in developing and optimizing data integration frameworks and applications.
You will be responsible for designing, implementing, and maintaining robust data solutions in a cloud environment.
Key Responsibilities:
- Data Solution Development: Design and implement batch and real-time big data integration frameworks and applications using technologies such as Hadoop, Apache Spark, and related tools.
- Performance Optimization: Identify performance bottlenecks and apply best practices to optimize and fine-tune big data frameworks.
- Programming: Develop and maintain code in multiple programming languages, including Java, Scala, and Python, ensuring code quality and adherence to best practices.
- Schema Design: Apply principles and best practices in schema design to various big data technologies, including Hadoop, YARN, Hive, Kafka, Oozie, and NoSQL databases such as Cassandra and HBase.
- Cloud Integration: Work with cloud platforms, preferably GCP, to deploy and manage big data solutions.
- Linux Environment: Use system tools and scripting languages to work effectively in a Linux environment and integrate with various frameworks and tools.
- Collaboration: Collaborate with cross-functional teams to understand requirements and deliver solutions that meet business needs.
- Troubleshooting: Diagnose and resolve issues related to big data applications and frameworks, ensuring data integrity and system reliability.
- Documentation: Maintain comprehensive documentation of development processes, configurations, and operational procedures.
Required Skills and Qualifications:
- Experience: 2 to 5 years of experience in a recognized global IT services or consulting company, with hands-on expertise in big data technologies.
- Big Data Technologies: Over 2 years of experience with the Hadoop ecosystem, Apache Spark, and modern frameworks and tools such as Impala and Kafka.
- Programming: Proficiency in Java, Scala, and Python, with the ability to code in multiple languages.
- Cloud Platforms: Experience with cloud platforms, preferably GCP.
- Linux Environment: At least 2 years of experience working in a Linux environment, including system tools, scripting languages, and integration frameworks.
- Schema Design: Extensive experience applying schema design principles and best practices to big data technologies.
- Hadoop Distributions: Knowledge of Hadoop distributions such as EMR, Cloudera, or Hortonworks.
Preferred Skills:
- Certification in relevant big data or cloud technologies.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1517635