Posted on: 24/10/2025
Description :
What We Do :
At ClearTrail, work is more than just a job. Our calling is to develop solutions that empower those dedicated to keeping their people, places and communities safe.
For over 24 years, law enforcement & federal agencies across the globe have trusted ClearTrail as their committed partner in safeguarding nations & enriching lives.
We are envisioning the future of intelligence gathering by developing artificial intelligence and machine learning based lawful interception & communication analytics solutions that solve the world's most challenging problems.
Location : Indore/Noida.
Roles and Responsibilities :
- Deploying and administering the Hortonworks, Cloudera and Apache Hadoop/Spark ecosystems.
- Installing the Linux operating system and configuring networking.
- Writing Unix shell and Ansible scripts for automation.
- Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, HBase etc.
- Handling the day-to-day running of Hadoop clusters using Ambari, Cloudera Manager or other monitoring tools, ensuring that the clusters are up and running at all times (a minimal monitoring sketch follows this list).
- Maintaining HBase clusters and performing capacity planning.
- Maintaining Solr clusters and performing capacity planning.
- Working closely with the database, network and application teams to make sure that all big data applications are highly available and performing as expected.
- Managing the KVM virtualization environment.
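For illustration only, the sketch below shows the kind of lightweight automation this role involves: polling Ambari's REST API for the state of the core Hadoop services named above. The server URL, cluster name, credentials and service list are placeholder assumptions, not details taken from this posting.

```python
#!/usr/bin/env python3
"""Minimal sketch: check core Hadoop service states via Ambari's REST API.

Host, cluster name, credentials and the service list below are
hypothetical placeholders; adapt them to the actual environment.
"""
import requests

AMBARI_URL = "http://ambari.example.local:8080"  # assumed Ambari server
CLUSTER = "prod"                                 # assumed cluster name
AUTH = ("admin", "admin")                        # assumed credentials
SERVICES = ["HDFS", "YARN", "ZOOKEEPER", "KAFKA", "HBASE", "SPARK2"]


def service_state(service: str) -> str:
    """Return the Ambari-reported state of a service (e.g. STARTED, INSTALLED)."""
    resp = requests.get(
        f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/services/{service}",
        auth=AUTH,
        headers={"X-Requested-By": "ambari"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["ServiceInfo"]["state"]


if __name__ == "__main__":
    for svc in SERVICES:
        state = service_state(svc)
        flag = "OK" if state == "STARTED" else "CHECK"
        print(f"{svc:<10} {state:<10} {flag}")
```

A script like this could be scheduled from cron or wrapped in an Ansible task so that any service not in the STARTED state triggers an alert.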
Qualifications :
- Experience : 3 to 5 years of experience in the role of Hadoop and Big Data Administration.
- Education : BE (IT/Computers) or MCA.
Posted in : Data Engineering
Functional Area : Big Data / Data Warehousing / ETL
Job Code : 1564381