Job Title : Big Data Administrator
Engagement Type : Contract
Experience Required : 5+ Years
Location : Remote
Key Responsibilities :
- Manage and maintain Hadoop ecosystem components including HDFS, YARN, Hive, and Impala.
- Perform Cloudera/HDP cluster administration including patching, upgrades, and configuration management.
- Monitor system performance, diagnose issues, and implement performance tuning and optimization.
- Implement and enforce security controls, including Kerberos, Ranger, and role-based access management.
- Manage user access, permissions, and data storage policies.
- Perform proactive cluster health checks, troubleshoot job failures, and ensure efficient resource utilization.
- Support data engineers in running and debugging ETL/data lake operations.
- Automate routine administrative tasks and improve monitoring coverage using scripting and standard tooling (a minimal sketch follows this list).
- Maintain documentation related to configurations, operations, and best practices.
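
The sketch below is a minimal, hypothetical example of the kind of health-check automation described above: it parses the output of the standard Hadoop CLI command hdfs dfsadmin -report and flags under-replicated blocks or dead DataNodes. The script name, checks, and output patterns (which vary across Hadoop versions) are assumptions for illustration, not part of this role's toolchain.

    #!/usr/bin/env python3
    # Minimal health-check sketch: run the standard Hadoop CLI report and
    # flag common warning signs. Patterns assume a Hadoop 3.x report layout;
    # a real deployment would feed results into Nagios/Grafana rather than print.
    import re
    import subprocess
    import sys

    def dfs_report() -> str:
        # 'hdfs dfsadmin -report' prints capacity, block, and DataNode status.
        return subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout

    def find_issues(report: str) -> list:
        issues = []
        under = re.search(r"Under replicated blocks:\s*(\d+)", report)
        if under and int(under.group(1)) > 0:
            issues.append(under.group(1) + " under-replicated blocks")
        dead = re.search(r"Dead datanodes\s*\((\d+)\)", report)
        if dead and int(dead.group(1)) > 0:
            issues.append(dead.group(1) + " dead DataNode(s)")
        return issues

    if __name__ == "__main__":
        problems = find_issues(dfs_report())
        if problems:
            print("WARNING: " + "; ".join(problems))
            sys.exit(1)  # non-zero exit lets cron or a monitor raise an alert
        print("Cluster report looks healthy")

In practice a script like this would run under cron or be wrapped as a monitoring check, so that a non-zero exit surfaces as an alert.
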
Key Skills & Qualifications :
- 5+ years of experience in Big Data Administration.
- Strong hands-on experience with the Hadoop ecosystem (HDFS, YARN, Hive, Impala).
- Proven expertise in Cloudera/Hortonworks (HDP) platform administration.
- Solid understanding of Linux OS (Red Hat/CentOS/Ubuntu) with proficiency in shell scripting.
- Experience implementing security frameworks (Kerberos, LDAP, Ranger/Sentry); a minimal Kerberos sketch follows this list.
- Proficient in monitoring and logging tools (e.g., Cloudera Manager, Ambari, Nagios, Grafana).
- Strong troubleshooting and problem-solving skills.
- Good understanding of data lake concepts and distributed computing.
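
For the Kerberos item above, the following is a minimal, hypothetical sketch of acquiring a ticket from a keytab before running commands against a secured (Kerberized) cluster. The keytab path and principal are placeholders, not values from this posting.

    #!/usr/bin/env python3
    # Hypothetical sketch: obtain a Kerberos ticket from a keytab, then run an
    # HDFS command against a Kerberized cluster. Keytab path and principal
    # below are placeholders only.
    import subprocess

    KEYTAB = "/etc/security/keytabs/hdfs.headless.keytab"  # placeholder path
    PRINCIPAL = "hdfs@EXAMPLE.COM"                          # placeholder principal

    def kinit() -> None:
        # 'kinit -kt <keytab> <principal>' acquires a ticket non-interactively.
        subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)

    def list_root() -> str:
        # Subsequent Hadoop CLI calls authenticate via the ticket cache.
        return subprocess.run(
            ["hdfs", "dfs", "-ls", "/"],
            capture_output=True, text=True, check=True,
        ).stdout

    if __name__ == "__main__":
        kinit()
        print(list_root())
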
Posted in : Data Engineering
Functional Area : Database Admin / Development
Job Code : 1517984