Responsibilities:
1. Cluster maintenance, as well as creation and removal of nodes, using tools like Cloudera Manager Enterprise (a monitoring sketch follows this list).
2. Monitor Hadoop cluster connectivity and security.
3. High-level understanding of the tools infrastructure life cycle (StreamSets, Kafka, CDSW, Kinetica, Jenkins, ZoomData and other analytical tools).
4. Coordinate vulnerability scans and Cloudera TSBs, maintain the risk register and present it to management.
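Much of the cluster and node work above is typically driven through the Cloudera Manager REST API. Below is a minimal read-only monitoring sketch in Python; the host name, port, API version (v19) and credentials are placeholders and assumptions, not values from this posting.

    # Minimal monitoring sketch: poll Cloudera Manager's REST API for cluster
    # and host health. The CM host, port, API version and credentials below
    # are assumptions; substitute the values used in your environment.
    import requests

    CM_BASE = "https://cm.example.com:7183/api/v19"   # assumed CM endpoint
    AUTH = ("admin", "admin-password")                 # assumed credentials

    def get_items(path):
        """Fetch a CM API collection and return its 'items' list."""
        resp = requests.get(f"{CM_BASE}/{path}", auth=AUTH, verify=False)
        resp.raise_for_status()
        return resp.json().get("items", [])

    if __name__ == "__main__":
        # Cluster-level status, then per-host health summaries.
        for cluster in get_items("clusters"):
            print("cluster:", cluster.get("displayName"), cluster.get("entityStatus"))
        for host in get_items("hosts"):
            print("host:", host.get("hostname"), host.get("healthSummary"))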
Good-to-Have:
3. Aligning with the UNIX / Database engineering teams to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
4. Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users (see the sketch after this list).
5. Cluster CDH upgrades and patch updates.
6. Impact analysis for hardware upgrades, OS upgrades and patching, and JDK upgrades.
7. Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
8. Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades when required.
9. Vendor case management.
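Item 4 above is the most procedural of these duties. The sketch below walks through it under stated assumptions: it runs with root privileges on a gateway node that has useradd, kadmin.local, the HDFS client and beeline on the PATH, and the EXAMPLE.COM realm, hostnames and JDBC URL are placeholders rather than site values.

    # Sketch of the new-user onboarding steps from item 4: Linux account,
    # Kerberos principal, HDFS home directory, then HDFS and Hive smoke tests.
    # The realm, hostnames and JDBC URL are assumptions, not site values.
    import subprocess

    def run(cmd):
        """Echo and run a command, failing loudly on a non-zero exit code."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def onboard_user(username, realm="EXAMPLE.COM"):
        # 1. Linux account on the edge/gateway node.
        run(["useradd", "-m", username])
        # 2. Kerberos principal with a random key (user sets a password later).
        run(["kadmin.local", "-q", f"addprinc -randkey {username}@{realm}"])
        # 3. HDFS home directory owned by the new user.
        run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{username}"])
        run(["hdfs", "dfs", "-chown", username, f"/user/{username}"])
        # 4. Smoke tests: list the HDFS home directory, then a trivial Hive query.
        run(["hdfs", "dfs", "-ls", f"/user/{username}"])
        run(["beeline",
             "-u", "jdbc:hive2://hs2.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM",
             "-e", "SHOW DATABASES;"])

    if __name__ == "__main__":
        onboard_user("new_analyst")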