Hadoop Administrator

Employer
AboutWeb
Location
Woodlawn, Maryland
Posted
Dec 15, 2016
Closes
Jun 01, 2017
Function
Administrative
Hours
Full Time


AboutWeb is currently seeking qualified candidates for a Hadoop Administrator position in Baltimore, MD.

This position calls for a highly skilled technology consultant in the area of Hadoop processing. The individual will work closely with customers and development teams to create, manage, upgrade, and secure Hadoop clusters. Key success metrics for this individual include: 1) proven experience with automation; 2) advanced Linux/Windows system administration capabilities; 3) the ability to thrive within a dynamic technology environment.

Main responsibilities:
•Build the Hadoop ecosystem (Hadoop, Hive, Pig, Oozie, Hue, HBase/Cassandra, Flume) using both automated toolsets and manual processes.
•Maintain, support, and upgrade Hadoop clusters.
•Monitor jobs, queues, and HDFS capacity.
•Balance, commission, and decommission cluster nodes.
•Apply security (Kerberos/OpenLDAP), integrating with Active Directory and/or LDAP.
•Enable users to view job progress via a web interface.
•Onboard users to Hadoop: configuration, access control, disk quotas, permissions, etc.
•Address all issues; apply upgrades and security patches.
•Commission/decommission nodes; perform backup and restore.
•Apply "rolling" cluster node upgrades in a production-level environment.
•Rack newly purchased hardware with switches; assign IP addresses, configure firewalls, enable/disable ports, set up VPN access, etc.
•Work with the virtualization team to provision and manage HDP cluster components.
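
For context, the node commissioning/decommissioning and balancing duties above are typically driven through standard HDFS administrative commands. The following is a minimal sketch, not a prescribed procedure; the hostname and exclude-file path are illustrative assumptions and must match the dfs.hosts.exclude setting in hdfs-site.xml on the NameNode:

```shell
# Add the node to the HDFS exclude file (illustrative path; must match
# the dfs.hosts.exclude property configured in hdfs-site.xml).
echo "worker-05.example.com" >> /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read its include/exclude lists; the excluded
# node enters the Decommissioning state and its blocks are re-replicated.
hdfs dfsadmin -refreshNodes

# Check per-DataNode status, including decommissioning progress.
hdfs dfsadmin -report

# After commissioning or decommissioning nodes, rebalance block placement
# so no DataNode's utilization deviates more than 10% from the average.
hdfs balancer -threshold 10
```

These commands require a running cluster with appropriate administrative privileges; vendor tooling (Cloudera Manager, Ambari) wraps the same operations in a web interface.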

Skills Requirements:

Required skills
•Minimum 3 years of Linux/Unix administration.
•Minimum 3 years of experience with (Cloudera/Hortonworks) Hadoop Administration.
•Extensive experience in Hadoop ecosystem including Spark, MapReduce, HDFS, Hive, HBase, and Zeppelin.
•1 year of experience with Hadoop-specific automation (e.g., blueprints).
•1 year of technical experience managing Hadoop cluster infrastructure environments (e.g., data center infrastructure).
•Demonstrable scripting experience in one or more of Python, bash, PowerShell, or Perl.
•1 year of experience with Puppet and/or Chef.
•1 year of virtualization experience with any of VMware, Hyper-V, or KVM.

Desired skills
•Certified Hadoop Admin (Cloudera/Hortonworks).
•Networking (TCP/IP, Routers, IP addressing, use of network tools).
•Analyzing data with Hive, Pig and/or HBase.
•Data ingestion, streaming, or importing/exporting RDBMS data using Sqoop.
•DBA experience.
•RDBMS SQL Development.
•Manage cluster hardening activities through the implementation and maintenance of security and governance components across various clusters.

Education:

Bachelor's Degree in Computer Science plus 11 years of experience.

#AW