This job has expired.
DAtec

Hadoop Administrator

Reston, VA (On-site)


Position: Hadoop Admin

Location: Columbus, OH

Responsibilities:

  • Provide administration and support for the Hadoop cluster
  • Troubleshoot Hadoop ecosystem service issues, perform performance analysis, ensure security, and develop and test the Unix shell, Perl, and Java scripts and code required for Hadoop administration and the core Hadoop ecosystem
  • Implement LDAP integration and access controls for Hadoop business groups
  • Onboard Hadoop users and support their day-to-day tasks
  • Set up, configure, and manage security for Hadoop clusters using Kerberos, integrated with LDAP/AD at the enterprise level
  • Set up and manage backups of Hadoop NameNode metadata, configuration directories, and the PostgreSQL/MySQL databases backing Ambari, Ranger, etc.
  • Install services, configure settings, update Ranger policies, and perform other administration tasks through the Ambari admin tool
  • Implement upgrades, patches, and fixes as needed
  • Set up and manage data access policies through Ranger
  • Set up storage policies
  • Install R and Python packages as business needs require
  • Set up ingestion jobs into HDFS and Hive using Sqoop, NiFi, Storm, and Kafka for flat-file, RDBMS, and other data sources
  • Create Hive structures with appropriate compression, partitioning, and encryption
  • Troubleshoot and resolve failures in MapReduce, SparkR, R, and Sqoop ingestion jobs
  • Set up client connectivity from SAS and Hadoop environments to databases and other sources
  • Monitor Hadoop cluster job performance and perform capacity planning
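The ingestion duties above can be illustrated with a small sketch: assembling a `sqoop import` invocation that lands an RDBMS table in a compressed, partitioned Hive table. The host, database, and table names are hypothetical, and this is only one way such a job might be wired up.

```python
# Illustrative sketch: build a `sqoop import` command for a Hive-backed
# ingestion job with Snappy compression and a static partition.
# All connection details below are hypothetical examples.

def build_sqoop_import(jdbc_url, table, hive_db, hive_table,
                       partition_key, partition_value):
    """Assemble a sqoop import command string that imports an RDBMS table
    into a compressed, partitioned Hive table."""
    args = [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--hive-import",
        "--hive-database", hive_db,
        "--hive-table", hive_table,
        "--hive-partition-key", partition_key,
        "--hive-partition-value", partition_value,
        "--compress",
        "--compression-codec", "org.apache.hadoop.io.compress.SnappyCodec",
    ]
    return " ".join(args)

# Hypothetical source database and target table:
cmd = build_sqoop_import(
    "jdbc:mysql://dbhost:3306/sales",
    "orders", "staging", "orders", "load_date", "2024-01-01",
)
print(cmd)
```

In practice such commands are usually wrapped in scheduled shell scripts or an orchestration layer, with credentials pulled from a secure store rather than inlined.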

Required Qualifications:

  • 4 years of experience with Hadoop administration, of which at least 2 years with Hortonworks or Cloudera Hadoop is mandatory
  • 4 years of experience developing a common ingestion framework, including Hive structure creation, compression, and encryption
  • 4 years of experience with Unix shell scripting and Perl
  • 3 years of experience with tools such as NiFi, HBase, Spark, Pig, Storm, and Flume
  • 3 years of experience with Hive, including creation of schema structures, partitioning, and performance tuning
  • 3 years of experience programming in Python, R, or Spark
  • Experience setting up and configuring Hadoop clusters, including Kerberos, backups, and high availability, is required
  • Advanced knowledge of the Linux environment; ability to program in shell or other high-level languages is a plus
  • Excellent analytical, written, and oral communication skills

Preferred Qualifications:

  • 2 years of experience with Linux administration

Job Type: Contract
