This job has expired.

Hadoop Big Data Support Engineer (100% Remote)

$45 - $78/hour · Remote


Job Title: Hadoop Big Data Support Engineer
Location: Remote (100%)
Visa: USC, GC, H4EAD, GC EAD
Contract Type: W2

External Communities Job Description

Our client, a healthcare IT company, is in need of a Hadoop Support Engineer for the 12am-8am shift in a 24x7 environment. This is a 6-month right-to-hire contract, and the client will support H1 candidates. The right candidate should have 3 years of support experience in a Hadoop environment.

Description

Hadoop Big Data Support Engineer - Night Shift!

This operations support position is responsible for managing data lake jobs and daily production loads. This entails the full scope of the data load process, including source systems, the ingestion engine, process management, and SLA tracking.

The role requires knowledge of data lake operations, HDFS, Kafka, NiFi, HAWQ, and medical data. This position will be required to troubleshoot and resolve data load process issues, including bad data, mapping problems, and performance issues, and to review process functionality for defects.

This position must also have experience with automation, process improvement, documentation, development, SDLC, source control, and versioning as data lake development work will be required from time to time. This position will be responsible for participating in the 24x7 shift schedule. Other duties will be determined as necessary.

Technical

  • Minimum 3 years of experience with Hadoop/HDP in an enterprise data lake environment
  • Experienced with Java, NiFi, Kafka, HDFS, HAWQ, Shell scripting and performance optimization
  • Excellent debugging skills
  • Independently handle data ingestion issues, create root cause analyses, and adhere to data availability SLAs
  • Proactively work on reducing frequently occurring issues
  • Automate manual work and minimize manual intervention
  • Proactively monitor data ingestion finish times and handle surges in data volume
  • Maintain project metrics and identify new metrics based on the business requirements
  • Excellent written and verbal communication
  • Data lake job and process automation, management, and improvements
  • Knowledge and experience with best practices and standards around big data

Professional

  • Data analysis skills
  • Agile methodologies/SCRUM
  • Development documentation
  • Excellent interpersonal and communications skills
  • Capable of working independently and in a team
  • 24x7 support shift schedule

Additional Skills

  • Hortonworks Data Platform and DataFlow
  • Healthcare/Medical/Insurance data experience

Enterprise Req Skills

Java, Scala, Kafka, Hadoop, shell script, HDFS

Top Skills Details

1. 3 years of hands-on with Hadoop data platform experience in a data lake environment.
2. Experience supporting production environments with demonstrated ability to troubleshoot data ingestion issues, create root cause analyses, and adhere to data availability SLAs
3. Solid scripting and programming skills (Java, shell scripting, Kafka, NiFi)

Work Environment

This role covers the 12 AM - 8 AM shift for about 6 months, then moves to business hours upon conversion to a permanent role.

Additional Skills Tags

Kafka, Hadoop, shell script, HDFS

Additional Skills & Qualifications

Experience using NiFi for data movement and Spark Streaming is a plus.

Job Type: Contract

Pay: $45.00 - $78.00 per hour

Schedule:

  • Monday to Friday
  • Night shift

Work Location:

  • Fully Remote

Work Remotely:

  • Yes
