Big Data Administrator
Telecommuting and remote work at the Durham remote location will be considered. Remote work may not begin until full VPN access is granted, which takes approximately two months after onboarding.
Key Required Skills
Hadoop, Ansible, DevOps, Flexibility and Initiative, Red Hat Linux
Position Description
Design and implement Big Data analytic solutions on a Hadoop-based platform. Refine a data processing pipeline focused on unstructured and semi-structured data. Support quick-turn, rapid implementations as well as larger-scale, longer-duration efforts.
Required
- Experience with MapReduce programming on Hortonworks Hadoop and the Hadoop Distributed File System (HDFS), including processing large data stores
- Ability to show flexibility, initiative, and innovation when dealing with ambiguous, fast-paced situations
- Experience with Hortonworks Hadoop, Amazon EMR, Amazon S3, and Spark
- Red Hat IDM
May be scheduled for after-hours on-call support and will be required to apply production packages during non-peak hours.
Desired
- Experience with R
- Experience with Python
- Experience with deploying applications in a Cloud environment using Ansible
- Eclipse and/or IntelliJ IDE
- Java Enterprise Edition
- Git source code control
- Maven for dependency management, builds, and testing
- Bash scripting
- Atlassian Tool Suite (Confluence/Jira/Bitbucket)
Job Type: Contract