
8 HDFS contracts

Spark Developer - Remote

13 days ago
$52/hour · Remote · RIT Solutions, Inc.
Candidates must be on our W2; rate capped at $52/hr.

Contract: 6 months

Development experience with Amazon Web Services (AWS) is expected
Required Skills & Experience
• At least 3-4 years of hands-on experience developing with the Apache Spark streaming and batch frameworks in Scala; PySpark optional
• Spark query tuning and performance optimization
• Service-oriented architecture and data standards such as JSON, Avro, Parquet, Protobuf
• Experience working with SQL technologies such as PostgreSQL, Oracle or equivalent
• Experience working with HDFS, S3, and/or DynamoDB
• AWS Glue experience is a plus
• Experience working with Docker & Kafka
• Exposure to programming languages such as Java or .NET is a plus
• Awareness of CI/CD tools such as Jenkins is a plus

Senior ETL DataStage Developer / Architect

28 days ago
$60 - $70/hour (estimate) · Remote · Numentica LLC
Full Job Description

Sr DataStage ETL Developer or Architect

Large project, planned to run for 2 to 3 years. The objective is to bring data from 100+ sources into Hadoop HDFS using IBM InfoSphere DataStage 11.x.

* Bachelor's Degree in Computer Science or related field.
* 8+ years leading data warehouse projects in DataStage
* 10+ years of development experience, including an ETL tool such as DataStage
* Strong experience with Oracle, SQL Server, SQL, PL/SQL
* Must be proficient in IBM DataStage 11.x

* Experience with Linux and REST APIs
* Must have Shell Scripting / Bash scripting experience
* Experience working in Agile (Scrum a plus)
* Python Experience is nice to have

Work location: Remote for now; onsite in Plano, TX after COVID-19.

Job Type: Contract

Contract Renewal:

  • Likely

Work Remotely:

  • Temporarily due to COVID-19

PDI Developer

21 hours ago
£490 - £590/day (estimate) · Remote · Sanderson

Role: PDI Developer

Location: Remote

Length: 6 months

Rate: negotiable

Our Manchester client is looking for a Pentaho Data Integration (PDI) developer to join them for 6 months, working remotely, with the following skill set. Pentaho experience is a must.

Key skills:

AWS (S3)
General ETL knowledge
Apache Hadoop, Hive, Impala, HDFS, etc.
SOS Berlin JobScheduler
General Scripting

Scala Developer - Remote

22 hours ago


We are looking for a fully remote Big Data Engineer for a long-term contract position with one of the largest healthcare companies in the nation. This client is going through a big data migration that will last until the end of 2021. The candidate's main responsibility will be this migration, which moves data from Oracle into on-prem Hadoop and eventually into AWS. Additional responsibilities include:

Troubleshooting production support issues post-deployment and coming up with solutions as required.

Demonstrating substantial depth of knowledge and experience in specific areas of Big Data, primarily Spark and Scala.


  • 3+ years of Big Data experience (Scala and Spark, not PySpark)
  • 3+ years of recent work experience with AWS in production
  • Experience working with ANSI SQL and ETL concepts
  • Familiar with Hadoop, HDFS, Yarn and Hive
  • Familiar with CI/CD pipeline (Jenkins)
  • Plus: experience with Bash scripting and/or Python

  Job requirements:

    Python, Scala, AWS, Development, Engineering, Spark, Testing, AWS big data, SQL

    Hadoop Big Data Support Engineer - 100% Remote

    13 days ago
    $45 - $78/hour · Remote · Adwait

    Job Title: Hadoop Big Data Support Engineer
    Location: Remote (100%)
    Visa: USC, GC, H4EAD, GC EAD
    Contract Type: W2

    External Communities Job Description

    Our client, a healthcare IT company, is in need of a Hadoop Support Engineer for the 12am-8am shift in a 24x7 environment. This is a 6-month right-to-hire contract, and the client will support H-1B candidates. The right candidate should have 3 years of support experience in a Hadoop environment.


    Hadoop Big Data Support Engineer - Night Shift!

    This operation support position is responsible for managing data lake jobs and daily production loads. This will entail the full scope of the data load processes, including source systems, the ingestion engine, process management, and SLA tracking.

    This will require knowledge of data lake operations, HDFS, Kafka, NiFi, HAWQ, and medical data. The position will be required to troubleshoot and resolve data load process issues, including bad data, mapping and performance problems, and to review process functionality for issues.

    This position must also have experience with automation, process improvement, documentation, development, SDLC, source control, and versioning as data lake development work will be required from time to time. This position will be responsible for participating in the 24x7 shift schedule. Other duties will be determined as necessary.


    • Minimum 3 years of experience with Hadoop/HDP in an enterprise data lake environment
    • Experienced with Java, NiFi, Kafka, HDFS, HAWQ, Shell scripting and performance optimization
    • Excellent debugging skills
    • Independently handle data ingestion issues, create root cause analysis, and adhere to data availability SLA
    • Proactively work on reducing frequently occurring issues
    • Automate manual work and minimize manual intervention
    • Proactively monitor data ingestion finish times and handle surges in data volume
    • Maintain project metrics and identify new metrics based on the business requirements
    • Excellent written and verbal communication
    • Data lake job and process automation, management, and improvements
    • Knowledge and experience with best practices and standards around big data


    • Data analysis skills
    • Agile methodologies/SCRUM
    • Development documentation
    • Excellent interpersonal and communications skills
    • Capable of working independently and in a team
    • 24x7 support shift schedule

    Additional Skills

    • Hortonworks Data Platform and DataFlow
    • Healthcare/Medical/Insurance data experience

    Enterprise Req Skills

    Java, Scala, Kafka, Hadoop, shell script, HDFS

    Top Skills Details

    1. 3 years of hands-on Hadoop data platform experience in a data lake environment.
    2. Experience supporting production environments, with a demonstrated ability to troubleshoot data ingestion issues, create root cause analyses, and adhere to data availability SLAs.
    3. Solid scripting and programming skills (Java, shell scripting, Kafka, NiFi).

    Work Environment

    This role will cover the 12 AM - 8 AM shift for about 6 months, then will move to business hours when converted to a permanent role.

    Additional Skills Tags

    Kafka, Hadoop, shell script, HDFS

    Additional Skills & Qualifications

    Experience using NiFi for data movement and Spark streaming is a plus

    Job Type: Contract

    Pay: $45.00 - $78.00 per hour


    • Monday to Friday
    • Night shift

    Work Location:

    • Fully Remote

    Work Remotely:

    • Yes

    PDI Developer

    4 days ago
    £490 - £600/day (estimate) · Remote · Whitehall Resources Ltd

    PDI Developer

    Whitehall Resources are looking for an experienced PDI Developer for an initial 3 month contract.

    This role will be remote working initially, and Manchester based when movement restrictions are lifted.

    - Ability to work within defined standards and job frameworks.
    - Ensure clear understanding of requirements
    - Work with Architects and Lead Developers to gain high level understanding of solution architecture
    - Should actively participate in stand-ups and sprint meetings
    - Experience in troubleshooting the Pentaho Data Integration server, including platform and tools issues
    - Responsible for unit testing their own work and peer reviews where required to ensure accurate completion of development tasks
    - Familiar with the Git source code repository for code version management and branching
    - Experience with using PDI with relational databases

    - PDI
    - AWS (S3)
    - General ETL knowledge
    - Cloudera
    - Apache Hadoop, Hive, Impala, HDFS, etc.
    - SOS Berlin JobScheduler
    - Vault
    - Jenkins
    - Ansible
    - General Scripting

    Mandatory technical skills -
    - Experience working with Cloudera Hadoop platforms (e.g. EDH)
    - Knowledge of the data acquisition ingestion pipeline (at least good awareness and understanding of the stages the data goes through, so able to pick up and understand how the spreadsheets work)
    - Good knowledge of Pentaho Data Integrator development skills
    - They must have Pentaho experience
    - Experience with using PDI with relational databases.
    - Oracle and MySQL desirable
    - Familiar with the Git source code repository for code version management and branching
    - Operational support of system components
    - Software configuration management/Version control
    - Software release management/Release management of service improvements

    All of our opportunities require that applicants are eligible to work in the specified country/location, unless otherwise stated in the job description.

    Role: PDI Developer
    Job Type: Contract
    Location: Manchester, Lancashire


    IAM Developer (OIM, ADFS, SNOW) contract, remote 3 months ro...

    6 days ago
    £370 - £460/day (estimate) · Remote · Inside IR35 · Staff Worx
    Hands-on engineering experience in DevOps CI/CD, microservices and GCP or AWS. IAM Developer with strong ADFS (Active Directory Federation Services), OIM …

    IAM Developer (Oracle IDM, ADFS, SNOW) contract, remote 3 months rolli

    4 days ago
    Remote · Inside IR35 · Staffworx Limited

    An IAM Developer with strong ADFS (Active Directory Federation Services), OIM (Oracle Identity Manager) and SNOW (software access management) is required by a leading bank to work on a 3-month rolling contract on a remote basis. The ideal candidate will have solid working knowledge of IAM tools and processes, in particular Oracle Identity Manager, ADFS and SNOW access management.

    Working on a new solution for a leading financial services company.

    • Solid commercial experience of designing and implementing Identity & Access Management (IAM) based solutions
    • Software design and development experience of enterprise applications and security solutions
    • Proven experience of designing secure applications within a financial services environment
    • Hands-on engineering experience in DevOps CI/CD, microservices and GCP or AWS
    • OIM (Oracle Identity Manager)
    • ADFS (Active Directory Federation Services)
    • SNOW (Software Access Management)
    • Experience implementing and supporting IAM tools and processes
    • Experience consulting or operating IAM solutions for cloud service providers (Azure or GCP)
    • Ideally a solid understanding of Cloud Security
    • Hands on experience operating, monitoring and troubleshooting IAM services in medium to large cloud environments

    The contract will fall inside IR35, so an umbrella company will be required for the duration of the contract.

    #identitymanagement #iam #idm #staffworx #recruitmentpartner #oim #oracleidentity #pingidentity #auth0 #forgerock #contractstaffing #wfh #remoteworking #iamjobs #iamdeveloper

    This advert was posted by Staffworx Limited - a UK based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.

    Role: IAM Developer (Oracle IDM, ADFS, SNOW) contract, remote 3 months rolli
    Job Type: Contract
    Location: Not Specified
