Sr DataStage ETL Developer or Architect
Large project, planned to run for 2 to 3 years. The objective is to bring data from 100+ sources into Hadoop HDFS using IBM InfoSphere DataStage 11.x.
* Bachelor's Degree in Computer Science or related field.
* 8+ years leading data warehouse projects in DataStage
* 10+ years of development experience, which must include an ETL tool such as DataStage
* Strong experience with Oracle, SQL Server, SQL, and PL/SQL
* Must be proficient in IBM DataStage 11.x
* Experience with Linux and REST APIs
* Must have shell/Bash scripting experience
* Experience working in Agile (Scrum a plus)
* Python experience is a nice-to-have
Work location: Remote for now; will need to be onsite in Plano, TX after Covid-19.
Job Type: Contract
Role: PDI Developer
Length: 6 months
Our Manchester client is looking for a Pentaho Data Integrator developer to join them for 6 months, working remotely, with the following skill set. Must have Pentaho experience.
General ETL knowledge
Apache Hadoop, Hive, Impala, HDFS, etc.
SOS Berlin JobScheduler
We are looking for a Big Data Engineer for a long-term contract position with one of the largest healthcare companies in the nation. The client is going through a big data migration that will last until the end of 2021. The candidate's main responsibility will be working on this migration, which moves data from Oracle into on-prem Hadoop and then eventually into AWS. Additional responsibilities include:
Troubleshooting production support issues post-deployment and developing solutions as required.
Demonstrating substantial depth of knowledge and experience in specific areas of Big Data, primarily Spark and Scala.
Python, Scala, AWS, Development, Engineering, Spark, Testing, AWS Big Data, SQL
Job Title: Hadoop Big Data Support Engineer
Location: Remote (100%)
Visa: USC, GC, H4EAD, GC EAD
Contract Type: W2
External Communities Job Description
Our client, a healthcare IT company, is in need of a Hadoop Support Engineer for the 12am-8am shift in a 24x7 environment. This is a 6-month right-to-hire position, and the client will support H1 candidates. The right candidate should have 3 years of support experience in a Hadoop environment.
Hadoop Big Data Support Engineer - Night Shift!
This operation support position is responsible for managing data lake jobs and daily production loads. This will entail the full scope of the data load processes, including source systems, the ingestion engine, process management, and SLA tracking.
This will require knowledge of data lake operations, HDFS, Kafka, NiFi, HAWQ, and medical data. This position will be required to troubleshoot and resolve data load process issues, including bad data, mapping, and performance issues, and to review process functionality for issues.
This position must also have experience with automation, process improvement, documentation, development, SDLC, source control, and versioning as data lake development work will be required from time to time. This position will be responsible for participating in the 24x7 shift schedule. Other duties will be determined as necessary.
Enterprise Req Skills
Java, Scala, Kafka, Hadoop, shell script, HDFS
Top Skills Details
1. 3 years of hands-on Hadoop data platform experience in a data lake environment.
2. Experience supporting production environments, with demonstrated ability to troubleshoot data ingestion issues, create root cause analyses, and adhere to data availability SLAs.
3. Solid scripting and programming skills (Java, shell scripting, Kafka, NiFi).
This role will cover the 12 AM - 8 AM shift for about 6 months, then will move to business hours when converted to permanent.
Additional Skills & Qualifications
Experience using NiFi for data movement and Spark streaming is a plus
Job Type: Contract
Pay: $45.00 - $78.00 per hour
Whitehall Resources are looking for an experienced PDI Developer for an initial 3 month contract.
This role will be remote initially, and Manchester-based once movement restrictions are lifted.
- Ability to work within defined standards and job frameworks.
- Ensure clear understanding of requirements
- Work with Architects and Lead Developers to gain high level understanding of solution architecture
- Should actively participate in stand-ups and sprint meetings
- Experience troubleshooting the Pentaho Data Integrator server, including platform and tools issues
- Responsible for unit testing their own work, and for peer reviews where required, to ensure accurate completion of development tasks
- Familiar with the Git source code repository for code version management and branching
- Experience with using PDI with relational databases
- AWS (S3)
- General ETL knowledge
- Apache Hadoop, Hive, Impala, HDFS, etc.
- SOS Berlin JobScheduler
- General Scripting
Mandatory technical skills -
- Experience working with Cloudera Hadoop platforms (e.g. EDH)
- Knowledge of the data acquisition/ingestion pipeline (at least good awareness and understanding of the stages the data goes through, so as to be able to pick up and understand how the spreadsheets work)
- Good Pentaho Data Integrator development skills
- They must have Pentaho experience
- Experience with using PDI with relational databases.
- Oracle and MySQL desirable. Familiar with the Git source code repository for code version management and branching.
- Operational support of system components
- Software configuration management/Version control
- Software release management, including release management of service improvements
All of our opportunities require that applicants are eligible to work in the specified country/location, unless otherwise stated in the job description.
An IAM Developer with strong ADFS (Active Directory Federation Services), OIM (Oracle Identity Manager) and SNOW (software access management) skills is required for a leading bank, to work on a 3-month rolling contract on a remote basis. The ideal candidate will have solid working knowledge of IAM tools and processes, in particular Oracle Identity Manager, ADFS, and SNOW access management.
Working on new solution for leading financial services company.
The contract will fall inside IR35, so an umbrella company will be required for the duration of the contract.
#identitymanagement #iam #idm #staffworx #recruitmentpartner #oim #oracleidentity #pingidentity #auth0 #forgerock #contractstaffing #wfh #remoteworking #iamjobs #iamdeveloper
This advert was posted by Staffworx Limited - a UK based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.