HDFS Contract Jobs

Senior Hadoop Architect

2 days ago
Harnham - London, UK

This is an opportunity for a highly motivated and skilled individual who wants to enhance their current skillset and knowledge while working in a dynamic environment, with a key focus on standardising the organisation's Big Data projects on the Hortonworks Data Platform.

THE COMPANY

This bank is one of the largest financial services organisations not only in the UK, but across Europe and the US. It offers scope to develop its employees through a wide array of platforms where you are recognised for the effort and work you put in. With a portfolio spanning the globe, they are looking for talented and driven leaders to embark on this journey with them.

THE ROLE

The Hadoop team currently supports five ecosystems totalling 160 nodes, with further plans to expand the team's resources. More Hadoop ecosystems from other areas within the business function will then fall under this team.

  • Work with development pods and product owners to create and implement solutions based on business requirements and product features.
  • Support the solution design capability within this growing team.
  • Be at the forefront of providing Level 3 support to L1 and L2 teams in all aspects of production and non-production user and technical issues.
  • Mentor the more junior members within the team.
  • Plan and implement product upgrades.

SKILLS REQUIRED

  • Hortonworks & Hadoop experience is essential.
  • Deep knowledge of HDFS, Hive & Yarn.
  • Knowledge of Spark and experience in implementing Knox Gateway (a brief sketch of calling a cluster through Knox follows this list).
  • Working experience with Google Cloud Platform and knowledge of Python and PySpark.
  • The ability to work under pressure, collaborate, interact and engage with different business units/experts.
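
As a rough illustration of the Knox Gateway line above (not something taken from this posting): Knox proxies WebHDFS, so a directory listing becomes a simple authenticated HTTPS call. The gateway host, topology name, credentials and CA path below are all invented placeholders.

    import requests

    # Placeholder gateway URL: https://<gateway-host>:8443/gateway/<topology>
    GATEWAY = "https://knox.example.com:8443/gateway/default"

    # Knox fronts WebHDFS, so LISTSTATUS is an authenticated GET through the gateway
    resp = requests.get(
        f"{GATEWAY}/webhdfs/v1/data/projects",
        params={"op": "LISTSTATUS"},
        auth=("analyst", "secret"),        # Knox topologies commonly use LDAP basic auth
        verify="/etc/pki/knox-ca.pem",     # gateway's TLS certificate chain
    )
    resp.raise_for_status()
    for entry in resp.json()["FileStatuses"]["FileStatus"]:
        print(entry["pathSuffix"], entry["type"])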


HOW TO APPLY

Please register your interest by sending your CV to Krishen Patel via the Apply link on this page.

Data Engineer - Contract

4 days ago
Burns Sheehan - London, UK
Data Engineer
£450 - £500 per day
6 months
SQL - Azure
London
This is an exciting opportunity to join a cross-functional data team who have seen tremendous growth over the past 12 months. You'll be heavily involved in the re-platforming of their data architecture onto Azure and will utilise Microsoft's range of cloud technology.
The team is made up of close to twenty people consisting of Data Engineers, DBAs, and Data Scientists and they've created a culture which focuses on learning with regular demos and workshops being held by various teams.
You'll be responsible for working alongside engineers to develop greenfield microservices on Azure whilst building ETL and real-time data pipelines that meet the needs of a fast-growing business.
It's important that you've got experience working in a high-transaction environment and have an extensive understanding of SQL Server.
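
Purely as an illustration of the kind of real-time pipeline described here, the sketch below uses Spark Structured Streaming (Databricks and Kafka both appear in the stack that follows) to land a Kafka topic in the data lake as Parquet. Broker, topic and storage paths are invented, and the Kafka source assumes the spark-sql-kafka package is on the classpath.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    # Subscribe to a Kafka topic as an unbounded streaming DataFrame
    orders = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
              .option("subscribe", "orders")                      # placeholder topic
              .load())

    # Kafka values arrive as bytes; cast to string before downstream parsing
    parsed = orders.select(col("value").cast("string").alias("payload"))

    # Append the raw stream to the lake, with a checkpoint for fault tolerance
    query = (parsed.writeStream.format("parquet")
             .option("path", "abfss://lake@account.dfs.core.windows.net/raw/orders")
             .option("checkpointLocation", "/checkpoints/orders")
             .start())
    query.awaitTermination()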

Technology Stack:
MS SQL Server
NoSQL - CosmosDB
HDFS - Azure Data Lake
Azure Databricks
C#, .Net
Kafka, RabbitMQ, Event Hubs
PowerBI, Tableau
This is a great opportunity for an experienced Data Engineer to join an incredibly bright team at a crucial time.
If you think you have what it takes, please send your CV now for more information on both the role and company!

Senior Hadoop Architect - Contract

4 days ago
Harnham - London, UK

Senior Hadoop Architect

London

£550 - £650 Per Day

6 Months Contract

One of the world's largest banking and financial services organisations, currently serving more than 30 million customers globally with broad access to growing markets. Their strategy builds on these advantages and positions them to capitalise on future trends within the financial services industry, while allowing them to continuously strengthen and provide a solid foundation for sustainable dividends.

YOUR ROLE AND RESPONSIBILITIES - Senior Hadoop Architect

In this hands-on role you will need extensive experience with Hadoop, Hive and Python.
You must be an HDPCA.

Duties include:

  • Working with development pods to integrate HWX solutions in support of business requirements and product features.
  • Improving solution design capability within the team.
  • Supporting design and development teams, in relation to production and non-production users, with environmental and technical issues.
  • Advising on design, development and integration support.
  • Planning and formulating product improvements.

YOUR SKILLS AND EXPERIENCE

To qualify for this Senior Hadoop Architect role, you will need:

  • HDPCA (Essential)
  • Deep understanding of Hadoop, HDFS, Hive, YARN, Tez
  • Experience with Kerberised clusters and security principals
  • Working knowledge of Spark, Python and PySpark (see the sketch after this list)
  • Experienced in implementing Knox Gateways
  • RHEL / REGEX / Shell Scripting
  • Understanding of Cloud Platforms
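
To make the Spark/PySpark line concrete, here is a minimal sketch of the sort of hands-on task described, assuming a valid Kerberos ticket on the edge node (kinit) and using invented paths and table names:

    from pyspark.sql import SparkSession

    # Hive support lets the job publish results as a warehouse table
    spark = (SparkSession.builder
             .appName("daily-aggregation")
             .enableHiveSupport()
             .getOrCreate())

    # Read raw events from HDFS (placeholder path)
    events = spark.read.parquet("hdfs:///data/raw/events")

    # Aggregate and persist as a Hive table for downstream consumers
    daily = events.groupBy("event_date", "event_type").count()
    daily.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")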

HOW TO APPLY

To apply, do so via this site. For more information on this role or other data engineering roles, get in touch with Sean at Harnham.

KEYWORDS: HDPCA, Hadoop, Python, PySpark, Spark

Senior Hadoop Administrator

4 days ago
Harnham - London, UK

Senior Hadoop Administrator
£550 - £650 PER DAY
London
6 Months

Are you looking for an opportunity to set up and configure five ecosystems on a Hortonworks platform? If you are an expert in all things Hadoop, please apply below!

THE COMPANY:

This is an exciting opportunity to work for a leading financial organisation in their Advanced Analytics team. You will be instrumental in providing support to all the major areas of the bank on a Hortonworks platform. You will be situated in a modern London office, working in an agile team environment.

THE ROLE:

As a Hadoop Administrator, you will be providing L1/2/3 support to production users. You must have a deep knowledge of Hadoop's main components, such as HDFS, HBase, YARN and Pig, in order to set up and upgrade the current Hadoop ecosystems. You will be involved not only in the administration behind five separate ecosystems but also in the low-level architecture.

Specifically, you can expect to be involved in the following:

  • Assembling Hortonworks components into an Apache cluster
  • Working with application teams to support their products

YOUR SKILLS AND EXPERIENCE:

The successful Senior Hadoop Administrator will have the following skills and experience:

  • Be a Hortonworks certified administrator
  • In-depth knowledge of the Hadoop ecosystem
  • An understanding of L1/2/3 support services
  • Commercial experience with big data technologies such as Spark
  • The ideal candidate will have experience working on cloud-based platforms such as GCP

THE BENEFITS:

  • The chance to work in an agile team environment

HOW TO APPLY:
Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Senior Hadoop Administrator

4 days ago
Harnham - Derbyshire, UK

Senior Hadoop Administrator
£550 - £650 PER DAY
London
6 Months

Are you looking for an opportunity to set up and configure five ecosystems on a Hortonworks platform? If you are an expert in all things Hadoop, please apply below!

THE COMPANY:

This is an exciting opportunity to work for a leading financial organisation in their Advanced Analytics team. You will be instrumental in providing support to all the major areas of the bank on a Hortonworks platform. You will be situated in a modern London office, working in an agile team environment.

THE ROLE:

As a Hadoop Administrator, you will be providing L1/2/3 support to production users. You must have a deep knowledge of Hadoop's main components, such as HDFS, HBase, YARN and Pig, in order to set up and upgrade the current Hadoop ecosystems. You will be involved not only in the administration behind five separate ecosystems but also in the low-level architecture.

Specifically, you can expect to be involved in the following:

  • Assembling Hortonworks components into an Apache cluster
  • Working with application teams to support their products

YOUR SKILLS AND EXPERIENCE:

The successful Senior Hadoop Administrator will have the following skills and experience:

  • Be a Hortonworks certified administrator
  • In-depth knowledge of the Hadoop ecosystem
  • An understanding of L1/2/3 support services
  • Commercial experience with big data technologies such as Spark
  • The ideal candidate will have experience working on cloud-based platforms such as GCP

THE BENEFITS:

  • The chance to work in an agile team environment

HOW TO APPLY:
Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Ab Initio Developers x 2

5 days ago
Hays Specialist Recruitment Limited - Northamptonshire, UK

2 x contract jobs for Ab Initio Developers who have a good understanding of Hadoop.

Your new company
Is one of the world's largest System Integration companies.

Your new role
As a Senior Developer you will be expected to possess excellent knowledge of the Ab Initio stack and its implementation in the Big Data space (Data Lake). You will carry out technical design and development of ETL/Hadoop and analytics services and components, contribute to end-to-end architecture and process flow, and understand business requirements and publish reusable designs. The role calls for a results-oriented approach with the ability to provide apt solutions, proficiency in performance improvement and fine-tuning of ETL and Hadoop implementations, and conducting code reviews across projects. You will take responsibility for ensuring that builds and code adhere to architectural and quality standards and policies, and be able to work independently with minimum supervision. Strong analytical and problem-solving skills and experience with SQL, including advanced SQL, are expected.

What you'll need to succeed
Relevant years of hands-on experience with Ab Initio and Hadoop technologies (HDFS, Hive, Impala, Scala, Spark, Pig, etc.). Experience with relational databases such as Oracle and SQL Server, and with PL/SQL. Understanding of Agile methodologies as well as SDLC life-cycles and processes. Expertise in ETL technology (Ab Initio, Hadoop).

What you'll get in return
A 9-month contract is being offered by the client, and you can be based in London or Northampton.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now on .

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk

Last 90 Days

Senior Platform Lead

7 days ago
Cititec - London, UK

SENIOR PLATFORM ENGINEER

6 MONTH CONTRACT

LONDON

£550 PD

Role:

My client is searching for a Senior Platform Engineer. You will be responsible for all aspects of platform design, definition, implementation, operation, support and maintenance. The role will combine elements of platform engineering, build, deployment and operational support across our on-premise and cloud estate.

Key Responsibilities

  • Work with a team of platform engineers to deliver high-quality platform solutions to plan
  • Provide oversight and guidance on design and implementation to other team members
  • Work with the project manager to estimate and plan deliverables.
  • Work with architects to review and challenge designs and deliver supportable platform solutions
  • Work with operations manager to implement appropriate operational procedures
  • Ensure a strong commitment from the team to continuous improvement of platform development, process and operations
  • Ensure SDLC processes are followed for the platform including automation of build, deploy and test processes
  • Accountable for ensuring platform and process documentation is appropriate and up to date
  • Accountable for the production of delivery artefacts and for ensuring appropriate quality assurance processes are followed
  • Accountable for ensuring environments are available to support business and development timescales

Essential Skills / Experience:

  • Experience of Off-the-shelf package implementation and support
  • Experience of bespoke application build implementation and support
  • Cloud (AWS) application and infrastructure services: VPC, EC2, S3, IAM (a short sketch follows this list)
  • Development and maintenance of build scripts and use of Jenkins for build automation
  • Unix shell scripting in bash / ksh; Windows scripting
  • Python development
  • Ansible/Chef/Puppet or similar deployment scripting language
  • Experience of unit test automation
  • Experience of continuous integration/deployment and development techniques
  • Exposure to Scrum/Agile delivery methodologies
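
As a brief, hedged sketch of the AWS services bullet above (the bucket, prefix and VPC id are placeholders, and credentials are assumed to come from an instance role or configured profile rather than anything in this posting):

    import boto3

    s3 = boto3.client("s3")
    ec2 = boto3.client("ec2")

    # List build artefacts under a deployment prefix in S3
    resp = s3.list_objects_v2(Bucket="example-platform-artifacts", Prefix="builds/")
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])

    # Describe the EC2 instances running inside a given VPC
    reply = ec2.describe_instances(
        Filters=[{"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]}]
    )
    for reservation in reply["Reservations"]:
        for inst in reservation["Instances"]:
            print(inst["InstanceId"], inst["State"]["Name"])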

Desirable Skills / Experience:

  • Application development experience using Java or Python
  • Package integration experience, connecting solutions to source data repositories
  • Understanding of Active Directory, Windows authentication and Kerberos
  • Appreciation of Hadoop Big Data technologies: HDFS, YARN, SQOOP, Hive, Impala and general cluster management (Cloudera)
  • PowerShell development
  • Familiarity with ETL concepts and solutions
  • Familiarity with scheduling and orchestration tools
  • Familiarity with data extraction from 3rd party APIs

If you have the relevant skill set and are interested in this position, please either email me your updated CV or call 0207 608 5822

Keywords: platform engineer / sdlc / python / Hadoop / Big data / java / unix / Jenkins / Ansible / Chef / Puppet / scrum / agile

Big Data Developer

10 days ago
Outsource UK - Northampton, UK

The successful Big Data Developer will work within a team of highly skilled technical experts who support key business requirements for the bank using cutting-edge technologies in the Big Data technology stack.

Key skills required for the Big Data Developer - Banking

  • Expertise coding in Java or Scala
  • Experience with multiple open source tool sets in the Big Data space.
  • Experience with both traditional waterfall and agile release methodologies.
  • Experience in the maintenance, optimisation and issue resolution of Hadoop clusters, supporting business users and batch/streaming processes.
  • Experience configuring and setting up Hadoop clusters, and providing support for aggregation, lookup and fact table creation criteria, MapReduce tuning, Spark job tuning, data node setup, NameNode recovery, HA, Sentry security, etc. (a tuning sketch follows this list).
  • Experience in Linux/Unix OS services, administration, shell and awk scripting.
  • Experience in building scalable applications for the Hadoop ecosystem.
  • Experience in Core Java, CLI tools, Mesos or YARN, Spark, and the Hadoop ecosystem (MapReduce, Hive, Pig, HDFS, HCatalog, Beeline, ZooKeeper, Oozie, HBase, Flume and Kafka).
  • Hands-on experience in SQL (Oracle PL/SQL) and NoSQL databases (HBase/Cassandra/MongoDB).
  • Experience in building large scale real-world backend and middle-tier systems in Java and Hadoop ecosystems.
  • Experience in tool integration, automation and configuration management with Git, Nexus and Jira platforms
  • Excellent oral and written communication and presentation skills, plus analytical and problem-solving skills
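
For the Spark job tuning item above, here is a hedged sketch of the usual knobs; the values are illustrative only, since real settings depend on cluster size and workload:

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("batch-aggregation")
             .config("spark.executor.memory", "8g")            # per-executor heap
             .config("spark.executor.cores", "4")
             .config("spark.sql.shuffle.partitions", "400")    # match data volume
             .config("spark.serializer",
                     "org.apache.spark.serializer.KryoSerializer")
             .getOrCreate())

    df = spark.read.parquet("hdfs:///data/facts/transactions")  # placeholder path

    # Repartition on the aggregation key to spread skew before the wide shuffle
    result = (df.repartition(400, "account_id")
                .groupBy("account_id")
                .sum("amount"))
    result.write.mode("overwrite").parquet("hdfs:///data/aggregates/account_totals")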

If you are a Big Data Developer looking for a new contract, either apply online or, if you would like to find out about other IT/Financial Services opportunities, contact Jamie Rogers on [email protected] or 01793 430021.

Big Data Developer

10 days ago
Haybrook - London, UK
Big Data Developer, Hadoop, AWS, Docker, Kubernetes, Agile, Platform Engineer G1/1070

A growing Engineering client is currently looking for a Big Data Developer to join their team for an initial 6-month contract. Working with an Agile team, you will work to automate and build a platform with Hadoop in AWS. The successful Big Data Developer will have a strong technical background and enjoy keeping up with new technological advances.

Roles and Responsibilities of the Big Data Developer:

  • Design and development of the Big Data store
  • Hands on system development and support
  • Contribute to the design, technical governance and product choices for the data store
  • Represent the programme at the technical design authority as required
  • Ensure the designs address all hosting, security and NFR requirements.
  • Ensure the system is able to gain security accreditation.
  • Technical point of authority
  • Advocate of high quality, maintainable software and systems.
  • Liaising with senior stakeholders

Skills and Experience of the Big Data Developer:

  • Proven experience building a platform
  • Hadoop / HDFS
  • AWS
  • HBase, Kafka, Kerberos
  • Ansible, Terraform
  • Solr
  • Agile
  • Postgres
  • Apache Camel, Knox, Ranger
  • Drools
  • Kubernetes
  • Docker

Big Data Developer, Hadoop, AWS, Docker, Kubernetes, Agile, Platform Engineer G1/1070

Referral Scheme: If this role isn’t for you then perhaps you could recommend a friend or colleague to Haybrook IT. If we go on to place that person in a permanent or temporary capacity then you could be rewarded with £500!! Please see our website for terms and conditions.

Haybrook IT Resourcing is Oxford’s leading IT recruitment agency. With exclusive access to some of the region’s most successful companies, send in your CV today to secure your next IT position.

Haybrook IT Resourcing Ltd acts as an employment agency and an employment business.

We value diversity and always appoint on merit.

Big Data Developer, Hadoop, AWS, Docker, Agile, Platform Engineer

10 days ago
Haybrook IT Resourcing Ltd - Croydon, UK

Big Data Developer, Hadoop, AWS, Docker, Kubernetes, Agile, Platform Engineer G1/1070

A growing Engineering client is currently looking for a Big Data Developer to join their team for an initial 6-month contract. Working with an Agile team, you will work to automate and build a platform with Hadoop in AWS. The successful Big Data Developer will have a strong technical background and enjoy keeping up with new technological advances.

Roles and Responsibilities of the Big Data Developer:

  • Design and development of the Big Data store
  • Hands on system development and support
  • Contribute to the design, technical governance and product choices for the data store
  • Represent the programme at the technical design authority as required
  • Ensure the designs address all hosting, security and NFR requirements.
  • Ensure the system is able to gain security accreditation.
  • Technical point of authority
  • Advocate of high quality, maintainable software and systems.
  • Liaising with senior stakeholders

Skills and Experience of the Big Data Developer:

  • Proven experience building a platform
  • Hadoop / HDFS
  • AWS
  • HBase, Kafka, Kerberos
  • Ansible, Terraform
  • Solr
  • Agile
  • Postgres
  • Apache Camel, Knox, Ranger
  • Drools
  • Kubernetes
  • Docker

Big Data Developer, Hadoop, AWS, Docker, Kubernetes, Agile, Platform Engineer G1/1070

Referral Scheme: If this role isn’t for you then perhaps you could recommend a friend or colleague to Haybrook IT. If we go on to place that person in a permanent or temporary capacity then you could be rewarded with £500!! Please see our website for terms and conditions.

Haybrook IT Resourcing is Oxford’s leading IT recruitment agency. With exclusive access to some of the region’s most successful companies, send in your CV today to secure your next IT position.

Haybrook IT Resourcing Ltd acts as an employment agency and an employment business.

We value diversity and always appoint on merit.

Data Scientist

10 days ago
Newcross Healthcare - London, UK

Newcross Healthcare's journey began over 20 years ago. Since then we have innovated, grown and developed to become one of the UK's most successful healthcare staffing providers, leading the way within our sector.

We've come far, but our ambitions are big and we're now undergoing a fundamental change in the way that we conduct our business, putting data at the heart of our decision-making. We are looking for an experienced and talented Data Scientist to organise and analyse the large amounts of company data we produce in order to draw actionable insight and drive commercial decisions.

Job title: Data Scientist - Contract (3 months initially with possible extension)

Reporting to: Chief Technology Officer

Location: London - Brand new office at WeWork South Bank Central - with free coffee & beer!

Hours of work: Monday to Friday, 37.5 hours per week

Package: £400 - £500 per day

This is a unique and interesting role, working as part of a small, agile team to conceive, build and iterate on a range of exciting projects, using data to drive new solutions and exciting technologies from the ground up.

Some projects the team are currently working on include:

  • We're building a modern digital platform where data is key
  • We're talking directly to users to understand their pain-points
  • We're using analytics to identify opportunities for the business to grow
  • We're designing and building mobile apps, tablet apps and web apps
  • We are using a range of new tools and technologies
  • We're creating systems to ensure consistency and to ensure we can move quickly as we grow

The role….

  • Work with the CTO and other business functions to identify opportunities for leveraging company data to draw actionable insight and drive business solutions.
  • Analyse internal data to identify business issues and drive the development team to implement meaningful features.
  • Assess the effectiveness and accuracy of new data sources and data gathering techniques.
  • Develop algorithms and data models to apply across solutions across the business.
  • Create processes and tools to monitor and analyse model performance and data accuracy.

You….

If you are a positive and driven person who enjoys working collaboratively then you will fit in great with our team. You will also need skills/ experience within a number of these key areas:

  • Experience of using machine learning techniques (decision tree learning, clustering, regression…); a brief sketch follows this list.
  • Experience with distributed data/computing tools: Elasticsearch is essential. Experience with Hadoop, HDFS or Spark would be a plus…
  • Experience with databases such as SQL Server or BigQuery.
  • Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest, Boosting, Trees, text mining, Big Data extracts.
  • Experience analysing data from 3rd party providers: Google Analytics, Google Firebase, Tableau…
  • Strong problem-solving skills with an emphasis on product development.
  • Experience using statistical computing languages (R, Python, JavaScript, etc.) to manipulate data and create reports/insights from large data sets.
  • A drive to learn and master new technologies and techniques.
  • A Master's degree in Maths or Statistics, or equivalent; a PhD would be a plus.
  • Previous experience manipulating and transforming large sets of data.
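
As a loose illustration of the machine-learning bullet above (synthetic stand-in data, not a method taken from this posting): fitting and scoring a random forest with scikit-learn.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for company data
    X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    # Fit a random forest and report held-out accuracy
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))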

Java Developer - Spring - AWS

11 days ago
Proactive Appointments - Glasgow, UK

Java Developer - Spring - AWS 

Java Developer - Spring - AWS - Glasgow - Our client, a multinational services company, is looking for a Java Developer with a strong Java development background and an emphasis on Spring and AWS. If you have a blend of the following skills, please forward your CV in the first instance:

  • Core Java, web services, Spring Framework (must have Spring Boot), Oracle SQL with performance knowledge.
  • Proven Data Processing skillset with experience in HDP tools and techniques. 
    • Kafka Real-time messaging
    • Spark
    • HBase modeling and development
    • Spark processing and performance tuning
    • HDFS file formats and partitioning, e.g. Parquet, Avro, etc. (a partitioned-write sketch follows this list)
    • Impala/Hive
    • Unix Shell Scripting
    • Proficiency in Scala
    • Working proficiency in developmental toolsets like Eclipse, IntelliJ
    • Exposure/competence with an Agile Development approach
    • Solid experience utilizing Source code control software (e.g. GIT, Subversion)
    • Multi-threaded Programming
    • Jenkins/Maven
    • FindBugs, Sonar, JUNIT, Performance, Memory Management
  • Excellent communication skills
  • Good understanding of software development life cycle
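
For the file-format and partitioning bullet above, a minimal PySpark sketch (source path, output path and column names are invented) that lands data as Parquet partitioned by date, so Impala/Hive can prune partitions at query time:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("trade-landing").getOrCreate()

    trades = spark.read.json("hdfs:///landing/trades")  # placeholder source

    # Partitioning on trade_date lets query engines skip irrelevant directories
    (trades.write.mode("append")
           .partitionBy("trade_date")
           .parquet("hdfs:///warehouse/trades_parquet"))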

Due to the volume of applications received for positions, it will not be possible to respond to all applications and only applicants who are considered suitable for interview will be contacted. 

Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation

We take our obligations to protect your personal data very seriously.  Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website http://proactive.it/privacy-notice/

Big Data Developer/ Modeller

21 days ago
TXM Recruit Ltd - Edinburgh, UK

Big Data Developer/Modeller

A great opportunity for an experienced Big Data Developer/Modeller to join an established financial services organisation in Edinburgh, with the option to work regularly from home. A rolling contract, looking for someone to start ASAP.

Requirements:

  • In-depth understanding of big data processing technologies: Hadoop/Spark/Hive/MapReduce/Impala/Kafka etc.
  • Programming experience: Java, Python, SQL etc.
  • Provide project guidance/consultancy on limitations, capabilities, performance etc.
  • Undertake reverse engineering of physical data models
  • Analyse data-related system integration issues and propose appropriate solutions
  • Undertake query performance analysis; provide guidance and feedback into data model changes as necessary
  • Financial services experience with an aligned standard logical model
  • Develop best practices in physical modelling
  • Analyse logical data models and create appropriate physical data models
  • Structure data on HDFS for optimal performance and efficiency
  • Select optimal data structures for Hive/Impala processing, including a normalising/denormalising approach
  • Understanding of storage options on Hadoop, including compression and file formats
  • Develop a partitioning, bucketing and indexing strategy (sketched briefly below)
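
A hedged sketch of the partitioning and bucketing item above, using Spark's DataFrameWriter (note this produces Spark-managed bucketing rather than Hive-native buckets); the table, path and column names are invented for illustration:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    txns = spark.read.parquet("hdfs:///staging/transactions")  # placeholder path

    # Partition on a low-cardinality date column so queries can prune I/O,
    # and bucket on the join key so joins on account_id avoid a full shuffle
    (txns.write.mode("overwrite")
         .partitionBy("txn_date")
         .bucketBy(32, "account_id")
         .sortBy("account_id")
         .saveAsTable("finance.transactions"))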

Hadoop Administrator

22 days ago
Harnham - London, UK

HADOOP ADMINISTRATOR
£550 PER DAY
CENTRAL LONDON

6 MONTHS

Are you looking to work on big data projects on a Hortonworks Data Platform? If you are a Hadoop enthusiast, this is the role for you!

THE COMPANY:

As a Hadoop Administrator you will have the chance to work for one of the major financial organisations, with a customer base of over 30 million. You will be working in the Financial Crime Risk department, leading several projects within Hortonworks, and working as part of a team to lead the implementation and stabilisation of Hadoop ecosystems.

THE ROLE:

You will be working with product owners to implement HWX solutions to support the business's requirements. As a Hadoop Administrator you will also be providing L1/2/3 support to production users. As the implementation of Hadoop ecosystems will be a large part of the role, you will need a deep knowledge of the main components such as HDFS, Hive, HBase and YARN.

Specifically, you can expect to be involved in the following:

  • Leading the team in implementing Hadoop ecosystems
  • Working with product owners and development pods to create solutions to business requirements

YOUR SKILLS AND EXPERIENCE:

The successful Hadoop Administrator will have the following skills and experience:

  • Deep knowledge of the Hadoop ecosystem
  • Good working knowledge of Spark
  • Exposure to cloud-based platforms such as GCP
  • An understanding of L1/2/3 support services

THE BENEFITS:

  • The opportunity to lead a team in big data projects for a Global Bank's Hortonworks platform
  • The chance to use the latest cloud technologies

HOW TO APPLY:
Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.