Impala Contract Jobs

High rate for Impala in UK

Senior Big Data Engineer

4 days ago
Harnham - Cambridge, UK

Senior Big Data Engineer
Cambridge
6-month Contract
£600 per day

As a Senior Big Data Engineer, you will be using cutting-edge Big Data technologies to deliver value for a leading retailer.

THE COMPANY:
This company are a well-known retailer with a huge global presence and customer base. Because of this, they have a strong focus on how Big Data tools can help generate value for their business. You will be situated in an agile team helping to deliver big data projects based on business requirements.

THE ROLE:
As a Senior Big Data Engineer, you will be using streaming technologies like Kafka and Spark for real-time purposes. It is essential that you have a good understanding of the Hadoop ecosystem, including Storm, HBase, Hive and Impala, as you will be working in a Hadoop environment. As a Senior Big Data Engineer, you will be responsible for mentoring team members in delivering best practices. You will be involved in developing and maintaining critical data systems within the company.
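
Roles like this pair Kafka with Spark for real-time processing: messages are consumed in micro-batches and folded into a running aggregate. A minimal pure-Python sketch of that pattern (illustrative only; not the Spark or Kafka APIs, and the batch data is invented):

```python
from collections import Counter

def process_micro_batch(totals: Counter, batch: list) -> Counter:
    """Fold one micro-batch of messages into the running word totals."""
    for message in batch:
        totals.update(message.split())
    return totals

# Simulated micro-batches, standing in for messages polled from a Kafka topic.
batches = [
    ["spark streaming", "kafka spark"],
    ["hive impala", "spark"],
]

totals = Counter()
for batch in batches:
    totals = process_micro_batch(totals, batch)

print(totals["spark"])  # 3 -- running count of "spark" across batches
```

In Spark Structured Streaming the framework handles the polling, batching and state for you; the sketch only shows the stateful-aggregation idea the adverts are asking candidates to understand.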

YOUR SKILLS AND EXPERIENCE:
The ideal Senior Big Data Engineer will have:

  • Worked with streaming technologies (Spark/Kafka)
  • Strong coding and design skills in Scala
  • Exposure to Hadoop, Hive, HBase, Storm
  • Previous experience within the software industry
  • Previous experience communicating with stakeholders

HOW TO APPLY:
Please register your interest by sending your CV via the Apply link on this page.

Lead Data Engineer - Google Cloud Platform

5 days ago
REED Technology - Birmingham, UK
This is an exciting opportunity for Lead Engineers to join a market leading business, building Data Intelligence solutions on the Google Cloud Platform on a contract basis. Our client is operating at the forefront of Google Cloud Data solutions, proving the capabilities of this exciting new platform in the Data and Analytics world. Successful candidates will gain unprecedented experience of this exciting technology and will benefit from formal training and certification in Google Cloud technologies.

Responsibilities include:


• Implement and deploy big data processing workflows
• Design and build cloud-scale services and APIs
• Handle all aspects of development – design, development, build, deployment, monitoring and operations.
• Research and experiment with emerging technologies and industry trends with a view to bringing business value through early adoption
• Work in an agile, cross-functional team composed of both engineers and data scientists, taking responsibility for delivering predictive data products

Required skills & experiences include:

• BS degree in a technical or engineering field, or equivalent practical experience
• Expert knowledge of Java and/or Python
• Good experience with database design and querying (e.g. SQL, MySQL, PostgreSQL, BigQuery)
• Strong experience (at least 2 big projects) with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN)
• Good experience with Unix-based systems, including bash scripting
• Good experience with columnar data formats and NoSQL document databases
• Experience with other distributed technologies such as Cassandra, Splunk, Solr/Elasticsearch, Flink, Heron or Beam would also be desirable
• Strong experience with Lambda Architectures

Additional Desirable Skills:

• Experience in Google Cloud Platform and/or other public cloud products.
• Technical experience in Web technologies such as HTML, XML, JSON, OAuth 2 along with experience in analysis of relational data in MySQL, Google BigQuery or similar.
• Knowledge of solution architecture within web and mobile environments, web/internet-related technologies, architecture across Software-as-a-Service, Platform-as-a-Service and Infrastructure-as-a-Service (SaaS, PaaS, IaaS), and cloud productivity suites.

Reed Specialist Recruitment Limited is an employment agency and employment business

Ab Initio Developers x 2

5 days ago
Hays Specialist Recruitment Limited - Northamptonshire, UK

2 x contract jobs for Ab Initio Developers who have a good understanding of Hadoop.

Your new company
Is one of the world's largest System Integration companies.

Your new role
As a Senior Developer you will be expected to possess excellent knowledge of the Ab Initio stack and its implementation in the Big Data space (Data Lake). You will take on the technical design and development of ETL/Hadoop and analytics services and components, contribute to end-to-end architecture and process flow, and understand business requirements and publish reusable designs. The role demands a results-oriented approach with the ability to provide apt solutions, and proficiency in performance improvement and fine-tuning of ETL and Hadoop implementations. You will conduct code reviews across projects and take responsibility for ensuring that builds and code adhere to architectural and quality standards and policies. You should be able to work independently with minimum supervision, and bring strong analytical and problem-solving skills along with experience/exposure to SQL, including advanced SQL skills.

What you'll need to succeed
Relevant years of hands-on experience with Ab Initio and Hadoop technologies (HDFS, Hive, Impala, Scala, Spark, Pig etc.). Experience in relational databases like Oracle, SQL Server and PL/SQL. Understanding of Agile methodologies as well as SDLC life-cycles and processes. Expertise in ETL technology (Ab Initio, Hadoop).

What you'll get in return
A 9 month contract is being offered by the client and you can be based in London or Northampton.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now on .

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at hays.co.uk

Last 90 Days

Senior Platform Lead

7 days ago
Cititec - London, UK

SENIOR PLATFORM ENGINEER

6 MONTH CONTRACT

LONDON

£550 PD

Role:

My client is searching for a Senior Platform Engineer. The role will be responsible for all aspects of platform design, definition, implementation, operation, support and maintenance, and will combine elements of platform engineering, build, deployment and operational support across their on-premise and cloud estate.

Key Responsibilities

  • Work with a team of platform engineers to deliver high quality platform solutions to plan
  • Provide oversight and guidance on design and implementation to other team members
  • Work with the project manager to estimate and plan deliverables.
  • Work with architects to review and challenge designs and deliver supportable platform solutions
  • Work with operations manager to implement appropriate operational procedures
  • Ensure a strong commitment from the team to continuous improvement of platform development, process and operations
  • Ensure SDLC processes are followed for the platform including automation of build, deploy and test processes
  • Accountable for ensuring platform and process documentation is appropriate and up to date
  • Accountable for production of delivery artefacts and appropriate quality assurances processes are followed
  • Accountable for ensuring environments are available so as to support business and development timescales

Essential Skills / Experience:

  • Experience of Off-the-shelf package implementation and support
  • Experience of bespoke application build implementation and support
  • Cloud (AWS) application and infrastructure services: VPC, EC2, S3, IAM
  • Development and maintenance of build scripts and use of Jenkins for build automation
  • Unix shell scripting in bash / ksh; Windows scripting
  • Python development
  • Ansible/Chef/Puppet or similar deployment scripting language
  • Experience of unit test automation
  • Experience of continuous integration/deployment and development techniques
  • Exposure to Scrum/Agile delivery methodologies

Desirable Skills / Experience:

  • Application development experience using Java or Python
  • Package integration experience, connecting solutions to source data repositories
  • Understanding of Active Directory, Windows authentication and Kerberos
  • Appreciation of Hadoop Big Data technologies: HDFS, YARN, SQOOP, Hive, Impala and general cluster management (Cloudera)
  • Powershell development
  • Familiarity with ETL concepts and solutions
  • Familiarity with scheduling and orchestration tools
  • Familiarity with data extraction from 3rd party APIs

If you have the relevant skill set and are interested in this position, please either email me your updated CV or call 0207 608 5822

Keywords: platform engineer / sdlc / python / Hadoop / Big data / java / unix / Jenkins / Ansible / Chef / Puppet / scrum / agile

Java Developer - Spring - AWS

11 days ago
Proactive Appointments - Glasgow, UK

Java Developer - Spring - AWS 

Java Developer - Spring - AWS - Glasgow - Our client, a multinational services company, is looking for a Java Developer with a strong Java development background and an emphasis on Spring and AWS. If you have a blend of the following skills, please forward your CV in the first instance:

  • Core Java, Web Services, Spring Framework (must have Spring Boot), Oracle SQL with performance knowledge.
  • Proven Data Processing skillset with experience in HDP tools and techniques. 
    • Kafka Real-time messaging
    • Spark
    • HBase modeling and development
    • Spark processing and performance tuning
    • HDFS file formats and partitioning (e.g. Parquet, Avro)
    • Impala/Hive
    • Unix Shell Scripting
    • Proficiency in Scala
    • Working proficiency in developmental toolsets like Eclipse, IntelliJ
    • Exposure/competence with an Agile Development approach
    • Solid experience utilizing Source code control software (e.g. GIT, Subversion)
    • Multi-threaded Programming
    • Jenkins/Maven
    • FindBugs, Sonar, JUNIT, Performance, Memory Management
  • Excellent communication skills
  • Good understanding of software development life cycle

Due to the volume of applications received for positions, it will not be possible to respond to all applications and only applicants who are considered suitable for interview will be contacted. 

Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation

We take our obligations to protect your personal data very seriously.  Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website http://proactive.it/privacy-notice/

High rate for Impala in London, UK

Scala Developer

19 days ago
Harnham - London, UK

SCALA DEVELOPER

£650 PER DAY

6 MONTHS

CENTRAL LONDON

As a Scala Developer you will be working with Big Data technologies to develop streaming and batch workloads on Spark!

THE COMPANY:

You will be working for a state-of-the-art tech company who are looking to expand their digital platforms to cater for their ever-growing client base. This tech company has a global presence and is looking to expand further. You will be working as part of an agile team of Scala Developers in a modern office situated in the heart of London.

THE ROLE:

As a Scala Developer, you will be helping to build ETL pipelines. You will be using tools such as Spark for streaming, as well as Kafka and Cassandra. You will be working in a Hadoop environment and thus must have knowledge of HBase, Hive and Impala. You will be involved in the following:

  • Working in a Hadoop environment using Spark to develop streaming and batch processing
  • Using Scala for programming and coding
  • Using Kafka to help engineer solutions
  • Working with senior stakeholders and architects to further develop the overall solution

YOUR SKILLS AND EXPERIENCE:

The successful Scala Developer will have the following skills and experience:

  • Extensive exposure to Spark streaming
  • Previous experience engineering in a hosted Hadoop cluster
  • Programming in Scala
  • Working with Kafka in a commercial environment
  • The ideal candidate will have exposure to real time systems such as Storm

THE BENEFITS:

This role offers a very competitive rate of up to £700 per day.

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Big Data Engineer

20 days ago
Harnham - London, UK

Big Data Engineer
Central London
6-month Contract
£650 per day

As a Big Data Engineer, you will be implementing Spark streaming on a bank's Hortonworks Hadoop platform. They have a view to moving into the cloud, which they will expect you to assist with.

THE COMPANY:
This company are a well-established Technology Consultancy who work for market-leading clients across all sectors. They are currently undertaking a major project for their biggest client who is a world-leading Technology provider. They are essentially helping to build and deploy a new Big Data platform for them and need an engineer who can give guidance to their onshore team.

THE ROLE:
As a Big Data Engineer, you will be introducing streaming technologies like Kafka for real-time purposes. This is part of a huge digital transformation project that is taking place. You will be writing the code in Scala and querying data in SQL, on Spark. It is also essential that you have a good understanding of the Hadoop ecosystem, such as Storm, HBase, Hive and Impala. This is a long-term project where you will eventually be migrating to the cloud and working with Data Scientists.
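
"Querying data in SQL, on Spark" means the same aggregation shapes candidates already know from relational databases, just run over Hive/Impala tables via Spark SQL. A sketch of a typical group-and-filter query, using Python's built-in sqlite3 purely as a stand-in engine (the table and data are invented for illustration; the SQL dialect on the cluster differs in detail):

```python
import sqlite3

# Hypothetical events table; in the role this data would live in Hive/Impala
# tables on the cluster and be queried through Spark SQL, not sqlite3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "purchase", 10.0), ("u1", "purchase", 5.0), ("u2", "refund", -3.0)],
)

# The shape of a typical analytical query: filter, group, aggregate, order.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY user_id
    ORDER BY total DESC
    """
).fetchall()

print(rows)  # [('u1', 15.0)]
```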

YOUR SKILLS AND EXPERIENCE:
The ideal Big Data Engineer will have:

  • Designed and developed a Hadoop cluster in a Cloudera or Hortonworks environment
  • Worked with streaming technologies (Spark/Kafka)
  • Expertise with Scala
  • Exposure to Hadoop, Hive, HBase, Storm

HOW TO APPLY:
Please register your interest by sending your CV via the Apply link on this page.

Big Data Developer/ Modeller

21 days ago
TXM Recruit Ltd - Edinburgh, UK

Big Data Developer/Modeller

A great opportunity for an experienced Big Data Developer/Modeller to join an established financial services organisation in Edinburgh, with facilities to work regularly from home. This is a rolling contract, looking for someone to start ASAP.

Requirements:

In-depth understanding of big data processing technologies - Hadoop/Spark/Hive/MapReduce/Impala/Kafka etc.

Programming experience - Java, Python, SQL etc.

Provide project guidance/consultancy on limitations and capabilities, performance etc

Undertake reverse engineering of physical data models

Analyse data-related system integration issues and propose appropriate solutions

Undertake query performance analysis, provide guidance and feedback into data model change as necessary

Financial Services experience with an aligned standard logical model

Develop best practices in physical modelling

Analysis of logical data models and creation of appropriate physical data models

Structure data on HDFS for optimal performance and efficiency

Selection of optimal data structures for Hive/Impala processing including normalising/denormalising approach

Understanding of storage options on Hadoop including compression, file formats

Develop partitioning, bucketing and indexing strategy
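
The partitioning and bucketing strategy asked for above follows a standard Hive/Impala layout: partition-key values become directory names under the table path, and bucketing hashes a column to spread rows across a fixed number of files. A pure-Python sketch of both ideas (table name, keys and crc32 hash are illustrative; Hive uses its own Java hash function for bucketing):

```python
import zlib

def partition_path(table: str, dt: str, country: str) -> str:
    """Hive-style partition layout: one directory per partition-key value,
    so queries filtering on dt/country can prune whole directories."""
    return "/data/{}/dt={}/country={}".format(table, dt, country)

def bucket_for(key: str, num_buckets: int) -> int:
    """Bucketing assigns each row to a file by hashing the bucket column
    modulo the bucket count; crc32 stands in here for Hive's own hash."""
    return zlib.crc32(key.encode()) % num_buckets

print(partition_path("transactions", "2019-01-31", "UK"))
print(bucket_for("customer-42", 8))  # always in range 0..7
```

The payoff is query-time: a filter on a partition column skips entire directories, and equi-joins on a bucketed column can be done bucket-by-bucket.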

Python Developer (Scala Hadoop)

1 month ago
Fairfield Consultancy Services Ltd - Glasgow, UK

Responsibilities

Write effective, scalable Object-oriented code.

Develop back-end components to improve responsiveness and overall performance

Integrate with other systems through APIs.

Create Python Unit tests and debug programs

Implement security and data protection solutions

Assess and prioritize feature requests

Coordinate with internal teams to understand user requirements and provide technical solutions.

Requirements

8+ years of work experience as a Hadoop, Scala and core Python developer, including Python scripting

Expertise in at least one popular Python framework (like Django, Flask or Pyramid)

Knowledge of object-relational mapping (ORM)

Experience in front-end technologies preferably AngularJS or any JavaScript Framework.

Good problem-solving and communication skills.

Good working knowledge in RDBMS.

Good knowledge of the Hadoop ecosystem: HDFS, MapReduce, Hive, Impala, Spark (Core, Streaming, DataFrames, MLlib usage)

Willingness to learn new tools with minimal guidance.

Knowledge of NoSQL such as Cassandra, MongoDB, CouchDB

Working experience in scripts using UNIX.

An understanding of COTS tools for archiving old data.

Following are "Nice to Have"

Experience using R and Python to manipulate data and use it as training data for models.

Experience in working and creating data/ML environments.

Knowledge of Python packages like pandas, NumPy and scikit-learn, plus deep learning.

Knowledge of machine learning algorithms such as clustering, decision tree learning, NLP etc., and of statistical techniques - regression, properties of distributions, statistical tests etc.

Knowledge of cloud computing using AWS / Azure.
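
The requirements above ask for MapReduce knowledge; the canonical illustration of that model is a word count split into map, shuffle and reduce phases. A minimal pure-Python sketch of the model (not Hadoop's API; the input lines are invented):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in the input split.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the grouped counts per word.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big", "data pipelines"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 1}
```

In Hadoop the map and reduce functions run distributed across the cluster and the shuffle happens over the network; the sketch only shows the dataflow candidates are expected to understand.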

Big Data Engineer

1 month ago
Hays Specialist Recruitment Limited - Leeds, UK

BIG DATA / BUSINESS INTELLIGENCE / DATA SCIENCE / CLOUD PLATFORMS / HADOOP / KAFKA / AGILE / TESTING / SQL

BIG DATA ENGINEER
Location - Leeds
Duration - 6 Months
Rate - Up to £550.00 Per Day

Your new role
A Big Data Engineer is required to join our client's Business Intelligence team. The successful Big Data Engineer will be working on the organisation's internal data platforms used for MI and analytics, providing new capability leveraging Cloud and Big Data technology.

What you'll need to succeed
This is a crucial appointment for our client and the Big Data Engineer will be helping to shape and develop the next generation of data platforms used across the organisation. The selection process will therefore require Big Data Engineers to have demonstrable experience in some, or ideally all, of the following areas:

  • Cloud Platform Development (Azure, AWS and Google Cloud Platform)
  • Cloud-based Big Data/Data Warehouse solutions (Snowflake preferred) - including design, development, setup, configuration and monitoring of solutions running on these platforms
  • Strong software development experience in SQL, Python and Scala
  • Experience with Hadoop ecosystems (Spark, Hive/Impala) - including design, development, setup, configuration and monitoring of solutions running on these platforms
  • Kafka (for both real-time data pipelines and stream analytics) - including design, development, setup, configuration and monitoring of solutions running on this platform
  • Experience of Testing and Automation processes associated with Big Data solution development.


What you'll get in return
The Day Rate is up to £550.00 for this Vacancy.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion on your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at hays.co.uk

Contract Big Data Engineer - Hadoop, Spark, Scala, AWS, Azure

1 month ago
Fruition IT - Leeds, UK

Big Data Engineer, Hadoop, Spark, Scala, Snowflake, AWS, Azure

Leeds, Contract

Up to £500/day

Fruition IT are recruiting a Contract Big Data Engineer for a market leading eCommerce business who are a Yorkshire success story.

This is an exciting opportunity to help develop and shape the next generation of data platforms used for MI and analytics providing new capability leveraging cloud and big data technology. You'll also be working closely with the Data Science team to develop robust solutions.

What you need:

  • Cloud Platform Development (Azure, AWS and Google Cloud Platform)
  • Cloud-based Big Data/Data Warehouse solutions (Snowflake preferred) - including design, development, setup, configuration and monitoring of solutions running on these platforms
  • Strong software development experience in SQL, Python and Scala
  • Experience with Hadoop ecosystems (Spark, Hive/Impala) - including design, development, setup, configuration and monitoring of solutions running on these platforms
  • Kafka (for both real-time data pipelines and stream analytics) - including design, development, setup, configuration and monitoring of solutions running on this platform
  • Experience of working in an Agile team producing frequent deliverables
  • Experience of Testing and Automation processes associated with Big Data solution development
  • Any experience with the Dataiku Data Science platform or the Tableau data visualisation platform would be beneficial but not essential, as would any experience with the R programming language.

This is a 6 months contract offering up to £500/day.

To apply for this contract Big Data Engineer role, please send your CV for consideration.

High rate for Impala in London, UK

Platform / DevOps Engineer - London

1 month ago
Search BI - London, UK

Platform / DevOps Engineer - London - 6 month contract

£650-700 per day

The role

As Platform / DevOps Engineer you will support all levels of our client's environments, ensuring availability, stability and security whilst also facilitating data requirements. You'll play a key role defining monitoring frameworks and ensuring implementation across the entire infrastructure, as well as helping with the integration of external applications and dependencies. Crucially, you will provide analysis of current capacity and future growth to inform our client's overall infrastructure capacity planning.

Tasks and Responsibilities

  • Configure a CI/CD pipeline with Jenkins for a data and analytics delivery (Talend, Cloudera, Tableau)
  • Provision AWS Infrastructure components (VPC, ELB, Subnet) using Cloudformation
  • Configure Tableau application performance monitoring and integrate Apache Impala
  • Deploy Cloudera clusters within an AWS environment
  • Use Splunk and Data Dog for monitoring and application logging
  • Kerberos integration with IPA and MDS for Cloudera and Tableau
  • Automate AWS services and resources using Chef

Experience Required

  • Experience building and deploying solutions to AWS
  • Clustering and load balancing (Hadoop Cloudera)
  • Worked with CI/CD in a scaled Agile environment
  • Experience of Talend orchestration
  • Experience of developing enterprise grade ETL/data pipelines with tools like Informatica and Talend
  • Experience using development and deployment technologies, for instance virtualisation and management (Tableau), continuous integration tools (Jenkins)
  • Ability to collaborate with development teams

For more information on this position and other Business Intelligence and Data Analytics roles please visit our website www.searchbi.co.uk

SearchBI are the first-choice Business Intelligence and Agile BI recruitment business in the UK and internationally. We work with software houses, partners and end users throughout the UK and Europe. As unique specialists in this niche marketplace we have positioned ourselves ideally to satisfy the needs of a variety of clients. We understand our industry and use our expansive network to position ourselves as market leaders for all Agile Business Intelligence requirements.

Keywords - Platform Engineer / DevOps Engineer / Data Engineer / AWS Architect / Solutions Architect / Technical Architect / Cloud Architect / DevOps Consultant