Kafka Contract Jobs

Big Data Engineer

4 hours ago
Opus Recruitment Solutions - London, UK
Big Data Engineer | ASAP Start | £450-550 per day | London Based

I am working with a leading retail company looking for an experienced Big Data Engineer to come in, hit the ground running, and create new platforms and developments.

Candidates must have 5 to 10 years' experience.

Commercial experience needed.

Skills:

  • Python
  • AWS
  • Hadoop
  • Spark or Kafka
  • NoSQL

Big Data Engineer | ASAP Start | £450-550 per day | London Based

Python Data Engineer

5 hours ago
Harnham - London, UK

Python Data Engineer
£550 per day
6 Months
Central London

Are you looking for an opportunity to work in an agile team of Data Scientists and Engineers, building robust ETL pipelines in Python? If you are a talented Python Engineer who enjoys productionising algorithms, please apply!

THE COMPANY:

You will be working for a leading media consultancy that has provided high-profile clients with award-winning ad campaigns. The company specialises in giving clients key insights into how best to target consumers through data-driven social media campaigns. The project you will be working on is for a luxury clothing brand, and you will be working in an agile team alongside engineers and data scientists.

THE ROLE:

As a Python Data Engineer, you will be involved in bringing the client's data onto an on-premises solution and productionising algorithms. You will develop new data platforms and maintain existing software. You must have experience designing data lakes within AWS S3 and exposure to its main components, as well as experience with big data technologies such as Hadoop, Hive and Kafka. As part of this project you will help build Proofs of Concept (a brief illustrative sketch follows the list below).

  • Working in an AWS environment to help external teams use the data platform
  • Using Python for high-level coding
  • Helping to migrate the legacy server onto a Cloud platform
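
Purely as an illustration of the kind of work described above (this sketch is not part of the listing), here is a minimal Python consumer that drains a Kafka topic and lands the records in an S3 data lake as newline-delimited JSON. The broker address, topic, bucket, key prefix and batch size are all hypothetical placeholders.

```python
# Minimal sketch: Kafka -> S3 landing zone (illustrative, not from the listing).
import json
import time

import boto3
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "clickstream-events",                     # hypothetical topic name
    bootstrap_servers="localhost:9092",       # hypothetical broker
    group_id="s3-lake-loader",
    auto_offset_reset="earliest",
    enable_auto_commit=False,                 # commit only after a successful write
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                     # flush in small batches
        key = f"raw/clickstream/{int(time.time())}.jsonl"
        body = "\n".join(json.dumps(record) for record in batch)
        s3.put_object(Bucket="example-data-lake", Key=key, Body=body.encode("utf-8"))
        consumer.commit()                     # offsets advance only once data is in S3
        batch = []
```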

YOUR SKILLS AND EXPERIENCE:

The successful Python Data Engineer will have the following skills and experience:

  • Exposure to Spark and Scala
  • Great knowledge of AWS
  • Extensive experience programming in Python
  • Good exposure to testing practices
  • Hands-on experience with handling large sets of data - structured and unstructured.

THE BENEFITS:

This role offers a very competitive rate of £550 per day.

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.


Senior Backend Engineer (Remote)

Promoted
Hydrate Solutions - London, UK

Role: Sr. Backend Engineer (4)

Location: Remote, with quarterly onsite visits to Copenhagen, Denmark

Expected Duration: 6 month contract

Rate: $95-105/hr USD, depending on experience. The advertised GBP day rate is based on 7.5-hour days and is subject to the exchange rate.

Hydrate is a rapidly growing, fully distributed consultancy hyper-focused on accelerating digital transformations in the enterprise. We are looking for exceptional polyglot backend software engineers who are comfortable digging into legacy code to solve complex challenges and who thrive in leading teams to deliver bold new features, leveraging cutting-edge technology solutions.

Experience

We are looking for someone with demonstrated experience in some of the following:

  • Several back-end programming languages and frameworks
  • .NET Core
  • Docker
  • Messaging buses / queues such as Kafka, RabbitMQ, ActiveMQ, MSMQ etc.
  • Distributed caching frameworks such as Aerospike, Memcached, etc

The ideal candidate will be self-motivated, possess excellent communication skills (both oral and written) and be able to work independently.

Benefits

  • Competitive pay
  • The chance to work with some of the best engineers in the business
  • Ability to not simply work on, but drive high-value, highly visible projects for some of the largest companies in the world
  • A leadership team focused on cutting the BS and politics, positioning our staff to do what they do best - deliver value to our clients.

Python Data Engineer

6 hours ago
Harnham - London, UK

Python Data Engineer
£550 per day
6 Months
Central London

Are you looking for an opportunity to work in an agile team of Data Scientists and Engineers, building robust ETL pipelines in Python? If you are a talented Python Engineer who enjoys productionising algorithms, please apply!

THE COMPANY:

You will be working for a leading media consultancy that has provided high-profile clients with award-winning ad campaigns. The company specialises in giving clients key insights into how best to target consumers through data-driven social media campaigns. The project you will be working on is for a luxury clothing brand, and you will be working in an agile team alongside engineers and data scientists.

THE ROLE:

As a Python Data Engineer, you will be involved in bringing the client's data onto an on-premises solution and productionising algorithms. You will develop new data platforms and maintain existing software. You must have experience designing data lakes within AWS S3 and exposure to its main components, as well as experience with big data technologies such as Hadoop, Hive and Kafka. As part of this project you will help build Proofs of Concept.

  • Working in an AWS environment to help external teams use the data platform
  • Using Python for high-level coding
  • Helping to migrate the legacy server onto a Cloud platform

YOUR SKILLS AND EXPERIENCE:

The successful Python Data Engineer will have the following skills and experience:

  • Exposure to Spark and Scala
  • Great knowledge of AWS
  • Extensive experience programming in Python
  • Good exposure to testing practices
  • Hands-on experience with handling large sets of data - structured and unstructured.

THE BENEFITS:

This role offers a very competitive rate of £550 per day.

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Big Data Engineer

7 hours ago
Opus Recruitment Solutions Ltd - London, UK
Big Data Engineer | ASAP Start | £450-550 per day | London Based

I am working with a leading retail company looking for an experienced Big Data Engineer to come in, hit the ground running, and create new platforms and developments.

Candidates must have 5 to 10 years' experience.

Commercial experience needed.

Skills:

- Python
- AWS
- Hadoop
- Spark or Kafka
- NoSQL

Big Data Engineer | ASAP Start | £450-550 per day | London Based

DevOps Engineer with Big Data Experience

3 days ago
SQ Computer Personnel Limited - London, UK

DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

A DevOps Engineer is required for a 2-month rolling contract. The DevOps Engineer will join a growing engineering client who works closely with customers to provide a secure batch and real-time analytical platform, using Data Science, Machine Learning and Advanced Mathematics to create entity-matching solutions for security applications. The DevOps Engineer will work as part of a team to build automated deployments, manage configuration management technology, and deploy new models. The DevOps Engineer will have strong knowledge of Big Data and experience with Big Data technologies such as Hadoop, Kafka and Hortonworks. You will have strong Cloud experience and a working knowledge of Docker and Kubernetes.

Responsibilities of the DevOps Engineer:
  • Build, test and deploy cloud-based infrastructure and applications
  • Code test automation frameworks
  • Perform script maintenance and updates
  • Evaluate existing applications and platforms and provide recommendations for improvement
  • Participate in developing contingency plans
  • Provide technical guidance and system process expertise
  • Document functions and changes to new or modified modules, and test activities/results
Skills and experience of the DevOps Engineer:
  • Experience developing backend components and handling DevOps functions for complex systems
  • Open-source and commercial software within a cloud-centric environment
  • Hadoop (ideally Hortonworks)
  • Kafka
  • Experienced in building and operating logging technologies such as the ELK stack (Elasticsearch, Logstash, Kibana)
  • Experienced with CI/CD and Infrastructure-as-Code technologies
  • Jenkins, Artifactory
  • Terraform and Ansible
  • Docker and Kubernetes
  • Experienced with security, identity and access management technologies
  • Experienced in building, deploying and running infrastructure
  • Cloud experience, ideally with AWS

DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

Referral Scheme: If this role isn’t for you then perhaps you could recommend a friend or colleague to Haybrook IT. If we go on to place that person in a permanent or temporary capacity then you could be rewarded with £500!! Please see our website for terms and conditions.

Haybrook IT Resourcing is Oxford’s leading IT recruitment agency. With exclusive access to some of the region’s most successful companies, send in your CV today to secure your next IT position.

Haybrook IT Resourcing Ltd acts as an employment agency and an employment business.

We value diversity and always appoint on merit.

DevOps Engineer

3 days ago
Haybrook IT Resourcing - London, UK
DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

A DevOps Engineer is required for a 2-month rolling contract. The DevOps Engineer will join a growing engineering client who works closely with customers to provide a secure batch and real-time analytical platform, using Data Science, Machine Learning and Advanced Mathematics to create entity-matching solutions for security applications. The DevOps Engineer will work as part of a team to build automated deployments, manage configuration management technology, and deploy new models. The DevOps Engineer will have strong knowledge of Big Data and experience with Big Data technologies such as Hadoop, Kafka and Hortonworks. You will have strong Cloud experience and a working knowledge of Docker and Kubernetes.

Responsibilities of the DevOps Engineer:
  • Build, test and deploy cloud-based infrastructure and applications
  • Code test automation frameworks
  • Perform script maintenance and updates
  • Evaluate existing applications and platforms and provide recommendations for improvement
  • Participate in developing contingency plans
  • Provide technical guidance and system process expertise
  • Document functions and changes to new or modified modules, and test activities/results

Skills and experience of the DevOps Engineer:
  • Experience developing backend components and handling DevOps functions for complex systems
  • Open-source and commercial software within a cloud-centric environment
  • Hadoop (ideally Hortonworks)
  • Kafka
  • Experienced in building and operating logging technologies such as the ELK stack (Elasticsearch, Logstash, Kibana)
  • Experienced with CI/CD and Infrastructure-as-Code technologies
  • Jenkins, Artifactory
  • Terraform and Ansible
  • Docker and Kubernetes
  • Experienced with security, identity and access management technologies
  • Experienced in building, deploying and running infrastructure
  • Cloud experience, ideally with AWS


DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

Referral Scheme: If this role isn’t for you then perhaps you could recommend a friend or colleague to Haybrook IT. If we go on to place that person in a permanent or temporary capacity then you could be rewarded with £500!! Please see our website for terms and conditions.

Haybrook IT Resourcing is Oxford’s leading IT recruitment agency. With exclusive access to some of the region’s most successful companies, send in your CV today to secure your next IT position.

Haybrook IT Resourcing Ltd acts as an employment agency and an employment business.

We value diversity and always appoint on merit.

DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Kafka

3 days ago
Haybrook IT Resourcing Ltd - Croydon, UK

DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

A DevOps Engineer is required for a 2-month rolling contract. The DevOps Engineer will join a growing engineering client who works closely with customers to provide a secure batch and real-time analytical platform, using Data Science, Machine Learning and Advanced Mathematics to create entity-matching solutions for security applications. The DevOps Engineer will work as part of a team to build automated deployments, manage configuration management technology, and deploy new models. The DevOps Engineer will have strong knowledge of Big Data and experience with Big Data technologies such as Hadoop, Kafka and Hortonworks. You will have strong Cloud experience and a working knowledge of Docker and Kubernetes.

Responsibilities of the DevOps Engineer:

  • Build, test and deploy cloud-based infrastructure and applications
  • Code test automation frameworks
  • Perform script maintenance and updates
  • Evaluate existing applications and platforms and provide recommendations for improvement
  • Participate in developing contingency plans
  • Provide technical guidance and system process expertise
  • Document functions and changes to new or modified modules, and test activities/results

Skills and experience of the DevOps Engineer:

  • Experience developing backend components and handling DevOps functions for complex systems
  • Open-source and commercial software within a cloud-centric environment
  • Hadoop (ideally Hortonworks)
  • Kafka
  • Experienced in building and operating logging technologies such as the ELK stack (Elasticsearch, Logstash, Kibana)
  • Experienced with CI/CD and Infrastructure-as-Code technologies
  • Jenkins, Artifactory
  • Terraform and Ansible
  • Docker and Kubernetes
  • Experienced with security, identity and access management technologies
  • Experienced in building, deploying and running infrastructure
  • Cloud experience, ideally with AWS

DevOps Engineer, Big Data, AWS, Hadoop, Docker, Ansible, Terraform, Kafka G1/1074

Referral Scheme: If this role isn’t for you then perhaps you could recommend a friend or colleague to Haybrook IT. If we go on to place that person in a permanent or temporary capacity then you could be rewarded with £500!! Please see our website for terms and conditions.

Haybrook IT Resourcing is Oxford’s leading IT recruitment agency. With exclusive access to some of the region’s most successful companies, send in your CV today to secure your next IT position.

Haybrook IT Resourcing Ltd acts as an employment agency and an employment business.

We value diversity and always appoint on merit.

Senior DevOps Engineer - £550-£650

3 days ago
Opus Recruitment Solutions Ltd - London, UK
Senior DevOps Engineer – Central London – AWS, Kafka, Kubernetes 

I have partnered with a leading utilities company looking to bring on a Senior DevOps Engineer to work on automated code builds and deployment, and infrastructure as code. You will be setting up AWS infrastructure and Kubernetes from scratch.

I’m looking for an experienced DevOps engineer who has solid experience across: 

- AWS
- Ansible / Packer
- Kafka
- CI/CD – Jenkins / GoCD  
- Terraform 
- Docker / Kubernetes 

Senior DevOps Engineer – Central London – AWS, Kafka, Kubernetes 

If you are interested in the role, please apply with your most up-to-date CV.

Senior Big Data Engineer

4 days ago
Harnham - Cambridge, UK

Senior Big Data Engineer
Cambridge
6-month Contract
£600 per day

As a Senior Big Data Engineer, you will be using cutting edge Big Data technologies to deliver value for a leading retailer.

THE COMPANY:
This company is a well-known retailer with a huge global presence and customer base. Because of this wide customer base, they have a strong focus on how Big Data tools can help generate value for their business. You will be situated in an agile team helping to deliver big data projects based on business requirements.

THE ROLE:
As a Senior Big Data Engineer, you will be using streaming technologies such as Kafka and Spark for real-time processing. It is essential that you have a good understanding of the Hadoop ecosystem (Storm, HBase, Hive and Impala), as you will be working in a Hadoop environment. As a Senior Data Engineer, you will be responsible for mentoring team members in delivering best practices, and you will be involved in developing and maintaining critical data systems within the company.
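
As a purely illustrative aside (not part of the listing), here is a minimal PySpark Structured Streaming job of the kind the role describes: reading from Kafka and appending Parquet files. The broker, topic, schema and paths are hypothetical, and the role itself leans towards Scala rather than Python.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Parquet (illustrative).
# Requires the Kafka connector, e.g.
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark version>
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

schema = StructType([
    StructField("sku", StringType()),
    StructField("price", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")    # hypothetical broker
    .option("subscribe", "sales-events")                  # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/sales_events")                 # hypothetical path
    .option("checkpointLocation", "hdfs:///checkpoints/sales_events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```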

YOUR SKILLS AND EXPERIENCE:
The ideal Senior Big Data Engineer will have:

  • Worked with streaming technologies (Spark/Kafka)
  • Strong coding and design skills in Scala
  • Exposure to Hadoop, Hive, HBase, Storm
  • Previous experience within the software industry
  • Previous experience communicating with stakeholders

HOW TO APPLY:
Please register your interest by sending your CV via the Apply link on this page.

Data Engineer - Contract

4 days ago
Burns Sheehan - London, UK
Data Engineer
£450 - £500 per day
6 months
SQL - Azure
London
This is an exciting opportunity to join a cross-functional data team who have seen tremendous growth over the past 12 months. You'll be heavily involved in the re-platforming of their data architecture onto Azure and will utilise Microsoft's range of cloud technology.
The team is made up of close to twenty people consisting of Data Engineers, DBAs, and Data Scientists and they've created a culture which focuses on learning with regular demos and workshops being held by various teams.
You'll be responsible for working alongside engineers to develop greenfield microservices on Azure whilst building ETL and real-time data pipelines that meet the needs of a fast-growing business.
It's important you've got experience working in a high transactional environment and have an extensive understanding of SQL Server.

Technology Stack:
MS SQL Server
NoSQL - CosmosDB
HDFS - Azure Data Lake
Azure Databricks
C#, .Net
Kafka, RabbitMQ, Event Hubs
PowerBI, Tableau
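For context only (this sketch is not part of the listing): Azure Event Hubs exposes a Kafka-compatible endpoint, so one plausible way to feed the stack above is with a plain Kafka producer. The namespace, event hub name and connection string below are hypothetical placeholders.

```python
# Minimal sketch: publishing to Azure Event Hubs over its Kafka endpoint (illustrative).
import json

from kafka import KafkaProducer  # pip install kafka-python

NAMESPACE = "example-namespace"                                      # hypothetical namespace
CONNECTION_STRING = "Endpoint=sb://example.../;SharedAccessKey=..."  # placeholder, not a real key

producer = KafkaProducer(
    bootstrap_servers=f"{NAMESPACE}.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",          # literal value used by Event Hubs
    sasl_plain_password=CONNECTION_STRING,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("orders", {"order_id": 42, "status": "created"})  # "orders" = hypothetical event hub
producer.flush()
```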
This is a great opportunity for an experienced Data Engineer to join an incredibly bright team at a crucial time.
If you think you have what it takes, please send your CV now for more information on both the role and company!

Senior Big Data Engineer

4 days ago
Harnham - Cambridge, UK

Senior Big Data Engineer
Cambridge
6-month Contract
£600 per day

As a Senior Big Data Engineer, you will be using cutting edge Big Data technologies to deliver value for a leading retailer.

THE COMPANY:
This company is a well-known retailer with a huge global presence and customer base. Because of this wide customer base, they have a strong focus on how Big Data tools can help generate value for their business. You will be situated in an agile team helping to deliver big data projects based on business requirements.

THE ROLE:
As a Senior Big Data Engineer, you will be using streaming technologies such as Kafka and Spark for real-time processing. It is essential that you have a good understanding of the Hadoop ecosystem (Storm, HBase, Hive and Impala), as you will be working in a Hadoop environment. As a Senior Data Engineer, you will be responsible for mentoring team members in delivering best practices, and you will be involved in developing and maintaining critical data systems within the company.

YOUR SKILLS AND EXPERIENCE:
The ideal Senior Big Data Engineer will have:

  • Worked with streaming technologies (Spark/Kafka)
  • Strong coding and design skills in Scala
  • Exposure to Hadoop, Hive, HBase, Storm
  • Previous experience within the software industry
  • Previous experience communicating with stakeholders

HOW TO APPLY:
Please register your interest by sending your CV via the Apply link on this page.

Big Data Engineer

4 days ago
Cornwaliis Elt - London, UK

Data Engineer – Hadoop, AWS, Amazon Web Services, Big Data, Spark, Storm, Investment Manager, Asset Manager, Finance

A world-leading investment manager based in the City is looking to hire a small team of Data Engineers to build a brand-new AWS-based data lake. The team will work in conjunction with the firm's digital department to provide a new big data analytics service for its retail investment platform.

The role is to establish Big Data platforms on AWS infrastructure to provide analytics services. As such, you will have experience across a number of Big Data technologies, e.g. Hadoop, Kafka and Storm.

Key Requirements:
  • Strong knowledge of and experience with the AWS stack (Kinesis, Lambda, EMR, S3, DynamoDB, RDS) and Attunity
  • Experience in data modelling and business intelligence
  • Good communication skills

Key Responsibilities:
  • Design, develop and deliver scalable and automated data products
  • Create frameworks for ingestion, both real-time and batch (a minimal illustrative sketch follows this list)
  • Create frameworks and components for Data
  • Data Modeling for new Data Products
  • Create Data Catalogs for all entities onboarded and for all entities exposed for distribution
  • Create a reconciliation process for all Data Pipelines built
  • Leverage IAM roles & policies for service authentication
  • Documentation of the functional logic of services
  • Testing (functional & non-functional) of the pipeline
  • Deploy data products to the cloud
  • Evaluate & build different frameworks for all building blocks
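
The sketch referenced in the ingestion bullet above is purely illustrative and not taken from the listing. It shows one plausible shape for the real-time side of ingestion on the AWS stack listed in the requirements: a Lambda handler triggered by a Kinesis stream that lands decoded records in S3. The bucket name and key prefix are hypothetical.

```python
# Minimal sketch: Kinesis-triggered Lambda landing records in S3 (illustrative).
import base64
import json
import time

import boto3

s3 = boto3.client("s3")
BUCKET = "example-analytics-lake"   # hypothetical bucket


def handler(event, context):
    """Entry point for a Lambda function with a Kinesis event source."""
    records = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])  # Kinesis data is base64-encoded
        records.append(json.loads(payload))

    if records:
        key = f"raw/trades/{int(time.time() * 1000)}.jsonl"     # hypothetical prefix
        body = "\n".join(json.dumps(r) for r in records)
        s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))

    return {"ingested": len(records)}
```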

This is an exciting opportunity for a skilled Data Engineer, particularly those who are looking to move into the financial markets industry, to join a fast-growing Investment Management institution.

Senior Big Data Engineer

4 days ago
Cornwaliis Elt - London, UK

Senior Data Engineer – Hadoop, AWS, Amazon Web Services, Big Data, Spark, Storm, Investment Manager, Asset Manager, Finance

A world-leading investment manager based in the City is looking to hire a small team of Data Engineers to build a brand-new AWS-based data lake. The team will work in conjunction with the firm's digital department to provide a new big data analytics service for its retail investment platform.

The role is to establish Big Data platforms on AWS infrastructure to provide analytics services. As such, you will have experience across a number of Big Data technologies, e.g. Hadoop, Kafka and Storm.


Key Requirements:
  • Strong knowledge of and experience with the AWS stack (Kinesis, Lambda, EMR, S3, DynamoDB, RDS) and Attunity
  • Experience in data modelling and business intelligence
  • Good communication skills

Key Responsibilities:
  • Design, develop and deliver scalable and automated data products
  • Create frameworks for ingestion, both real-time and batch
  • Create frameworks and components for Data
  • Data Modeling for new Data Products
  • Create Data Catalogs for all entities onboarded and for all entities exposed for distribution
  • Create a reconciliation process for all Data Pipelines built
  • Leverage IAM roles & policies for service authentication
  • Documentation of the functional logic of services
  • Testing (functional & non-functional) of the pipeline
  • Deploy data products to the cloud
  • Evaluate & build different frameworks for all building blocks

This is an exciting opportunity for a skilled Senior Data Engineer, particularly those who are looking to move into the financial markets industry, to join a fast-growing Investment Management institution.

Contract Java Developer - £425pd

4 days ago
Deerfoot IT Resources Ltd - Ascot, UK
Contract Java Developer
circa 2 days near Ascot, Berkshire, 2 days from home, + 1 day near Fareham, Hampshire
Micro-service web-based apps
6 months +
up to £425pd

We have a close relationship with this growing software house, forged on past personal friendship and commercial delivery. A number of roles presently exist on a client project (a national organisation) to push this well-known organisation's technology capability further.

We are looking for a Java Developer to work on site at the national organisation, assisting with the development of enterprise-class, microservices-based web apps (AWS Cloud). This is interesting development work that will push the organisation's technology capability further.

Key skills / environment

Java development
Java 8
AWS Cloud services
CI - continuous integration
High availability, multi-threaded apps.
RESTful APIs
SQL
MongoDB
Docker
Ansible / Puppet / Chef / Vagrant
Hibernate
JMS messaging - ActiveMQ, RabbitMQ, Kafka
Agile - Scrum / Kanban
TDD - JUnit / Mockito
JavaScript

We are one of the UK's most established IT recruitment organisations. A stable partner for your freelance career. We are an approved partner to our client for these Java Developer roles. We never send your details anywhere without your email consent. We donate £1 to our partner charity The Born Free Foundation each time we send a CV to a hiring client. These roles are based in Hampshire

Data Engineer

5 days ago
Randstad Employment Bureau - Brighton, UK

Job Title: Data Engineer

Day Rate: £450 - £500

Contract Duration: 3 - 9 Months Rolling Contract

Location: Brighton, Hove

Randstad have an exciting new role with a well-known energy supply company. This organisation is the UK's largest producer of low-carbon electricity, the biggest supplier of electricity by volume in Great Britain, and the largest supplier to British businesses. They're now looking to hire an experienced Data Engineer to join their growing team!

What We Are Looking For:

We are looking for an experienced Data Engineer to work in the smarter-living area of the lab. You will be joining the team to develop solutions that deliver scalable applications on the platform, and to maintain the existing code base, databases and APIs.

Role Responsibilities:

  • Evaluate different technical options for developments which will contribute significantly to business success, balancing risk with opportunity.
  • Actively participate in Sprint planning sessions, providing accurate estimates for each task.
  • Analyse builds proactively to identify problems and trends, propose technical solutions, and recommend changes to optimise system performance and usability.
  • Work with stakeholders including Product, Data, Design, Architecture and Business teams to assist with data-related technical issues and support their data infrastructure needs.

Essential Skills:

  • Experience with big data tools such as Hadoop, Spark and Kafka.
  • Experience with data pipeline tools such as Azkaban, Luigi, Airflow or Cask (a minimal Airflow sketch follows this list).
  • Experience with Java, Python and Scala.
  • Experience with AWS including EC2, EMR, Athena, S3, RDS and Redshift, as well as an understanding of VPCs, CloudFormation, SQS and other Amazon PaaS tools.
  • Experience with SQL and NoSQL databases including Postgres, Aurora/MySQL and DynamoDB.
  • Familiarity with Agile and Scrum delivery.
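
The Airflow sketch referenced in the list above is purely illustrative and not part of the listing; the DAG id, schedule and task bodies are hypothetical placeholders.

```python
# Minimal sketch: a two-step Airflow DAG (illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    print("pull readings from the source API")            # placeholder extract step


def load(**_):
    print("write transformed readings to the warehouse")  # placeholder load step


with DAG(
    dag_id="smart_meter_daily",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```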

We would be delighted to hear from you if you have these skills and experience!



Randstad Business Support is acting as an Employment Business in relation to this vacancy.