
9 remote Kafka contracts

Scala Developer (Java, API, Microservices, Kafka, Cloud, NOSQL)

25 days ago
$55 - $70/hour (Estimated) | Remote | Spiceorb
Title: Scala Developer (Java, API, Microservices, Kafka, Cloud, NOSQL)
Location: Bentonville, Arkansas
No. of Positions: 8

REQUIREMENTS:
  • Minimum three years of hands-on project experience with Scala & Kafka
  • Must have worked with the Play / Akka frameworks
  • Expertise in toolkits (Akka, sbt) and parsers (Lift JSON)
  • Server-side experience in JDBC, JSP, SAX/DOM, Web Services, SOAP, WSDL, UDDI, JAXB
  • Must be able to write complex SQL statements
  • Should be able to demonstrate experience in:
    - SCM: Git, SVN, ClearCase
    - Good understanding of the JVM
    - Build: NAnt, sbt, FMake, NuGet, gulp
    - Application Containers: Apache, Tomcat, Jetty
    - Web: WebLogic 5.x/6.x, WebSphere 3.5/4, Play Framework, Spray
  • Should have worked in an Azure, Google Cloud, or AWS environment
  • Experience with a NoSQL solution such as MongoDB or Cassandra is a plus

We are looking for someone who must have the below experience in a recent project:

  • May have started their career as a Java Developer, but recent experience must be in Scala with API microservices development (1 year of experience is a must)
  • Scala with API Microservices Development
  • Complete Backend with Scala
  • Advanced functional programming in Scala
  • Scala Play / Akka
  • Kafka as a message service (see the sketch after this list)
  • Cloud experience
  • NoSQL Database
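
As a rough illustration of the "Kafka as a message service" requirement, here is a minimal producer sketch in Scala using the standard Kafka Java client (the broker address, topic name, and payload are illustrative assumptions, not part of the posting):

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

    object OrderEventProducer {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092") // assumed local broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

        val producer = new KafkaProducer[String, String](props)
        // "orders" is a hypothetical topic name
        producer.send(new ProducerRecord[String, String]("orders", "order-1", """{"sku":"A1","qty":2}"""))
        producer.close()
      }
    }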

This position is based in Bentonville, Arkansas, though remote working options can be discussed if required.

DataOps Tech Lead

1 month ago
£650 - £700/day | Remote | Harnham

Data Ops Tech Lead
Remote/London
6-month Contract
£700 per day

As a Tech Lead, you will be building the infrastructure of an AWS platform for a niche IoT company. You will be working alongside the Data Scientists.

THE COMPANY:
This company is a niche start-up financially backed by a large finance company. Their mission is to support people and businesses by tracking bad content on social media and removing it from feeds. They are established in the US and are now targeting the UK. They have a team of Data Scientists working with real-time data, which they supply to the Software team to build their products.

THE ROLE:
You will need expertise with AWS, as you will be building the infrastructure in which the Engineers store their data. You will introduce Kubernetes for deployment and Ansible/Terraform for automation, and you will also build the CI/CD pipelines; experience with Kafka is desirable. This is a fantastic opportunity to set up the cloud infrastructure for a niche start-up with access to Big Data.

YOUR SKILLS AND EXPERIENCE:
The ideal Tech Lead will have:

  • Built a greenfield infrastructure using AWS, from scratch
  • Worked in an Agile environment building CI/CD pipelines
  • Introduced Kubernetes for deployment
  • Implemented infrastructure-as-code methodologies

HOW TO APPLY:
Please submit your CV to Henry Rodrigues at Harnham via the Apply Now button.
Please note that our client is currently running a fully remote interview process and is able to onboard and hire remotely as well.

Java/J2EE Developer

9 days ago
Remote | CalypsoWay

Job Title: Java/J2EE Developer

Location: Austin, TX

  • Strong programming expertise in Java, multi-threading, non-blocking IO, Spring, Kafka, etc. (a Kafka consumer sketch follows this list)
  • Professional experience working with Java Collections, multi-threading, Spring Boot 2, Spring Security, and microservices
  • Excellent hands-on programming in Java 8
  • Hands-on experience publishing and consuming RESTful services (Swagger, WADL, etc.)
  • Should be able to performance-tune application code, review team members' code, and provide suggestions to developers
  • Has implemented solutions that handle high availability and concurrency with low-latency requirements
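
Since the role centers on Java and Kafka, here is a minimal consumer sketch (written in Scala, like the other sketches on this page, against the same Kafka client API used from Java; the group id, topic, and broker are hypothetical):

    import java.time.Duration
    import java.util.Properties
    import scala.jdk.CollectionConverters._
    import org.apache.kafka.clients.consumer.KafkaConsumer

    object PaymentConsumer {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092") // assumed broker
        props.put("group.id", "payments-service")        // hypothetical consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(java.util.Collections.singletonList("payments")) // hypothetical topic
        while (true) {
          val records = consumer.poll(Duration.ofMillis(500))
          records.asScala.foreach(r => println(s"${r.key} -> ${r.value}"))
        }
      }
    }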

Thanks. Please send your resume.

Job Type: Contract

Work Location:

  • Fully Remote

Schedule:

  • Monday to Friday

Java Developer

3 days ago
$45 - $55/hour | Remote | eTek IT Service | Savvysol

Contract on W2

Key details:

-Targeting candidates with 3-7 years’ experience and a strong Java background

-Tech Finance team is moving from retail accounting practices to cost accounting practices. This resource will be involved in taking data from current sources, building applications using Java, and transferring it to accounting/cost-based systems

-Need candidates with recent exposure to AWS/Kubernetes

-Prefer candidates with Gitlab experience

-Will report directly to Ashley Hansen and will work alongside team of 6 Devs

-Resource will sit remote to start and potentially rejoin the team onsite in the Seattle office (open to fully remote candidates who would consider Seattle relocation down the road)

Job Description

Software Engineers at Nordstrom apply their skills and talents to build scalable and sustainable solutions, utilizing best engineering practices and the latest Cloud technologies in an agile, team-oriented and collaborative environment.

We are seeking a Software Engineer to join our Finance Technology Delivery team as we intensify the pace of innovation and support future growth through technology solutions. The Software Engineer will be responsible for building mission-critical ETL pipelines that integrate source data with financial SaaS and on-premise applications.

A day in the life…

  • Design and collaborate with local and partner teams
  • Define and build data sources, data mappings, and integration points across a diverse application portfolio
  • Demonstrate thorough knowledge of information technology concepts, issues, trends, and best practices as they relate to Cloud technologies and system integrations
  • Identify performance issues; apply knowledge of security coding practices and secure system fundamentals

Skills:

  • Experience building ETL pipelines in Cloud technologies (AWS or Google Cloud Platform)
  • Experience with publish/subscribe and event-based data flow methodologies
  • Experience with Java development, Git version control, and CI/CD (continuous integration/continuous deployment)
  • Good working knowledge of hybrid cloud system integrations
  • Good working knowledge of containerization technologies including Docker and Kubernetes
  • Good working knowledge of REST services, JSON, XML, and Kafka streams (see the sketch after this list)
  • Exposure to GitLab, Python, BigQuery, Cloud Dataflow, or Apache Beam
  • 2+ years of professional experience in practice area
  • Bachelor’s or Master’s degree in CS, Engineering or equivalent practical experience
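
To make the pub/sub and Kafka-streams bullets concrete, here is a minimal Kafka Streams sketch in Scala (the application id, topic names, and filter predicate are illustrative assumptions, not part of the posting):

    import java.util.Properties
    import org.apache.kafka.common.serialization.Serdes
    import org.apache.kafka.streams.{KafkaStreams, StreamsBuilder, StreamsConfig}

    object LedgerPipeline {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "ledger-etl")        // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, classOf[Serdes.StringSerde].getName)
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, classOf[Serdes.StringSerde].getName)

        // Subscribe to raw retail events, keep cost-relevant records, republish
        val builder = new StreamsBuilder()
        builder.stream[String, String]("retail-raw")            // hypothetical input topic
          .filter((_, value) => value.contains("\"cost\""))
          .to("cost-accounting")                                // hypothetical output topic

        new KafkaStreams(builder.build(), props).start()
      }
    }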

Job Types: Full-time, Contract

Salary: $45.00 to $55.00 /hour

Experience:

  • AWS or Google Cloud Platform: 3 years (Preferred)
  • Java: 4 years (Preferred)
  • Kubernetes or Docker: 2 years (Preferred)

Application Question:

  • Are you ok with W2?

Work Remotely:

  • Temporarily due to COVID-19

BigData Solutions Engineer

14 days ago
$62 - $68/hour | Remote | Omnipoint Services Inc

Our Fortune client is looking for a talented Solutions Engineer. This is one of our top clients, and we have been successful in building out entire teams for this organization. The role will be temp-to-permanent, 40 hours/week, paid at an hourly rate plus very highly subsidized benefits. It will start remote, but once Covid restrictions are lifted the goal is to have this person onsite in Hartford, CT.

  • 6+ years as a Hortonworks HDP Solution Architect, helping re-solution migration projects from HDP 2.6 to 3.1.
  • Thorough understanding of the HDP 2.6 and 3.1 platforms and the related tech stack.
  • Good documentation (Visio) and presentation (PPT) skills.
  • HDP 2.x and HDP 3.x

Deliverables:

  • Review the project's current solution, document and review the proposed solution with the involved groups, and help engineering teams implement it end to end with low-level technical recommendations and code review.
  • Document existing and new solution patterns.

Tools involved:

  • Apache Hadoop 3.1.1 (Hadoop File System)
  • Apache HBase 2.0.0 (Java APIs)
  • Apache Hive 3.1.0 (Hive Query Language)
  • Apache Kafka 1.1.1 (Java/Python/Spark streaming APIs; see the streaming sketch below)
  • Apache Phoenix 5.0.0 (Standard SQL, JDBC, ODBC)
  • Apache Pig 0.16.0
  • Apache Ranger 1.1.0
  • Apache Spark 2.3.1 (Java, Scala, Python)
  • Apache Sqoop 1.4.7
  • Apache Tez 0.9.1

Java-based web services APIs and Python clients.
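
As a rough sketch of how the Spark and Kafka pieces of this stack connect, here is a minimal Spark Structured Streaming job in Scala that reads from a Kafka topic (compatible with the Spark 2.3 line above; the broker, topic, and app name are hypothetical, and the job assumes the spark-sql-kafka connector is on the classpath):

    import org.apache.spark.sql.SparkSession

    object HdpKafkaStream {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hdp-kafka-sketch") // hypothetical app name
          .getOrCreate()

        // Read a Kafka topic as a streaming DataFrame
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092") // assumed broker
          .option("subscribe", "sensor-events")              // hypothetical topic
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

        // Echo records to the console; a real job would write to HDFS, HBase, etc.
        events.writeStream.format("console").start().awaitTermination()
      }
    }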

Job Types: Full-time, Contract

Pay: $62.00 - $68.00 per hour

Experience:

  • Apache Hive 3.1.0 (Hive Query Language): 4 years (Required)
  • Apache Kafka 1.1.1 (Java/Python/Spark streaming APIs): 4 years (Required)
  • Apache HBase 2.0.0 (Java APIs): 4 years (Required)
  • Apache Ranger 1.1.0: 4 years (Required)
  • Java-based web services APIs and Python clients: 2 years (Required)
  • Apache Spark 2.3.1 (Java, Scala, Python): 4 years (Required)
  • Hortonworks HDP Solution Architect: 8 years (Required)
  • Apache Pig 0.16.0: 4 years (Required)
  • Apache Hadoop 3.1.1 (Hadoop File System): 4 years (Required)
  • HDP 2.6 and 3.1 platforms and related tech stack: 5 years (Required)
  • Apache Phoenix 5.0.0 (Standard SQL, JDBC, ODBC): 4 years (Required)

Work Remotely:

  • No

Java Developer with Apache Camel

5 days ago
$70 - $75/hour | Remote | ANB Sourcing LLC

Sr. Java Developer with Apache Camel

3+ months contract.

100% REMOTE

Note: Visa-independent consultants only.

Strong hands-on experience with Java 8 (minimum 5 years' experience as a Java Developer)
Strong in Apache Camel (minimum 3 years' experience as a Camel developer)
Demonstrated experience in Agile development, application design, software development, and testing.
Preferred:
Experience with asynchronous integration design and implementation using AMQ and AMQ Streams (Kafka); a minimal route sketch follows below
Experience with RESTful API design and implementation.
Experience with relational DBs; knowledge of NoSQL DBs is a plus
Working experience with Continuous Integration and Continuous Delivery using tools like Maven and Jenkins
Working experience with a source code control system like GitHub
Experience writing applications for deployment in cloud environments, including OpenShift, is desirable.
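
For a concrete picture of the Camel-plus-Kafka combination, here is a minimal route sketch in Scala using Camel's Java DSL (the topic, broker, and log step are illustrative assumptions):

    import org.apache.camel.builder.RouteBuilder
    import org.apache.camel.impl.DefaultCamelContext

    object KafkaRouteApp {
      def main(args: Array[String]): Unit = {
        val context = new DefaultCamelContext()
        context.addRoutes(new RouteBuilder {
          override def configure(): Unit = {
            // Consume from a Kafka topic and log each message body
            from("kafka:orders?brokers=localhost:9092") // hypothetical topic and broker
              .log("received: ${body}")
          }
        })
        context.start()
        Thread.sleep(60000) // keep the JVM alive briefly for this sketch
        context.stop()
      }
    }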

Job Type: Contract

Pay: $70.00 - $75.00 per hour

Schedule:

  • Monday to Friday

Experience:

  • Apache: 3 years (Required)
  • Java: 8 years (Required)
  • Camel: 3 years (Required)

Work authorization:

  • United States (Required)

Contract Length:

  • 3 - 4 months

Contract Renewal:

  • Likely

Full Time Opportunity:

  • No

Work Location:

  • Fully Remote

Company's website:

  • www.anbsourcing.com

Work Remotely:

  • Yes

Scala Developer (Remote)

1 month ago
$60/hour | Remote | Xyant Services

We are looking for Java Developers with Scala experience.

Location: Bentonville, AR / Remote

- Candidate should have real, recent project experience in Scala.

- Should have at least two years' experience in Scala API development (not Hadoop)

- Scala with API Microservices Development (see the API sketch after the requirements)

Requirements:

  • Minimum three years of hands-on project experience with Scala & Kafka
  • Must have worked with the Play / Akka frameworks
  • Should have worked in an Azure, Google Cloud, or AWS environment
  • Five-plus years of Java development experience (Java 8 functional programming is a must)
  • Expertise in toolkits (Akka, sbt) and parsers (Lift JSON)
  • Server-side experience in JDBC, JSP, SAX/DOM, Web Services, SOAP, WSDL, UDDI, JAXB
  • Must be able to write complex SQL statements
  • Should be able to demonstrate experience in:

- SCM: Git, SVN, ClearCase

- Good understanding of the JVM

- Build: NAnt, sbt, FMake, NuGet, gulp

- Application Containers: Apache, Tomcat, Jetty

- Web: WebLogic 5.x/6.x, WebSphere 3.5/4, Play Framework, Spray

  • Experience with a NoSQL solution such as MongoDB or Cassandra is a plus
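
To sketch the "Scala with API Microservices Development" requirement, here is a minimal HTTP endpoint using Akka HTTP, one of the Akka-family toolkits this posting names (the route, port, and payload are illustrative assumptions; a Play controller would be an equally apt illustration):

    import akka.actor.ActorSystem
    import akka.http.scaladsl.Http
    import akka.http.scaladsl.server.Directives._

    object OrderApi {
      def main(args: Array[String]): Unit = {
        implicit val system: ActorSystem = ActorSystem("order-api") // hypothetical service name

        // GET /orders/{id} returns a stubbed JSON document
        val route = path("orders" / Segment) { id =>
          get {
            complete(s"""{"orderId":"$id","status":"OK"}""")
          }
        }

        Http().newServerAt("localhost", 8080).bind(route) // assumed local port
      }
    }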

Job Type: Contract

Salary: $60.00 /hour

Cloud/DevOps Engineer

17 days ago
Remote | Verstand AI

POSITION: GCP / Confluent DevOps Engineer

LOCATION: Remote / Washington, DC

POSITION HIGHLIGHTS:

Verstand AI (www.verstand.ai) is seeking Google Cloud Platform (GCP) DevOps Engineers with strong SQL expertise and proficiency in Python and Kafka (Confluent). The DevOps engineers will be instrumental in significant initiatives to transform all aspects of environment management and continuous integration and delivery for Verstand commercial clients. The work will lead to implementing best-practice approaches for enterprise data warehousing, business intelligence, and data wrangling/ELT/ETL. This individual will work closely with business stakeholders, software development, and support teams. Most importantly, Verstand AI's Cloud DevOps engineers will get an opportunity to work with cutting-edge technologies and be part of data teams that help clients with end-to-end data science programs.

KEY RESPONSIBILITIES:

  • Design and implement a containerization strategy that can be applied to Ops for a Google Cloud-based environment
  • Automate management and orchestration tasks.
  • Building CI/CD pipelines for Microservices
  • Conduct root cause analysis for container runtime problems
  • Author documentation and procedures for DevOps in a Google cloud-based environment.
  • Monitor, measure, and automate all things to ensure performance and availability goals are met or exceeded
  • Identify bottlenecks in development and deployment processes
  • Participate and potentially lead technical presentations on the work.
  • Understand the current systems, algorithms, and cloud-based HPC architecture
  • Instrument the infrastructure with frameworks that can be appropriately adopted for logging, monitoring, and alerting
  • Participate in team meetings, interface independently with SMEs, and interact with client staff

JOB REQUIREMENTS:

Minimum Experience, Skills and Education:

  • 6+ years of experience with cloud environments, distributed systems, system automation, and real-time platforms.
  • 5+ years of production experience with cloud technologies such as Google Cloud Platform (GCP), Azure, and Amazon Web Services (AWS)
  • 2+ years of design and maintenance expertise with system administration of Cloud infrastructure, including Amazon Web Services, Google Cloud Platform, and/or Microsoft Azure cloud services.
  • Experience with cloud databases.
  • Experience with batch and stream processing
  • Experience with managing large scale data processing systems
  • Experience with agile software development practices and drive to ship quickly
  • Experience leading change, taking initiative, and driving results
  • Effective communication skills and strong problem-solving skills
  • Proven ability and desire to mentor others in a team environment
  • Bachelor's degree from four-year College or university in Computer Science, Technology or related field

Experience That Sets You Apart:

  • Experience with the Google Cloud Platform
  • Experience with Apache Kafka and Confluent
  • Familiarity with Python
  • Experience with microservice platforms, API development, and containers.
  • Retail vertical production experience

Verstand AI is a fast-growing firm that believes in ongoing training and development for its staff. The firm's mission is to help both its commercial and public sector clients resolve data management challenges and move to delivering insight and benefits for stakeholders, customers and constituents.

Based out of Tysons Corner, VA, Verstand does business across the United States and is moving into Europe, Africa and Asia. If you're interested in working with us and have a desire to tackle challenging data problems, we welcome your interest and encourage you to apply.

Job Types: Full-time, Contract (permanent position opportunity)

Experience:

  • Google Cloud Platform: 2 years (Required)
  • DevOps: 5 years (Required)
  • Python: 2 years (Required)
  • Apache Kafka: 2 years (Required)
  • Cloud: 5 years (Required)

Work authorization:

  • United States (Required)

Contract Renewal:

  • Likely

Full Time Opportunity:

  • Yes

Additional Compensation:

  • Bonuses
  • Other forms

Work Location:

  • Fully Remote

Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Retirement plan
  • Paid time off
  • Professional development assistance

This Company Describes Its Culture as:

  • Team-oriented -- cooperative and collaborative
  • Outcome-oriented -- results-focused with strong performance culture
  • Innovative -- innovative and risk-taking

Schedule:

  • Monday to Friday

Company's website:

  • www.verstand.ai

Benefit Conditions:

  • Only full-time employees eligible

Work Remotely:

  • Yes

Data Engineer

1 month ago
Remote | Georgia IT Inc.

We are looking for strong Data Engineers, skilled in Hadoop, Scala, Spark, Kafka, Python, and AWS. I've included the job description below.
Here is what we are looking for:

Overall Responsibility:

  • Develop sustainable, data-driven solutions with current new-gen data technologies to meet the needs of our organization and business customers.
  • Apply domain-driven design practices to build out data applications; experience building conceptual and logical models.
  • Build out data consumption views and provision self-service reporting needs via demonstrated dimensional modeling skills.
  • Measure data quality and make improvements to data standards, helping application teams publish data in the correct format so it is easy to consume downstream.
  • Build Big Data applications using open-source frameworks like Apache Spark, Scala, and Kafka on AWS, and cloud-based data warehousing services such as Snowflake (a minimal Spark sketch follows this list).
  • Build pipelines to enable features to be provisioned for machine learning models; familiarity with data science model-building concepts as well as consuming data from a data lake.
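
As a small illustration of the pipeline-building responsibilities above, here is a minimal Spark batch job in Scala that reads raw JSON and writes partitioned Parquet (the S3 paths, column name, and filter are hypothetical):

    import org.apache.spark.sql.SparkSession

    object EventsToParquet {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("events-to-parquet") // hypothetical app name
          .getOrCreate()

        // Read raw JSON events, drop rows without a type, write Parquet partitioned by type
        val raw = spark.read.json("s3a://raw-bucket/events/") // hypothetical input path
        raw.filter(raw("eventType").isNotNull)
          .write
          .partitionBy("eventType")
          .parquet("s3a://curated-bucket/events/") // hypothetical output path
      }
    }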

Basic Qualifications:

  • At least 8 years of experience with the Software Development Life Cycle (SDLC)
  • At least 5 years of experience working on a big data platform
  • At least 3 years of experience working with unstructured datasets
  • At least 3 years of experience developing microservices: Python, Java, or Scala
  • At least 1 year of experience building data pipelines, CI/CD pipelines, and fit-for-purpose data stores
  • At least 1 year of experience in cloud technologies: AWS, Docker, Ansible, or Terraform
  • At least 1 year of Agile experience
  • At least 1 year of experience with a streaming data platform including Apache Kafka and Spark

Preferred Qualifications:

  • 5+ years of data modeling and data engineering skills
  • 3+ years of microservices architecture & RESTful web service frameworks
  • 3+ years of experience with JSON, Parquet, or Avro formats
  • 2+ years of experience creating data quality dashboards and establishing data standards
  • 2+ years of experience with RDS, NoSQL, or graph databases
  • 2+ years of experience working with AWS platforms, services, and component technologies, including S3, RDS and Amazon EMR

Job Type: Contract

Schedule:

  • Monday to Friday

Experience:

  • AWS: 1 year (Preferred)
  • Hadoop: 1 year (Required)
  • Spark: 1 year (Required)
  • Big Data: 1 year (Preferred)
  • Scala: 1 year (Preferred)
  • Data Engineering: 1 year (Required)

Contract Renewal:

  • Possible

Full Time Opportunity:

  • Yes

Work Location:

  • Fully Remote

Work Remotely:

  • Yes