Kafka contract jobs near you / remote

Kafka

IT - Applications Development Consultant III

3 days ago
$50 - $65/hour • Remote • Essani International

Title: Data Engineer

Location: Franklin, Tennessee 37067

Duration: 6+ months contract role

Mode of Interview: Webex/Skype

Cloud and Big Data related Projects

BIE- Analytics

What are the top 5-10 responsibilities for this position:

  • Designing and building production data pipelines from ingestion to consumption within a big data architecture, using Azure services, Python, Scala, or other custom programming
  • Providing guidance on modern data warehouse design and implementation for migration from on-premises to Azure
  • Performing detailed assessments of the current-state data platform and creating an appropriate transition path to cloud technologies
  • Developing and implementing Big Data platforms on a cloud platform with structured and unstructured data sources
  • Connecting and automating data sources, along with building visualizations
  • Configuring, connecting, and setting up the infrastructure

What software tools/skills are needed to perform these daily responsibilities?

  • Python, Scala, Azure, ELT, ETL
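
For context on the ETL skills this posting asks for, here is a minimal, stdlib-only Python sketch of the extract/transform/load steps the role describes. The table, columns, and sample data are invented purely for illustration; a production pipeline would pull from a real ingestion source (e.g. Azure Blob Storage) and load into a warehouse rather than an in-memory SQLite database.

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in practice this would come from an ingestion source.
RAW_CSV = """patient_id,visit_date,charge
1001,2024-01-05,250.00
1002,2024-01-06,
1001,2024-01-09,125.50
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows with missing charges, cast types."""
    return [
        (int(r["patient_id"]), r["visit_date"], float(r["charge"]))
        for r in rows
        if r["charge"]  # skip the row whose charge field is empty
    ]

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write cleaned records into the target table, return row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS visits (patient_id INT, visit_date TEXT, charge REAL)"
    )
    conn.executemany("INSERT INTO visits VALUES (?, ?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM visits").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
```

The same three stages apply whether the tooling is custom Python, Scala, or a managed Azure service; only the connectors change.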

What skills/attributes are a must have?

  • 3 years of experience with Data Warehousing and Big Data tools and technologies
  • Demonstrated expertise with object-oriented development languages (.NET, Java, etc.) preferred
  • 1+ years of experience with programming languages such as Python, Java, Scala
  • 2+ years of advanced experience with both relational and NoSQL databases
  • 3+ years of advanced experience in data modeling, data structures, and algorithms
  • Experience with cloud environments such as AWS or Azure
  • Advanced knowledge of Linux and shell programming

What skills/attributes are nice to have?

  • Strong analytical skills in problem solving, troubleshooting, and issue resolution
  • Ability to communicate effectively (oral and written) across multiple teams, facilitate meetings, and coordinate activities
  • Experience with Kafka; understanding of CI/CD tools and technologies
  • Experience in data science and machine learning is a plus
  • Healthcare industry experience
  • Where is the work to be performed? (Please list the preferred UHG facility; if other, please specify, i.e. remote work, rural, etc.)

Job Type: Contract

Salary: $50.00 to $65.00 /hour

Experience:

  • Python, Java, Scala: 1 year (Preferred)
  • .NET: 1 year (Preferred)
  • AWS: 1 year (Preferred)
  • Scala: 1 year (Preferred)
  • Machine Learning: 1 year (Preferred)
  • Cloud environments such as AWS or Azure: 1 year (Preferred)
  • Relational and NoSQL Databases: 2 years (Preferred)
  • Data Modeling, Data Structures, and Algorithms: 3 years (Preferred)
  • Java: 1 year (Preferred)
  • Data Sciences & Machine Learning: 1 year (Preferred)
  • Healthcare industry: 1 year (Preferred)
  • Kafka; understanding of CI/CD tools and technologies: 1 year (Preferred)

Work Location:

  • One location

Kafka Data Engineer

4 days ago
£600 - £700/day • Remote • Harnham

Kafka Data Engineer
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Kafka Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka, and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Kafka Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Kafka Data Engineer you will be coding primarily in Scala with some Java, and will be working in on-premise environments as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Kafka Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.
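
The Kafka fundamentals this role centers on — topics split into partitions, keyed records landing in a fixed partition, and consumers tracking their own offsets — can be illustrated with a toy in-memory model. This is a Python sketch of the semantics only, not the real Kafka client API (which the role would use from Scala/Java); all names here are invented.

```python
class MiniTopic:
    """A toy model of a Kafka topic: ordered, append-only partitions.

    Real Kafka distributes partitions across brokers and persists them;
    this is purely to illustrate partition/offset semantics.
    """

    def __init__(self, name: str, num_partitions: int = 3):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key: str, value: str) -> tuple[int, int]:
        """Append a record; keyed records always land in the same partition."""
        p = hash(key) % len(self.partitions)  # real Kafka hashes the key bytes (murmur2)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition: int, offset: int) -> list[tuple[str, str]]:
        """Read everything at or after `offset`; consumers track their own offsets."""
        return self.partitions[partition][offset:]

topic = MiniTopic("vehicle-telemetry")
p, off = topic.produce("vehicle-42", "speed=61")
topic.produce("vehicle-42", "speed=63")
# Same key -> same partition, so per-key ordering is preserved:
records = topic.consume(p, off)
```

The key takeaway for the role: ordering in Kafka is guaranteed only within a partition, which is why the choice of record key is a core design decision when introducing Kafka.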

YOUR SKILLS AND EXPERIENCE:

The successful Kafka Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.


Big Data Architect

6 days ago
£650 - £700/day • Remote • Harnham

Big Data Architect
£650-£700 per day
Initial 3 month contract
London/ Sweden

As a Big Data Architect you will be helping to create the Kafka architecture and outline the strategy for migration to the cloud!

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Big Data Architect you will be assisting a Scandinavian client in helping them introduce Kafka, and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside Data Engineers.

THE ROLE:

As a Big Data Architect, your main responsibility will be creating the Kafka architecture from design to implementation. Therefore it is imperative that you have extensive experience with Kafka for large implementations, ideally Confluent Kafka. As a Big Data Architect it is essential that you have a good understanding of technologies such as Spark and Hadoop, as you will be helping to implement these. You will be working in both on-premise and cloud environments, and so it is valuable if you have worked in either AWS or Azure as a platform. Though you will be heavily involved in the planning and writing of roadmaps, you must be prepared to be hands-on, and therefore previous experience programming in Scala/Java is valuable. As you will be working for a consultancy, it is essential that you are confident speaking with non-technical people, as this role will be very client facing.

YOUR SKILLS AND EXPERIENCE:

The successful Big Data Architect will have the following skills and experience:

  • Extensive experience implementing strategies using Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • Experience speaking to stakeholders
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Data Engineer

6 days ago
£600 - £700/day • Remote • Harnham

Data Engineer
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka, and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Data Engineer you will be coding primarily in Scala with some Java, and will be working in on-premise environments as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.

YOUR SKILLS AND EXPERIENCE:

The successful Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

AWS Engineer / Machine learning

8 days ago
$65 - $75/hour (Estimated) • Herndon, VA • Inteletech Global, Inc

AWS Engineer

Herndon, VA

USC or GC only

Minimum Requirements:

  • Experience working with Amazon Machine Learning on AWS (ideally with the AWS Certified Machine Learning Specialty certification)
  • General knowledge of ML models, Bayes networks, neural networks, or ontologies
  • Strong data science foundation as well as hands-on database development experience
  • Experience coding in either Java or Python
  • General knowledge of AWS environment configuration and administration
  • Business analytical skills: ability to apply business logic to design and implement solutions on large data sets
  • Excellent organizational and communication skills (oral and written)

Required Function 2:

Other Desired Qualifications:

  • Bachelor's degree in Information Systems or Computer Science
  • Minimum of 5 years of general IT experience
  • Knowledge of Kafka Streams/Confluent or similar ETL solutions

Job Type: Contract

Experience:

  • AWS: 5 years (Required)
  • Machine Learning: 4 years (Required)
  • Kafka: 1 year (Preferred)
  • Java: 3 years (Preferred)
  • Python: 2 years (Preferred)

Work Location:

  • One location

Contract Length:

  • 1 year

Contract Renewal:

  • Likely

Software Engineer

9 days ago
$55 - $70/hour (Estimated) • Herndon, VA 20170 • SV Professionals LLC

Title : Software Engineer

Location: Milpitas, CA

Client : TATAELXSI

Description :

5 years of experience is mandatory.

Candidate must have minimum requirements outlined:

1. Must have 4+ years of server-side Java coding.

2. Proficiency in Spring/Spring Boot and multithreading.

3. Must have real/industry project experience with Maven, Git, CI/CD pipelines, and REST APIs.

4. BSCS/MSCS or equivalent is required.

Desired Skills:

Experience in Agile practice.

Experience with microservices development.

Experience in Postgres, AWS S3, AWS ElastiCache for Redis, Apache Cassandra, and/or Apache Kafka is a plus.

Strong knowledge of the JVM.

Job Type: Contract

Experience:

  • Java: 7 years (Required)

Backend Java Developer

9 days ago
Reston, VA • Object Technologies Inc

I’m looking for backend experience with microservices/Java, Couchbase, Oracle, Kafka, Docker, OpenShift, and the ability to articulate concepts well.

Order processing experience and/or telecom experience would be a plus.

Job Type: Contract

Experience:

  • Oracle: 3 years (Preferred)
  • Java: 6 years (Required)
  • microservices: 3 years (Required)

Location:

  • Reston, VA (Required)

Sr. Splunk Developer

16 days ago
$55 - $70/hour (Estimated) • Reston, VA • MW Partners

Job Title: Sr. Splunk Developer with AWS Exp

Location: Reston, VA

Duration: 12+ Months

Job Description:

  • Overall 8+ years of IT experience with knowledge in application design, interaction within application tiers (web, database, application servers), messaging (EMS, Kafka, RabbitMQ)
  • Expertise in Splunk Searching and Reporting modules, Knowledge Objects, Administration, Dashboards, Clustering and Forwarder Management.
  • Experience with log parsing, complex Splunk searches, including external table lookups.
  • Generate or enhance Splunk Dashboards, Reports, and Alerts.
  • Create Dashboards, Visualizations, Statistical reports, scheduled searches, alerts and knowledge objects.
  • Designing and maintaining production-quality custom Splunk dashboards using JavaScript, CSS, and advanced HTML
  • Splunk search construction, with the ability to create well-structured search queries that minimize performance impact
  • Experience with Splunk Apps for interfacing with infrastructure and tools (DBConnect, Jenkins App, etc.)
  • Experience in automating Splunk deployments and orchestration within an AWS Cloud environment; experience with the Splunk on AWS app is a plus
  • Good working experience in AWS; certification is a plus
  • Hands-on experience in independently creating AWS resource utilization and usage monitoring dashboards
  • Good understanding of how to ingest data from relevant AWS data sources: CloudWatch, CloudTrail, S3
  • Mid-level Unix admin skills: resource utilization, process monitoring, file system management
  • Splunk certification is a plus: User, Power User, Advanced Dashboarding
  • Advanced shell scripting knowledge (csh, Bash), Perl, Python

Job Type: Contract

Experience:

  • AWS Cloud: 3 years (Required)
  • AWS: 4 years (Required)
  • Administrator: 4 years (Required)
  • Splunk Searching: 4 years (Required)
  • Splunk: 3 years (Required)

Telematics Platform Engineer – Java, Go, and Kafka

16 days ago
Remote • Syrinx
We are adding to the team of the world’s largest shared-mobility company. We are looking for capable, passionate people with great ideas to join us in building a platform responsible for processing the data from over 600,000 access points. Our client's work environment is supportive, diverse, and fun.
This is a global company that fully supports partial remote work for this role. Occasional travel to one or more of these offices may be requested.
Sponsorship is not available (no C2C) for these roles. Positions can be contract, contract-to-hire, or full-time.
Technical Requirements

  • Knowledge of event-based microservices architecture patterns
  • Reasoned API design opinions
  • Comfort level integrating with evolving external system dependencies
  • Interest in IoT platform communication and centralized device management
  • Comfortable working with relational databases, particularly Postgres
  • Some experience with:

o Kafka or a similar distributed messaging service, container runtimes, AWS Cloud
o K8s (self-hosted or a managed flavor), functions as a service (e.g. Lambda)

  • Senior-level skills in at least one of the following languages:

o Go, Java, Python

  • Any experience with the following technologies and patterns would be nice, but not required:

o Couchbase, Bosh/CloudFoundry, Concourse, Vault
o Façade design pattern
o Event sourcing design pattern
o Strangler design pattern
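
The event sourcing design pattern listed above is worth a brief sketch: state is never stored directly, only an append-only log of immutable events (in production, typically a Kafka topic), and the current state is rebuilt by replaying them through a pure reducer. A minimal Python illustration with invented event and vehicle names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """An immutable fact; in production, a record published to a Kafka topic."""
    kind: str        # "unlocked" | "ride_started" | "ride_ended"
    vehicle_id: str

def apply_event(state: dict, event: Event) -> dict:
    """Pure reducer: current state + one event -> next state."""
    status = {
        "unlocked": "idle",
        "ride_started": "in_ride",
        "ride_ended": "idle",
    }[event.kind]
    return {**state, event.vehicle_id: status}

def replay(events: list) -> dict:
    """Rebuild the full current state from the event log alone."""
    state: dict = {}
    for e in events:
        state = apply_event(state, e)
    return state

log = [
    Event("unlocked", "scooter-7"),
    Event("ride_started", "scooter-7"),
    Event("unlocked", "scooter-9"),
    Event("ride_ended", "scooter-7"),
]
current = replay(log)
```

Because the log is the source of truth, new read models can be derived later by replaying the same events, which is what makes the pattern a natural fit alongside Kafka.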

Junior Big Data Engineer

17 days ago
Chantilly, VA • General Dynamics Mission Systems
Basic Qualifications
Bachelor's degree in software engineering or a related technical field is required (or equivalent experience). Agile experience preferred.

KEY SKILLS
  • Experience in designing, developing, building, and implementing Big Data solutions or developing automated solutions to solve complex problems; a thoughtful ability to solve problems could outweigh years of experience
  • Ability to identify and implement a data solution strategy
  • Demonstrates intellectual curiosity in exploring new technologies and finding creative ways to solve data management problems
  • Understanding of developing solutions with Python/JavaScript/Perl
  • Knowledge of Spark, Impala, Hadoop, Streamsets, Kafka, REST APIs
  • Experience in SQL-based and NoSQL-based technologies
  • DoD 8570.1M compliant IAT II (i.e. Security+, etc.)
Ability to obtain a TS/SCI with Polygraph security clearance required. Candidate must be able to obtain the Polygraph within a reasonable amount of time from date of hire. Applicants selected will be subject to a U.S. Government security investigation and must meet eligibility requirements for access to classified information. Due to the nature of work performed within our facilities, U.S. citizenship is required.
Responsibilities for this Position
General Dynamics Mission Systems (GDMS) is seeking motivated candidates to join our insider threat detection, systems integration team. Our mission oriented team is responsible for the design, testing, deployment, maintenance, operation, and evolution of the systems directly supporting the insider threat detection program of a large government customer in the United States Intelligence Community (USIC).

GDMS has an immediate opening on the team for a motivated Junior Big Data Engineer with a self-starter mindset who is up to date with the latest tools and techniques. The position will focus on the integration of new data management technologies and software performance tuning and troubleshooting. This is a challenging yet rewarding position that provides an opportunity to leverage cutting edge technologies in pursuit of a vital mission that protects people, sensitive information/technologies, and the national security posture of the USIC.

The majority of work will be performed in Chantilly, Virginia, which is located approximately 25 miles west of Washington D.C., near the Dulles International Airport. The selected Junior Big Data Engineer will support a 6+ year contract that General Dynamics recently secured.

CORE RESPONSIBILITIES:
  • Assist in the development and delivering of large scale data pipelines
  • Develop and implement automated tests for data transformations and data migrations
  • Research and apply big data solution technologies to complex datasets; make recommendations to data science team on new technologies
Company Overview
General Dynamics Mission Systems (GDMS) engineers a diverse portfolio of high technology solutions, products and services that enable customers to successfully execute missions across all domains of operation. With a global team of 13,000+ top professionals, we partner with the best in industry to expand the bounds of innovation in the defense and scientific arenas. Given the nature of our work and who we are, we value trust, honesty, alignment and transparency. We offer highly competitive benefits and pride ourselves in being a great place to work with a shared sense of purpose. You will also enjoy a flexible work environment where contributions are recognized and rewarded. If who we are and what we do resonates with you, we invite you to join our high performance team!

Big Data Architect

18 days ago
Chantilly, VA • General Dynamics Mission Systems
Basic Qualifications
Bachelor's degree in software engineering or a related technical field is required (or equivalent experience), plus a minimum of 8 years of relevant experience; or Master's degree plus a minimum of 6 years of relevant experience. Agile experience preferred.

KEY SKILLS:
  • Ability to identify and implement a big data solution strategy
  • Demonstrates intellectual curiosity in exploring new technologies and finding creative ways to solve data management problems
  • Provide thought leadership around industry standard tools and data models (including commercially available models and tools) by leveraging past experience and current big data industry trends
  • Make recommendations and assess proposals for data pipeline optimization
  • Identify operational issues and recommend/implement strategies to resolve problems
  • Experience using technologies such as Spark, Impala, Hadoop, Streamsets, Kafka , Nifi
  • Experience with SQL-based and NoSQL-based technologies and design
  • Ability to create clear, expert-level architectural diagrams using Visio, and clear narrative presentations using PowerPoint
CLEARANCE REQUIREMENTS:
A TS/SCI security clearance with the ability to obtain a Polygraph is required at time of hire. Candidate must be able to obtain the Polygraph within a reasonable amount of time from date of hire. Applicants selected will be subject to a U.S. Government security investigation and must meet eligibility requirements for access to classified information. Due to the nature of work performed within our facilities, U.S. citizenship is required.
Responsibilities for this Position
General Dynamics Mission Systems (GDMS) is seeking motivated candidates to join our insider threat detection, systems integration team. Our mission oriented team is responsible for the design, testing, deployment, maintenance, operation, and evolution of the systems directly supporting the insider threat detection program of a large government customer in the United States Intelligence Community (USIC).

GDMS has an immediate opening on the team for a motivated Big Data Architect with an agile mindset who is willing to stay up to date with the latest technology. This is a challenging yet rewarding position that provides an opportunity to leverage cutting edge technologies in pursuit of a vital mission that protects people, sensitive information/technologies, and the national security posture of the USIC.

CORE RESPONSIBILITIES:
  • Design the customer's data topology with business intelligence Extract, Transform, Load (ETL) tools and data modeling structures
  • Provide expertise in both defining data structures, data layer and data integration practices for business intelligence and for operational uses
  • Understanding of schedulers, workload management, availability, scalability and distributed data platforms
  • Be a key member of the Big Data solutions/ services/ product development team that helps to strategize solutions
  • Work with technology partners and internal stakeholders in developing POCs and drive creative solutions in a fast paced agile environment
  • Contribute to solution success by collaborating and communicating effectively at various levels, translating customer needs, synthesizing high-level designs/solutions, partnering with solutions/product management, and influencing the overall outcome
  • Mentor & train others in a fast-paced, collaborative, and high-growth security focused environment
The majority of work will be performed in Chantilly, Virginia, which is located approximately 25 miles west of Washington D.C., near the Dulles International Airport. The selected Big Data Architect will support a 6+ year contract that General Dynamics recently secured.

Company Overview
General Dynamics Mission Systems (GDMS) engineers a diverse portfolio of high technology solutions, products and services that enable customers to successfully execute missions across all domains of operation. With a global team of 13,000+ top professionals, we partner with the best in industry to expand the bounds of innovation in the defense and scientific arenas. Given the nature of our work and who we are, we value trust, honesty, alignment and transparency. We offer highly competitive benefits and pride ourselves in being a great place to work with a shared sense of purpose. You will also enjoy a flexible work environment where contributions are recognized and rewarded. If who we are and what we do resonates with you, we invite you to join our high performance team!

Big Data Engineer

22 days ago
Chantilly, VA • General Dynamics Mission Systems
Basic Qualifications
Bachelor's degree in software engineering or a related technical field is required (or equivalent experience), plus a minimum of 5 years of relevant experience; or Master's degree plus a minimum of 3 years of relevant experience. Agile experience preferred.

KEY SKILLS
  • Minimum three (3) years’ experience in designing, developing, building, and implementing Big Data solutions or developing automated solutions to solve complex problems; a thoughtful ability to solve problems could outweigh years of experience
  • Ability to identify and implement a data solution strategy
  • Demonstrates intellectual curiosity in exploring new technologies and finding creative ways to solve data management problems
  • Experience developing solutions with Python/JavaScript/Perl
  • Experience/knowledge of Spark, Impala, Hadoop, Streamsets, Kafka, REST APIs
  • Experience in SQL-based and NoSQL-based technologies
  • Experience in Linux administration/troubleshooting
A TS/SCI security clearance with the ability to obtain a Polygraph is required at time of hire. Candidate must be able to obtain the Polygraph within a reasonable amount of time from date of hire. Applicants selected will be subject to a U.S. Government security investigation and must meet eligibility requirements for access to classified information. Due to the nature of work performed within our facilities, U.S. citizenship is required.
Responsibilities for this Position
General Dynamics Mission Systems (GDMS) is seeking motivated candidates to join our insider threat detection, systems integration team. Our mission oriented team is responsible for the design, testing, deployment, maintenance, operation, and evolution of the systems directly supporting the insider threat detection program of a large government customer in the United States Intelligence Community (USIC). GDMS has an immediate opening on the team for a motivated Big Data Engineer with a self-starter mindset who is up to date with the latest tools and techniques. The position will focus on the integration of new data management technologies and software performance tuning and troubleshooting. This is a challenging yet rewarding position that provides an opportunity to leverage cutting edge technologies in pursuit of a vital mission that protects people, sensitive information/technologies, and the national security posture of the USIC.

The majority of work will be performed in Chantilly, Virginia, which is located approximately 25 miles west of Washington D.C., near the Dulles International Airport. The selected Big Data Engineer will support a 6+ year contract that General Dynamics recently secured.

CORE RESPONSIBILITIES:
  • Assist in the development and delivering of large scale data pipelines
  • Develop and implement automated tests for data transformations and data migrations
  • Research and apply big data solution technologies to complex datasets; make recommendations to data science team on new technologies
Company Overview
General Dynamics Mission Systems (GDMS) engineers a diverse portfolio of high technology solutions, products and services that enable customers to successfully execute missions across all domains of operation. With a global team of 13,000+ top professionals, we partner with the best in industry to expand the bounds of innovation in the defense and scientific arenas. Given the nature of our work and who we are, we value trust, honesty, alignment and transparency. We offer highly competitive benefits and pride ourselves in being a great place to work with a shared sense of purpose. You will also enjoy a flexible work environment where contributions are recognized and rewarded. If who we are and what we do resonates with you, we invite you to join our high performance team!

Python Developer

25 days ago
$55 - $70/hour (Estimated) • Remote • Agelix Consulting LLC

Application Design Development

Required:

  • Proven work experience in software development, especially Python and Linux

o Expertise in at least one popular Python framework (like Django, Flask or Pyramid) preferred

  • Ability to learn new languages and technologies
  • Resourcefulness and problem-solving aptitude
  • Team spirit + willingness to speak up!
  • Ready to work in the Plano office (some remote work OK, but mostly needs to be on site)

Primary Skills

Required

  • Proven work experience in software development, especially Python and Linux

o Expertise in at least one popular Python framework (like Django, Flask or Pyramid) preferred

  • Ability to learn new languages and technologies
  • Resourcefulness and problem-solving aptitude
  • Team spirit + willingness to speak up!
  • Ready to work in the Plano office (some remote work OK, but mostly needs to be on site)

Secondary Skills

  • Experience with software development in a test-driven environment, especially using any / all of the following:

o Kafka / Kafka Connect
o Kubernetes, Docker
o Gitlab
o Windows
o Talend

  • Solid knowledge of SQL and scripting, for example:

o Microsoft SQL, SSIS Packages, Stored Procedures
o Able to use an SQL client and database administrator tool, such as DBeaver, to validate data in SQL tables and hive tables on HDFS

  • Experience in software quality assurance and/or strong knowledge of software QA methodologies, tools and processes

o Experience in writing clear, concise and comprehensive test plans and test cases
o Hands-on experience with automated testing tools (Robot Framework a plus)

  • Data management understanding (e.g. permissions, recovery, security and monitoring)
  • Experience working in an Agile/Scrum development process
  • Excellent communication skills
  • Analytical mind, attention to detail
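The secondary skills above include using a SQL client to validate data in SQL tables. A common validation pattern is comparing row counts and a simple aggregate between a source and target table after a load; the sketch below uses Python's built-in sqlite3 as a stand-in for SQL Server or Hive, with made-up table and column names:

```python
import sqlite3

# In-memory database standing in for the real source/target systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def table_stats(table: str) -> tuple:
    # COUNT(*) plus a SUM over a numeric column as a cheap consistency check.
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

assert table_stats("src") == table_stats("tgt"), "source/target mismatch"
```

Against a real warehouse the same queries would be run through a client such as DBeaver or an ODBC/JDBC connection rather than sqlite3.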

Education

Master's or Bachelor's degree in Computer Science or Engineering.

Job Types: Full-time, Contract

Experience:

  • Django, Flask or Pyramid: 5 years (Required)
  • Python: 5 years (Preferred)
  • Linux: 5 years (Required)

Logstash and Kibana (ELK) Developer

1 month ago
Ashburn, VA · Tech Talenta
  • Good exposure to big data and data analytics
  • 10+ years of experience
  • Hands-on experience in data analytics with Elasticsearch, Logstash, and Kibana (ELK), plus Kafka
  • Hands-on experience in real-time log aggregation, analysis, and querying
  • Hands-on experience in Java application development
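To make the log-querying requirement above concrete, here is the shape of an Elasticsearch query that an ELK-style pipeline might use to count ERROR-level events per service over a recent window. This is illustrative only: the index pattern and field names (`level`, `service`, `@timestamp`) are assumptions, not from the posting.

```python
def error_count_query(window: str = "now-1h") -> dict:
    """Build an Elasticsearch search body: filter to ERROR-level log
    events in the time window, then bucket the matches by service."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"level": "ERROR"}},
                    {"range": {"@timestamp": {"gte": window}}},
                ]
            }
        },
        "aggs": {
            "by_service": {"terms": {"field": "service.keyword", "size": 10}}
        },
        "size": 0,  # aggregations only, no raw hits
    }

q = error_count_query()
# In practice this body would be POSTed to an endpoint like /logs-*/_search.
```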

Job Type: Contract

Experience:

  • Logstash and Kibana (ELK): 5 years (Required)
  • Data Analytics: 7 years (Required)
  • Elasticsearch: 5 years (Required)
  • Java application development: 5 years (Required)

.Net Developer : 19-01702

1 month ago
$50 - $65/hour (Estimated) · Reston, VA 20190 · Akraya Inc.
Akraya is looking for a “.Net Developer” for one of our clients. If the job description below is a fit, please apply directly or call “Harleen” at 408-816-2474. If this position is not quite what you’re looking for, visit akraya.com and submit a copy of your resume. Our recruiters will get to work finding you a job that is a better match at one of our many clients.

Primary Skills: C#, .Net, SQL/T-SQL, Kafka
Duration: 6 Months
Contract Type: W2 or C2C

Position Description:
  • Senior-level experience in .NET Core, SQL, C#, and Kafka; system design process and solution development
  • Ensures business needs are being met
  • Promotes and supports company policies, procedures, mission, values, and standards of ethics and integrity
  • Provides direction and coordination between large projects and the technical execution plan
  • Provides supervision and development opportunities for associates
  • Troubleshoots and leads the development and implementation of production solutions
  • Drives the execution of multiple business plans and projects

Minimum Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field and 5 to 7 years of experience in computer programming, software development, or a related field; OR a Master's degree in Computer Science, Information Technology, or a related field and 4 to 6 years of experience in computer programming, software development, or a related field.

Additional Preferred Qualifications:
  • C#.NET, JavaScript, AngularJS, CSS, XML, XHTML, HTML, SQL/T-SQL, Core Java, Node.js, ASP.NET, Visual Studio
  • Strong working knowledge of modern front-end frameworks (Angular, React, etc.)
  • Strong working knowledge of object-oriented programming concepts and techniques
  • Strong working knowledge of RESTful API creation and consumption.
  • Experience implementing continuous integration and delivery models
  • Experience with build automation
  • Experience with deploying to cloud architecture.
  • Experience with test driven development and automation testing.
  • Experience with agile delivery, especially Kanban as well as DevOps.
  • Demonstrated success in distributed systems, especially REST based services.
  • Strong knowledge of security standards, practices, and architecture.
  • Enthusiasm, curiosity, energy, perseverance, integrity and desire to solve big problems.

Please apply directly with your updated resume or call “Harleen” at 408-816-2474

About Akraya
Akraya, Inc. is an award-winning staffing firm that works with many of the leading, technology-based companies around the world. We have been ranked as one of the “Best Staffing Firms to Temp for” by Staffing Industry Analysts on multiple occasions and are a preferred staffing vendor within numerous staffing programs. Please visit akraya.com to search through all of our current openings or to submit your resume to our recruiting team.

Platform Engineer

1 month ago
£600 - £650/day · Remote · Linux Recruit
We have a new opportunity to work as a platform engineer, building and launching a next-generation digital platform for some of the most up-and-coming names in the financial space.

This is an opportunity to work with a challenger consultancy that is putting together a stellar team to build out a new state-of-the-art banking platform for a number of new banking start-ups in the UK, US, and EU. You would be working within a team with an impressive record: its members have co-founded and grown banks including Monzo, Starling, and Tandem, and have worked on digital transformations within large-scale institutions including Lloyds and Barclays.

This company is doing some very interesting projects in the fintech space, and they are looking for engineers who have experience working with:
  • AWS
  • Kubernetes
  • Kafka
  • Golang
  • Jenkins
  • Terraform

This is an opportunity to work on a large scale greenfield project with some of the most exciting technology and companies on the market at the moment. They are also modern in their approach to management and are open to the prospect of some remote working.

If you would like to hear more, please give me a call on 07441390330, drop me an email, or apply to this advert and we can discuss it further.

Big Data Architect

1 month ago
Remote · Performance Softech Solutions Inc.

Duration: 3-6 months, could go perm

Start Date: ASAP

Location: Client site, Bethesda Maryland

Remote work allowed (y / n): No

Must Haves:

  • Mandatory: hands-on experience with either Spark or Elasticsearch
  • Experience with architecture and developing enterprise applications that deal with data processing
  • Should be able to set up the tech stack locally, think through solution options, particularly from a faster-data-processing standpoint, and guide a set of developers on low-level design
  • Should be able to work with disparate enterprise teams to gauge requirements and convert them into scalable application design
  • Work with the offshore team creating design elements, guiding them on the application design based on requirements, and reviewing their deliverables

Experience with the following technologies:

  • Spark
  • Elasticsearch
  • Kafka
  • Couchbase

Description
This is a data engineer role within our North America team. The data engineer will collaborate with a team of technologists to produce enterprise-scale solutions for our clients' needs, utilizing a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions. You will independently drive design decisions to ensure the necessary health of the overall solution.
Primary Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

  • Data Integration, Governance & Wrangling
  • Data Storage and Computation Frameworks, Performance Optimizations
  • Analytics & Visualizations
  • Infrastructure & Cloud Computing
  • Data Management Platforms

Experience Guidelines:

  • 5+ years of experience in application development using Java/Scala
  • 3+ years of experience in Big Data application development
  • Hands-on experience with the Hadoop stack (Hadoop MR, HDFS, Pig, Hive, Sqoop)
  • Experienced in computation frameworks such as Spark, Storm, and Flink using Java/Scala
  • Well versed in streaming data processing using Kafka, Spark Streaming, Storm, etc.
  • Experience working on Big Data applications in cloud environments using AWS, Azure, or Google Cloud Platform
  • Excellent knowledge of NoSQL platforms such as HBase, MongoDB, Cassandra, etc.
  • Good knowledge of database technology, with hands-on experience on databases such as Oracle, SQL Server, etc.
  • Self-starter, with a keen interest in technology and highly motivated towards success
  • Excellent oral and written communication, presentation, and analytical skills
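The experience guidelines above call for streaming data processing with Kafka or Spark Streaming. A core pattern in those systems is the tumbling-window aggregation; the toy sketch below shows that pattern in plain Python so it runs anywhere. Events are (timestamp_seconds, key) pairs; a real system would consume them from a Kafka topic rather than a list.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group events into fixed, non-overlapping time windows and count
    occurrences per key within each window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_secs) * window_secs
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(5, "click"), (30, "view"), (65, "click"), (70, "click")]
result = tumbling_window_counts(events)
# result: {0: {'click': 1, 'view': 1}, 60: {'click': 2}}
```

Frameworks such as Spark Structured Streaming or Kafka Streams provide this same semantics declaratively, along with distribution, fault tolerance, and late-event handling that a sketch like this omits.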

Job Type: Contract

Experience:

  • Java/Scala: 6 years (Required)
  • Big Data: 3 years (Preferred)

Sr. QlikView/QlikSense Developer

1 month ago
Chantilly, VA · Infomatics Corp

Description

We are an OLTP and Business Intelligence team, building automated data delivery systems in AWS for a Federal client.

The Business Intelligence Developer will support the needs of the business through the design and development of actionable analytics and the development and maintenance of the BI platform, as well as assist in the development of an enterprise analytical and self-service/ad-hoc reporting environment.

Responsibilities

· Manipulating Data using technical specs
· Building the code utilizing QlikView & QlikSense BI (Business Intelligence) visualization software & data from a SQL backend
· Use the QlikSense scripting engine to develop and publish re-usable QVDs while applying best-practice QVD modeling techniques.
· Use QlikSense to deliver best-in-class dashboards and visualizations to the business
· Ensure all Qlik development is in line with the best practices, principles, and standards defined by the guidelines and complies with the firm’s overall data control standards.
· Query, analyze and transform complex data sets to optimize dashboard flexibility and performance

The ideal candidate must have

· 5-7 years of business intelligence experience
· Strong experience working within an agile development environment and experience in standard project methodologies.
· QlikView: QV server administration experience and familiarity with QV management console
· QlikSense: Experience optimizing data architecture and data libraries for specific use in QlikSense
· Qlik NPrinting: Ability to build new reporting solutions leveraging NPrinting application
· Advanced skills optimizing Qlik data models and directly developing Qlik dashboards. Qlik-specific skills must include:
o Load script development experience: Ability to load data from multiple sources
o Experience using scripts to transform data for various dashboard requirements
o Advanced knowledge using set analysis & variables
o Experience developing QlikView user interfaces: Creating charts, filter boxes, list boxes, text objects, etc. Ability to apply and customize formatting using style sheets
o Familiarity with section access
· Extensive data modeling experience
· 7-10 years of total experience within Business Intelligence, Analytics, Data Management, and/or Management Consulting roles.
· Attention to detail a must with focus on executive quality presentation
· Excellent communication and client partnership ability; ability to bridge the gap between business users and technical subject matter experts
· Bachelor’s degree required; Quantitative / Technical discipline preferred (e.g. Computer Science, Analytics, Economics, Statistics)
· Ability to update and write SQL code required to extract data from source systems
· Prior experience devising and instituting predictive analytical frameworks
· Strong Database/Java coding/design skills
· Expertise with ETL procedures

Desired skills

Prior experience with Kafka, OpenShift, and GitLab

Job Type: Full-time

Education: Bachelor's (Required)

The kind of people we look for:

· Versatile thinkers who thrive on variety and challenge
· Innate problem solvers who want to grow in a flexible, collaborative culture
· Engaging leaders who make a positive impact on their firm, clients, and communities
· Individuals who can work independently with no supervision and contribute toward organizational success

"Infomatics Corp is an Equal Opportunity Employer. All qualified applicants will be considered for employment without regard to race, color, sex, religion, age, physical or mental disability, veteran status, citizenship status or any other status protected by federal, state or local law."

Job Types: Full-time, Contract

Experience:

  • total: 7 years (Preferred)

Education:

  • Bachelor's (Preferred)