
18 remote Hadoop contracts

Appian Developer (Remote)

25 days ago
Remote · ICF

Remote or client site in Washington. Our IT Modernization division is a rapidly growing, entrepreneurial technology department seeking Appian Developers to support upcoming needs with our federal customers. The division is an information technology and management consulting practice that offers integrated, strategic solutions to its public- and private-sector clients. ICF has the expertise, agility, and commitment to design, build, and operate high-performance IT engines that support every aspect of our clients' business.

ICF is a primary partner in the federal space: we run one of the largest federally focused Appian practices, with 85-90 trained consultants and a deep center of excellence built on solid best practices. ICF will pay for certifications and put you through ITG University (our training portal). Employees receive training tailored to job level and skillset, study guides for various certification areas, and in-person training hosted by Appian.

Come work and learn with us!

Required Skills and Qualifications:

  • 1+ years of recent hands-on experience with Appian BPM

  • 1+ years of experience developing Appian plug-ins

  • Good working knowledge of application servers such as JBoss, WebLogic, and IIS, including their installation.

  • Experience configuring, debugging, and systems integration including configuring forms, reports, underlying logic, and interface components as a developer.

  • 1+ years of experience with relational databases and SQL scripting.

  • 1+ years of experience in all phases of software development for large-scale business critical applications following Agile methodologies.

  • 1+ years of experience in the design and implementation of SOAP and REST web services.

  • Must be able to work with users to gather and refine requirements.

  • Green Card Holder or US Citizenship required due to federal contract requirements.

  Desired Experience:

  • Certified Level 2 Appian Developer or Certified Level 3 Appian Developer

  • Experience working with Appian Tempo.

  • Java developer experience is a strong plus.

  • Recent work experience on a software development project in a Federal government setting.

  • Exposure to common industry platforms and programming languages – Appian BPM, IBM WebSphere, Mule, LAMP/JBOSS, HADOOP, Java, Microsoft/.Net is preferred.

  • Experience in Enterprise Application integration (SOA, ESB) and n-tier client-server architectures preferred

  • Experience working with Appian Sites.

  • Java developer with Maven build experience is a strong plus.

  • Excellent analytical and technical skills. 

  • Excellent written and verbal communication skills.

  • Exceptional interpersonal skills, including teamwork, facilitation and negotiation.


    Scala Expert Developer/Engineer

    19 days ago
    Remote · Next Ventures

    Scala Expert Developer/Engineer – Netherlands – Start ASAP – Remote work

    Short contract with extensions possible.

    We are currently looking for a Developer (Scala/PySpark, Kafka).

    Job Purpose and primary objectives:

    Key responsibilities:
    The associate should have good knowledge of Big Data/Hadoop ecosystems, Kafka, Scala, and PySpark.

    Key Skills/Knowledge:

    Primary skills - the developer should have knowledge of Data Collector, Data Collector Edge, the banking domain, Big Data, Kafka, Scala, PySpark, and Hadoop.

    Experience required:

    The same as the primary skills listed above.
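
    For concreteness, here is a minimal, hypothetical sketch of the kind of pipeline this role describes: consuming a Kafka topic with PySpark Structured Streaming and landing the parsed records in HDFS. The broker, topic, and paths are placeholders, not details from the posting.

        # Minimal sketch: Kafka -> PySpark Structured Streaming -> HDFS.
        # "broker:9092", "transactions", and the HDFS paths are placeholders.
        from pyspark.sql import SparkSession
        from pyspark.sql.functions import col

        spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

        stream = (spark.readStream
                  .format("kafka")
                  .option("kafka.bootstrap.servers", "broker:9092")
                  .option("subscribe", "transactions")
                  .load())

        # Kafka delivers key/value as binary; cast to strings before processing.
        parsed = stream.select(col("key").cast("string").alias("key"),
                               col("value").cast("string").alias("value"))

        # Append to HDFS as Parquet; the checkpoint makes the sink restartable.
        (parsed.writeStream
               .format("parquet")
               .option("path", "hdfs:///data/transactions")
               .option("checkpointLocation", "hdfs:///checkpoints/transactions")
               .start()
               .awaitTermination())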


    DevOps Engineer

    1 day ago
    £520 - £640/day (wellpaid.io estimate) · Remote · Inside IR35 · Lorien

    DevOps Engineer

    London (Remote Working)

    Inside IR35

    6 Months Contract


    Our London-based banking client is currently offering an exciting opportunity for a DevOps Engineer to join their team on an initial 6-month contract.


    The successful candidate will have knowledge and experience with the following:

    • Unix, Scripting 
    • Automation – Puppet, Ansible, Team City, XL Release 
    • Platforms – Containers / K8S, Pivotal Cloud Foundry 
    • APIs, Rest Services, API Gateways, Java / Springboot MicroServices 
    • Security – FSSO/OAuth/SAML/JWT and Certificates 
    • Hadoop/MongoDB/Elastic Search


    If this role is of interest, please apply and I will call you to discuss further.

    BIG DATA DEVELOPER - REMOTE (USC/GC)

    2 days ago
    Remote · Collabera
    Job Title: Big Data Developer - Sr/Mid-Level

    Location: Remote. Duration: 6-Month Contract to Hire

    Note: USC/GC only on W2


    Start: ASAP


    Must Have:

    VERY HANDS ON!

    Senior position - requires 5 years of solid experience in the technologies below



    • AWS
    • Big Data- EMR(spark), Hadoop, Scala
    • Python
    • Lambda automation/orchestration
    • Note: really looking for a candidate whose experience is mostly in AWS, working in EMR



    Plus:



    • Health care background
    • AWS certification
    • Coming from large companies


    Day to Day: The team helps build a big data pipeline and navigates the data warehouse.



    • 70-80% will be coding
    • Agile environment - daily scrum, weekly sprint planning
    • Fewer meetings; mostly working on code
    • Performance work, knowing Scala and Spark
    • Work on Lambda automation/orchestration (see the sketch below)
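
    To make the Lambda/EMR orchestration above concrete, here is a minimal, hypothetical sketch of a Lambda handler that submits a Spark step to a running EMR cluster with boto3. The cluster ID, bucket, and script path are placeholders, not details from the client.

        # Minimal sketch: Lambda-driven EMR orchestration via boto3.
        import boto3

        emr = boto3.client("emr")

        def handler(event, context):
            # command-runner.jar lets an EMR step invoke spark-submit directly.
            response = emr.add_job_flow_steps(
                JobFlowId="j-XXXXXXXXXXXXX",          # placeholder cluster ID
                Steps=[{
                    "Name": "nightly-etl",
                    "ActionOnFailure": "CONTINUE",
                    "HadoopJarStep": {
                        "Jar": "command-runner.jar",
                        "Args": ["spark-submit", "--deploy-mode", "cluster",
                                 "s3://my-bucket/jobs/etl.py",  # placeholder script
                                 "--date", event.get("date", "")],
                    },
                }],
            )
            # Return the step ID so a caller (e.g. Step Functions) can poll it.
            return {"step_id": response["StepIds"][0]}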






    AWS Cloud Data Engineer

    19 days ago
    £580 - £710/day (wellpaid.io estimate) · Remote · NextLink Group

    Job Description

    Immediate hiring! AWS Cloud Data Engineer - London, UK - NextLink Solutions

    NextLink Solutions is a Swiss IT consultancy that has been providing both IT services and IT staffing to major customers for the past 20 years, and has an excellent opportunity that can take your career to the next level.

    I am currently working with the world's largest retail client, who is looking for a Cloud Engineer/Architect on a 6-12 month contract (possible extension) to work for one of our global insurance clients.

    General Information:

    Start date: ASAP

    Location: London, UK

    Duration: 6-12 months contract, Extension: possible

    Language: English - mandatory

    Remote: Remote working to start with, and on-site once the COVID-19 situation improves and we all start to return to our offices.

    TECHNICAL/BUSINESS SKILLS & KNOWLEDGE:

    The successful candidate for this position will have the following experiences and expertise:

  • Software engineering skills (coding best practices, CI/CD, testing)
  • Working with big data using PySpark and SQL (e.g. transforming large datasets, incremental datasets; a minimal sketch follows this list)
  • Programming - Python and PySpark
  • Hadoop cluster, AWS Glue and Redshift
  • Assess the size and complexity of existing components, helping the product owner make the right decision on modernisation or migration of the DWH & BI layer
  • Source discovery exercise to bring the data from data producers to the DCL
  • Source discovery exercise to bring the history data into the AWS data lake
  • Build data pipelines using PySpark and Python from API sources to the CDAP data lake, adhering to its principles
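
    To make the incremental-dataset bullet concrete, here is a minimal, hypothetical PySpark sketch of an incremental load: only rows newer than the last processed watermark are transformed and appended. The S3 paths and column names are invented placeholders, not details of the client's environment.

        # Minimal sketch: watermark-based incremental load in PySpark.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("incremental-load").getOrCreate()

        # Read the previous high-water mark; fall back to epoch on the first run.
        try:
            last_ts = (spark.read.parquet("s3://lake/curated/orders")
                       .agg(F.max("updated_at")).first()[0])
        except Exception:            # assumed: target does not exist yet
            last_ts = None
        if last_ts is None:
            last_ts = "1970-01-01 00:00:00"

        source = spark.read.parquet("s3://lake/raw/orders")
        delta = source.filter(F.col("updated_at") > F.lit(last_ts))

        # Transform and append only the new slice, partitioned by day.
        (delta.withColumn("order_date", F.to_date("updated_at"))
              .write.mode("append")
              .partitionBy("order_date")
              .parquet("s3://lake/curated/orders"))
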
    FullStack Engineer with Security Clearance

    2 days ago
    Remote · ESimplicity Inc.
    Please explore the opportunity to join us as a Sr. FSR Engineer - Active Top Secret Clearance.

    Who we are: Launched in 2016, eSimplicity is creating an elegantly connected, digitally enhanced and well-communicated world. The team delivers Healthcare IT, Cybersecurity, and Telecommunications solutions that improve the lives and health of millions of Americans while defending our national interests on the battlefield. Our Digital Services engineers are building modern systems that help improve healthcare quality, expand care coverage, and lower cost, including fighting fraud, waste and abuse. Our Telecommunications engineering team helps the DoD implement spectrum (Radio Frequency) management on both national and international stages. We also have a cleared engineering team that helps the Department of Homeland Security protect our national security interests. The Federal Emerging Technology & Consulting Hub featured us as a company that continues to reinvent itself, to invest, to challenge the status quo, and that is poised to make a lasting impact in Federal IT and Consulting in 2020 and years to come.

    Requirements:
    ACTIVE TOP SECRET CLEARANCE REQUIRED

    ***Open to candidates on the East Coast who can work remotely and don't mind traveling for meetings and onboarding. Should be within a few hours' train or car ride from DC.***

    Keywords: Postgres, Hadoop, Spark, Typescript, React, AWS, Java, Python, Javascript

    Our core company platform provides the foundation for our projects, with custom applications built on top of it:
    Oracle, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing.
    Java and Groovy for our back-end applications and data integration tools.
    Typescript, React/Redux for our web technologies.
    Python for data processing and analysis.
    Commercial standard cloud infrastructures such as AWS or Azure.
    CentOS/Red Hat Linux as the basis for all our on-premise servers.

    Skill sets:
    Strong engineering background, preferably in fields such as Computer Science, Mathematics, Software Engineering, or Physics.
    Familiarity with data structures, storage systems, cloud infrastructure, front-end frameworks, and other technical tools.
    Understanding of how technical decisions impact the user of what you're building.
    Proficiency with programming languages such as Java, C++, Python, JavaScript, or similar languages.
    Ability to work effectively in teams of technical and non-technical individuals.
    Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.
    Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision.
    Willingness and interest to travel as needed.
    U.S. citizenship (as required by U.S. Government contract)

    SAS Developer

    4 days ago
    £425/day · Remote · Candidate Source Ltd
    This is an ideal opportunity for a SAS Developer who will be responsible for engaging business analysts to understand data and reporting requirements to support business needs. You will provide data support for end-user analytics and business intelligence, report definitions and designs, data validation testing and data management. The ideal candidate for this position will have strong Business Intelligence and Data Analyst skills coupled with advanced SAS experience.

    You will take responsibility for data analysis, business analytics, data standardisation and management, data warehousing requirements, application data requirements, report design, development and testing of business requirements.

    The rate on offer is £425 per day on a 4–6-month contract which is likely to be extended. You would be required to work Monday to Friday from 09:00–17:30. This role would also be working remotely but may occasionally require you to visit one of the offices in the future.

    As an SAS Developer joining this project you will:

    - Remediate customers who received unfair outcomes in the bank's collections process; any fees, interest and charges incurred during these journeys are refunded to the customers
    - Build and maintain a data model to extract various account and customer flags to apply the correct treatment for each customer
    - Build calculation models to determine the redress payout for each type of treatment (an illustrative sketch follows this list)
    - Process the files and keep an audit trail of the activities
    - Build MI dashboards
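
    For illustration only, here is a tiny sketch of the kind of redress calculation model described above, written in Python rather than SAS for brevity. The treatment rules and charge categories are invented placeholders, not the bank's actual logic.

        # Hypothetical redress calculation: refund the fees, interest and charges
        # recorded during a customer's collections journey, per treatment type.
        from decimal import Decimal

        # Invented rules: which charge categories each treatment refunds.
        TREATMENT_RULES = {
            "full_redress": {"fee", "interest", "charge"},
            "fees_only": {"fee"},
        }

        def redress_payout(transactions, treatment):
            """Sum the refundable amounts for one customer and treatment type."""
            refundable = TREATMENT_RULES[treatment]
            return sum((Decimal(t["amount"]) for t in transactions
                        if t["category"] in refundable), Decimal("0"))

        # Example journey: one fee and one interest charge.
        journey = [{"category": "fee", "amount": "12.50"},
                   {"category": "interest", "amount": "3.75"}]
        print(redress_payout(journey, "full_redress"))   # 16.25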

    To be considered for the role of SAS Developer you will possess the following attributes and abilities:

    - Advanced programming experience mainly working on SAS Enterprise Guide, SAS Macros and BASE SAS. 
    - Strong experience with SQL. 
    - Experience working on visualisation tools like Power BI or Tableau (good to have but not essential). 
    - Experience working with a variety of databases, including Teradata, Oracle, SQL Server, Hadoop, etc. Prior experience with at least one of these databases is essential.
    - Good understanding of the data warehousing concepts. 
    - Beneficial to have prior experience in financial domain (banking preferred).
    - Good analytical skills and a quick learner. 

    You will also be required to complete a Criminal Records Bureau and Equifax Credit check for this project.

    To apply for this role as SAS Developer, please click apply online and upload an updated copy of your CV. 

    Candidate Source Ltd is an advertising agency. Once you have submitted your application it will be passed to the third party Recruiter who is responsible for processing your application. This will include holding and sharing your personal data, our legal basis for this is legitimate interest subject to your declared interest in a job. Our privacy policy can be found on our website and we can be contacted to confirm who your application has been forwarded to.

    Senior Python Developer

    1 day ago
    Remote · Genuent

    Job Description

    Genuent is hiring a Senior Python Developer. This is a fully REMOTE, long-term contract opportunity based in Houston, TX. If this is something you might be interested in, please send your updated resume to Mike Sabo: MSabo@genuent.com 

    Requirements:

  • 5+ years of experience in Data and Software Engineering Disciplines – regardless of degree
  • Technically agile: comfortable in Windows and Linux environments; easily adopts and is not averse to new technology; able to rapidly develop in technologies not previously used and to deliver in a timely manner
  • Experience with a Hadoop platform (Cloudera, MapR, etc.)
  • Working knowledge and professional experience with Apache Spark , Apache Kafka , and other Big Data tools
  • Language experience in Python and Java/Scala; C++, R, and JavaScript a plus
  • Must be proficient in OO Software Design
  • Experience Developing Microservices
  • Development of Real-time Pipelines & data modeling
  • Must be comfortable working in Linux and Windows
  • Easily learns and owns new languages
  • Comfortable working on polyglot projects as the norm
  • Working at pace, independently and as a team
  • AGILE Scrum
  • Using tools from XP (e.g. Pair Programming )
  • Comfortable working with newer developers (mentoring as needed)
  Desired Skills:
  • Working knowledge of REST
  • Experience with a source control tool such as Git or TFS; preferably Git, with GitLab experience
  • Understanding of CI/CD
  • Kubernetes/Docker
  • GCP, Azure, etc...
  • Scientific/Client background or experience
  • Productization and productionalization experience, especially concerning scientific applications
  • GUI development experience; JavaScript preferred
    Principal Software Engineer (Partial Remote - Java, MapReduce)

    9 days ago
    Remote · GliaCell Technologies

    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud / Big Data (Batch and Streaming Analytics), CNO/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages.  For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Principal Software Engineer for a role on one of our subcontracts.  This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract. 

    Location: Annapolis Junction, MD

    Description: The Software Engineer develops, maintains, and enhances complex and diverse software systems (e.g., processing-intensive analytics, novel algorithm development, manipulation of extremely large data sets, real-time systems, and business management information systems) based upon documented requirements. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design to include hardware/software trade-offs, software reuse, use of Commercial Off-the-shelf (COTS)/Government Off-the-shelf (GOTS) in place of new development, and requirements analysis and synthesis from system level to individual software components.

    Requirements:

    • Fourteen (14) years' experience as a SWE in programs and contracts of similar scope, type, and complexity is required. Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor’s degree. 
    • Cloud Experience: Shall have three (3) years demonstrated work experience with distributed scalable Big Data Stores (NoSQL) such as HBase, CloudBase/Accumulo, Big Table, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. (a minimal sketch follows this list); shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); shall have demonstrated work experience with serialization such as JSON and/or BSON 
    • Analyze user requirements to derive software design and performance requirements 
    • Design and code new software or modify existing software to add new features 
    • Debug existing software and correct defects 
    • Integrate existing software into new or modified systems or operating environments 
    • Develop simple data queries for existing or proposed databases or data repositories 
    • Provide recommendations for improving documentation and software development process standards 
    • Develop or implement algorithms to meet or exceed system performance and functional standards 
    • Assist with developing and executing test procedures for software components 
    • Write or review software and system documentation 
    • Develop software solutions by analyzing system performance standards, confer with users or system engineers; analyze systems flow, data usage and work processes; and investigate problem areas 
    • Serve as team lead at the level appropriate to the software development process being used on any particular project 
    • Modify existing software to correct errors, to adapt to new hardware, or to improve its performance 
    • Design, develop and modify software systems, using scientific analysis and mathematical models to predict and measure outcome and consequences of design 
    • Design or implement complex database or data repository interfaces/queries 
    • Oversee one or more software development teams and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project 
    • Design or implement complex algorithms requiring adherence to strict timing, system resource, or interface constraints; Perform quality control on team products 
    • Confer with system engineers and hardware engineers to derive software requirements and to obtain information on project limitations and capabilities, performance requirements and interfaces 
    • Coordinate software system installation and monitor equipment functioning to ensure operational specifications are met 
    • Implement recommendations for improving documentation and software development process standards
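
    As a deliberately generic illustration of the MapReduce programming model required above, here is a textbook word-count mapper/reducer pair for Hadoop Streaming in Python; it is not code from this program. Hadoop pipes raw input lines to the mapper and feeds the reducer key-sorted (word, count) pairs. The jar name and paths used to launch it depend on the cluster.

        # wordcount.py - run as "wordcount.py map" or "wordcount.py reduce"
        # under Hadoop Streaming.
        import sys

        def mapper():
            for line in sys.stdin:
                for word in line.split():
                    print(f"{word}\t1")               # emit (word, 1)

        def reducer():
            current, total = None, 0
            for line in sys.stdin:                    # input is key-sorted
                word, count = line.rsplit("\t", 1)
                if word != current:
                    if current is not None:
                        print(f"{current}\t{total}")  # flush previous key
                    current, total = word, 0
                total += int(count)
            if current is not None:
                print(f"{current}\t{total}")

        if __name__ == "__main__":
            mapper() if sys.argv[1:] == ["map"] else reducer()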

    Desired: 

    • Java, MapReduce, Pig, Cloud experience, GhostMachine, QTA, and Hadoop 

     

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

     

    To Apply for this Position: Respond to this job posting and attach an updated resume.

     

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR

    Senior Software Engineer (Partial Remote - Cloud, MapReduce/Pig)

    9 days ago
    Remote · GliaCell Technologies

    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud/Big Data (Batch and Streaming Analytics), CNO/CND/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages.  For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Senior Software Engineer for a role on one of our subcontracts.  This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract. 

    Location: Annapolis Junction, MD

    Qualifications:

    • Seven (7) years' experience as a SWE in programs and contracts of similar scope, type, and complexity is required. Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor’s degree.

    Desired: 

    • Cloud development experience
    • MapReduce and/or Pig experience.
    • Ability to debug software and correct defects

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

     

    To Apply for this Position: Respond to this job posting and attach an updated resume.

     

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR

    Principal Software Engineer (Partial Remote - Spark, Accumulo, Analytics)

    9 days ago
    Remote · GliaCell Technologies

    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud / Big Data (Batch and Streaming Analytics), CNO/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages.  For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Principal Software Engineer for a role on one of our subcontracts.  This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract. 

    Location: Annapolis Junction, MD

    Description:

    The Software Engineer develops, maintains, and enhances complex and diverse software systems (e.g., processing-intensive analytics, novel algorithm development, manipulation of extremely large data sets, real-time systems, and business management information systems) based upon documented requirements. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design to include hardware/software trade-offs, software reuse, use of Commercial Off-the-shelf (COTS)/Government Off-the-shelf (GOTS) in place of new development, and requirements analysis and synthesis from system level to individual software components.

    Requirements:

    • Fourteen (14) years' experience as a SWE in programs and contracts of similar scope, type, and complexity is required. Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor’s degree.
    • Cloud Experience: Shall have three (3) years demonstrated work experience with distributed scalable Big Data Stores (NoSQL) such as HBase, CloudBase/Accumulo, Big Table, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); shall have demonstrated work experience with serialization such as JSON and/or BSON
    • Analyze user requirements to derive software design and performance requirements
    • Design and code new software or modify existing software to add new features
    • Debug existing software and correct defects
    • Integrate existing software into new or modified systems or operating environments
    • Develop simple data queries for existing or proposed databases or data repositories
    • Provide recommendations for improving documentation and software development process standards
    • Develop or implement algorithms to meet or exceed system performance and functional standards
    • Assist with developing and executing test procedures for software components
    • Write or review software and system documentation
    • Develop software solutions by analyzing system performance standards, confer with users or system engineers; analyze systems flow, data usage and work processes; and investigate problem areas
    • Serve as team lead at the level appropriate to the software development process being used on any particular project
    • Modify existing software to correct errors, to adapt to new hardware, or to improve its performance
    • Design, develop and modify software systems, using scientific analysis and mathematical models to predict and measure outcome and consequences of design
    • Design or implement complex database or data repository interfaces/queries
    • Oversee one or more software development teams and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project
    • Design or implement complex algorithms requiring adherence to strict timing, system resource, or interface constraints; Perform quality control on team products
    • Confer with system engineers and hardware engineers to derive software requirements and to obtain information on project limitations and capabilities, performance requirements and interfaces
    • Coordinate software system installation and monitor equipment functioning to ensure operational specifications are met
    • Implement recommendations for improving documentation and software development process standards

    Desired:

    • Spark streaming and batch analytic development experience
    • Accumulo experience
    • Experience tuning large-scale Spark analytics
    • Java and Spring development experience
    • HPC integration and performance tuning experience

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

     

    To Apply for this Position: Respond to this job posting and attach an updated resume.

     

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR

    Senior Java Developer - (Google Cloud Platform / Kubernetes)

    19 days ago
    £525/day · Remote · Inside IR35 · IT Talent Solutions Ltd
    • Senior Backend Java Developer
    • Remote working
    • Inside IR35 (umbrella company required)

    Looking for a company that inspires passion, courage and imagination, where you can be part of the team shaping the future of e-commerce? Want to shape how millions of people buy, sell, connect, and share around the world? If you’re interested in joining a purpose-driven community that is dedicated to creating an ambitious and inclusive workplace, a company you can be proud to be a part of, read on: our client, one of the UK’s biggest classifieds communities, is seeking a talented Senior Backend Java Developer to join their Engineering team.

    The business is currently working on migrating their services from a private cloud (OpenStack, Debian VMs) to containerised solutions on Google Cloud Platform and Google Kubernetes Engine.

    We’re looking for an experienced Java developer with knowledge of containerisation, Kubernetes and Google Cloud Platform to help us complete a successful migration.

    We are seeking expertise in several of the following key skills:

    • Java 11
    • Spring framework
    • JUnit 5
    • OpenAPI
    • Microservices
    • Google Cloud
    • Google Kubernetes Engine
    • Helm charts
    • Docker
    • Jenkins
    • Git
    • Memcached
    • Continuous Delivery

    Additionally, experience of some of these technologies is desirable:

    • Java Feign, Spring MVC Tests
    • Python
    • TDD and BDD
    • OpenStack
    • ELK
    • Grafana, Pact, Rest easy, Wiremock, Mockito
    • Scala, Scalatest and Webdriver
    • Kafka
    • MongoDB
    • Hadoop
    • Redis
    • Akka
    • Discovery
    • Dual-Track Agile

    Responsibilities

    • Design, develop, test and release microservices on GKE/GCP
    • Containerise existing microservices to run on GKE/GCP
    • Work with Operations to automatically deploy microservices to production
    • Optimise legacy code to enable a successful migration of each microservice to GKE/GCP
    • Help decouple code from existing monoliths into microservices
    • Share knowledge and best practice in Brownbag sessions to ramp up less experienced engineers
    • Produce production-ready code that is clean and testable
    • Work as part of an agile team

    Software Integration Engineer (Partial Remote - DevOps, SA, Compliance)

    9 days ago
    Remote · GliaCell Technologies

    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud/Big Data (Batch and Streaming Analytics), CNO/CND/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages.  For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Software Integration Engineer for a role on one of our subcontracts.  This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract.

    This position allows for remote work up to 50%.

    Location: Annapolis Junction, MD

    Qualifications:

    • 14+ years of Software Engineering experience and a BS Degree in Computer Science (or a related field)
    • This position is 10% Software Engineering and 90% Software Integration 

    Desired:

    • Sourcing for a candidate that has the ability/willingness to work a DevOps position and do parts of system development (development, test, SA, Compliance)
    • Working knowledge of Rancher and Keycloak, as well as experience in a methodical approach to problem-solving and automation

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

     

    To Apply for this Position: Respond to this job posting and attach an updated resume.

     

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR

    DevOps Engineer

    11 days ago
    Remote · Apex Systems

    Apex Systems, the 2nd largest IT staffing firm in the nation, is seeking an experienced DevOps Engineer to join our client’s team. This is a W2 contract position slated for 6 months with possibility of extension/conversion, and is FULLY REMOTE (PST hours).

    **Must be comfortable sitting on Apex System's W2**

    Job Description:

    We are on a mission to connect every member of the global workforce with economic opportunity, and that starts right here. Talent is our number one priority, and we make sure to apply that philosophy both to our customers and to our own employees as well. Explore cutting-edge technology and flex your creativity. Work and learn from the best. Push your skills higher. Tackle big problems. Innovate. Create. Write code that makes a difference in professionals’ lives.

    Gobblin is a distributed data integration framework that was born at client and was later released as an open-source project under the Apache foundation. Gobblin is a critical component in client's data ecosystem, and is the main bridge between the different data platforms, allowing efficient data movement between our AI, analytics, and member-facing services. Gobblin utilizes and integrates with the latest open source big data technologies, including Hadoop, Spark, Presto, Iceberg, Pinot, ORC, Avro, and Kubernetes. Gobblin is a key piece in client's data lake, operating at a massive scale of hundreds of petabytes.

    Our latest work involves integrations with cutting-edge technologies such as Apache Iceberg to allow near-real-time ingestion of data from various sources onto our persistent datasets, enabling complex and highly scalable query processing for various business-logic applications and serving machine-learning and data-science engineers. Furthermore, we play an instrumental role in the client's transformation from on-prem deployments to Azure cloud-based environments. This transformation prompted a massive modernization and rebuilding effort for Gobblin, transforming it from a managed set of Hadoop batch jobs into an agile, auto-scalable, real-time streaming-oriented PaaS with user-friendly self-management capabilities that will boost productivity across our customers. This is an exciting opportunity to take part in shaping the next generation of the platform.

    What is the Job

    You will be working closely with development and site reliability teams to better understand their challenges in aspects like:

    Increasing development velocity of data management pipelines by automating testing and deployment processes,

    Improving the quality of data management software without compromising agility.

    You will create and maintain fully-automated CI/CD processes across multiple environments and make them reproducible, measurable, and controllable for data pipelines that deal with PBs every day. With your abundant skills as a DevOps engineer, you will also be able to influence the broad teams and cultivate DevOps culture across the organization.

    Why it matters

    CI/CD for big data management pipelines has been a traditional challenge for the industry. This is becoming more critical as we evolve our tech stack into the cloud age (Azure). With infrastructure shifts and data lake features being developed and deployed at an ever faster pace, our integration and deployment processes must evolve to ensure the highest quality and fulfill customer commitments. The reliability of our software greatly influences the analytical workloads and decision-making processes across many company-wide business units, and the velocity of our delivery plays a critical role in transforming the process of mining insights from a massive-scale data lake into an easier and more efficient developer-productivity paradigm.

    What You’ll Be Doing

  • Work collaboratively in an agile, CI/CD environment
  • Analyze, document, implement, and maintain CI/CD pipelines/workflows in cooperation with the data lake development and SRE teams
  • Build, improve, and maintain CI/CD tooling for data management pipelines
  • Identify areas for improvement for the development processes in data management teams
  • Evangelize CI/CD best practices and principles
  Technical Skills:

  • Experienced in building and maintaining successful CI/CD pipelines
  • Self-driven and independent
  • Has experience with Java, Scala, Python or other programming language
  • Great communication skills
  • Master of automation
  Years of Experience:

  • 5+
  Preferred Skills:

  • Proficient in Java/Scala
  • Proficient in Python
  • Experienced in working with:
  • Big Data environments: Hadoop, Kafka, Hive, Yarn, HDFS, K8S
  • ETL pipelines and distributed systems
    DevOps Engineer

    1 month ago
    Remote · Apex Life Sciences

    Apex Systems, the 2nd largest IT staffing firm in the nation, is seeking an experienced DevOps Engineer to join our client’s team. This is a W2 contract position slated for 6 months with possibility of extension/conversion, and is FULLY REMOTE (PST hours).

    **Must be comfortable sitting on Apex System's W2**

    If you are interested send all qualified resumes to Nathan Castillo (Professional Recruiter with Apex Systems) at Ncastillo@apexsystems.com! 

    Job Description:

    We are on a mission to connect every member of the global workforce with economic opportunity, and that starts right here. Talent is our number one priority, and we make sure to apply that philosophy both to our customers and to our own employees as well. Explore cutting-edge technology and flex your creativity. Work and learn from the best. Push your skills higher. Tackle big problems. Innovate. Create. Write code that makes a difference in professionals’ lives.

    Gobblin is a distributed data integration framework that was born at client and was later released as an open-source project under the Apache foundation. Gobblin is a critical component in client's data ecosystem, and is the main bridge between the different data platforms, allowing efficient data movement between our AI, analytics, and member-facing services. Gobblin utilizes and integrates with the latest open source big data technologies, including Hadoop, Spark, Presto, Iceberg, Pinot, ORC, Avro, and Kubernetes. Gobblin is a key piece in client's data lake, operating at a massive scale of hundreds of petabytes.

    Our latest work involves integrations with cutting-edge technologies such as Apache Iceberg to allow near-real-time ingestion of data from various sources onto our persistent datasets, enabling complex and highly scalable query processing for various business-logic applications and serving machine-learning and data-science engineers. Furthermore, we play an instrumental role in the client's transformation from on-prem deployments to Azure cloud-based environments. This transformation prompted a massive modernization and rebuilding effort for Gobblin, transforming it from a managed set of Hadoop batch jobs into an agile, auto-scalable, real-time streaming-oriented PaaS with user-friendly self-management capabilities that will boost productivity across our customers. This is an exciting opportunity to take part in shaping the next generation of the platform.

    What is the Job

    You will be working closely with development and site reliability teams to better understand their challenges in aspects like:

    Increasing development velocity of data management pipelines by automating testing and deployment processes,

    Improving the quality of data management software without compromising agility.

    You will create and maintain fully-automated CI/CD processes across multiple environments and make them reproducible, measurable, and controllable for data pipelines that deal with PBs every day. With your abundant skills as a DevOps engineer, you will also be able to influence the broad teams and cultivate DevOps culture across the organization.

    Why it matters

    CI/CD for big data management pipelines has been a traditional challenge for the industry. This is becoming more critical as we evolve our tech stack into the cloud age (Azure). With infrastructure shifts and data lake features being developed and deployed at an ever faster pace, our integration and deployment processes must evolve to ensure the highest quality and fulfill customer commitments. The reliability of our software greatly influences the analytical workloads and decision-making processes across many company-wide business units, and the velocity of our delivery plays a critical role in transforming the process of mining insights from a massive-scale data lake into an easier and more efficient developer-productivity paradigm.

    What You’ll Be Doing

  • Work collaboratively in an agile, CI/CD environment
  • Analyze, document, implement, and maintain CI/CD pipelines/workflows in cooperation with the data lake development and SRE teams
  • Build, improve, and maintain CI/CD tooling for data management pipelines
  • Identify areas for improvement for the development processes in data management teams
  • Evangelize CI/CD best practices and principles
  Technical Skills:

  • Experienced in building and maintaining successful CI/CD pipelines
  • Self-driven and independent
  • Has experience with Java, Scala, Python or other programming language
  • Great communication skills
  • Master of automation
  Years of Experience:

  • 5+
  Preferred Skills:

  • Proficient in Java/Scala
  • Proficient in Python
  • Experienced in working with:
  • Big Data environments: Hadoop, Kafka, Hive, Yarn, HDFS, K8S
  • ETL pipelines and distributed systems
    Software Engineer (C# .Net)

    1 day ago
    Remote · NICE Ltd

    The rubber hits the road when someone picks up the phone to talk to a company. It happens over 250 million times a day. With over 60 percent of people dumping brands because of bad customer service, that phone call can make or break a business.

    Enter NICE Mattersight – the only company that uses personality analysis and big data analytics to improve every customer call, increase customer satisfaction and lower costs for Fortune 500 enterprises. Our patented technology mines a private database of over 1 billion customer service calls to quickly pair customers with call center agents with whom they will naturally and effortlessly click. Awkward and annoying becomes satisfying and enjoyable. Frustrated becomes enthusiastic.

    NICE Mattersight helps companies make positive conversations with their customers the rule, not the exception, by fostering emotional connections that turn complainers into fans.

    Who you are:
    NICE Mattersight is seeking a Software Developer to join our Routing Analytics Team. The ideal candidate is someone who is excited about working in a data-driven environment on applications built to mine, ship, store, analyze, and utilize big data.

    We are looking for someone who is passionate about software development, has a strong desire to share knowledge with others, wants to keep abreast of industry best practices, and can work remotely as a member of a distributed (US-based) team. Candidates should know or be willing to learn multiple languages and environments ranging from Kubernetes, AWS Serverless, VMware, .NET Core, Python, and Hadoop, and candidates should be passionate about continuous integration and continuous delivery with an eye towards weekly deployments. We would love to hire someone who wants to become an expert in .NET and related technologies, while being able to contribute meaningfully on our Python applications as needed. This is a great opportunity for an experienced C# developer to get experience in big data! 


    What you’ll do:

    For this position the successful candidate will perform design, development, testing, and sustaining engineering tasks within the Microsoft .NET Core framework using C#, as well as in Python and Spark scripts within a Hadoop cluster. The candidate will also work with Kubernetes, Jenkins, Helm, and related technologies in their day-to-day work. This position requires proficiency with unit testing and the related tools (XUnit, Postman, Newman, etc.); a minimal unit-test sketch follows the list below.

  • Write and maintain Microsoft .NET and .NET Core software running in Kubernetes.
  • Write and maintain Python apps and Spark scripts running in Hadoop.
  • Create and maintain Jenkins-based software build pipelines.
  • Deploy software to our Container infrastructure leveraging Docker, Kubernetes, Helm, AWS, and other cloud-based technologies.
  • Deploy software to our Hadoop cluster leveraging Ansible
  • Consider emerging technologies/innovations when helping to design software and architecture.
  • Participate as a contributor in an Agile Scrum team
  • Participate in a rotating on-call schedule to aid our operations team when needed.
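
    As a minimal illustration of the unit-testing proficiency this role asks for, here is a pytest sketch in Python (one of the posting's languages) for a pure helper that a Spark script might share; the record format and function name are hypothetical.

        # Hypothetical helper plus its pytest: parse "caller|agent|duration" records.
        def parse_call_record(line: str) -> dict:
            """Parse 'caller_id|agent_id|duration_secs' into a typed record."""
            caller, agent, duration = line.strip().split("|")
            return {"caller": caller, "agent": agent, "duration": int(duration)}

        def test_parse_call_record():
            rec = parse_call_record("C123|A456|342\n")
            assert rec == {"caller": "C123", "agent": "A456", "duration": 342}
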
  What you bring to the table:

  • 3+ years with solid foundation in OOP, Distributed Microservice Architecture and/or SOLID principles.
  • Proven experience in C# and .Net Core
  • Experience with API Design and REST principles
  • Experience with agile software development methodology or willingness to learn and adopt it.
  • Experience with CI/CD (continuous integration and continuous delivery) build pipelines and Jenkins.
  • Experience with testing frameworks (like Chaos, API contract, and performance testing) and the critical relationship these have with a stable CI/CD pipeline
  What differentiates you as the best:

  • Experience with the latest software design, testing, and software delivery tools
  • Experience with container-based software delivery and related management and monitoring applications.
  • Experience working in a Software as a Service (SaaS) organization
  • Experience with RabbitMQ or similar MQ services
  • Excitement to work not just on .NET applications but with Python as well
  • Work well in a remote distributed environment
  • Passion for true Continuous Integration and Continuous Delivery. We strive to deliver quality fast
  • Dependable - you own what you develop; our scrum teams own their product and delivery
  • Ultimately we want someone who is creative, flexible and has a passion for delivering cutting edge technology
  • Sponsorship currently not offered

    NICE Systems is an Equal Opportunity/Affirmative Action Employer, M/F/D/V.


    Software Engineer

    1 day ago
    Remote · NICE

    The rubber hits the road when someone picks up the phone to talk to a company. It happens over 250 million times a day. With over 60 percent of people dumping brands because of bad customer service, that phone call can make or break a business.

    Enter NICE Mattersight – the only company that uses personality analysis and big data analytics to improve every customer call, increase customer satisfaction and lower costs for Fortune 500 enterprises. Our patented technology mines a private database of over 1 billion customer service calls to quickly pair customers with call center agents with whom they will naturally and effortlessly click. Awkward and annoying becomes satisfying and enjoyable. Frustrated becomes enthusiastic.

    NICE Mattersight helps companies make positive conversations with their customers the rule, not the exception, by fostering emotional connections that turn complainers into fans.

    Who you are:
    NICE Mattersight is seeking a Software Developer to join our Routing Analytics Team. The ideal candidate is someone who is excited about working in a data-driven environment on applications built to mine, ship, store, analyze, and utilize big data.

    We are looking for someone who is passionate about software development, has a strong desire to share knowledge with others, wants to keep abreast of industry best practices, and can work remotely as a member of a distributed (US-based) team. Candidates should know or be willing to learn multiple languages and environments ranging from Kubernetes, AWS Serverless, VMware, .NET Core, Python, and Hadoop, and candidates should be passionate about continuous integration and continuous delivery with an eye towards weekly deployments. We would love to hire someone who wants to become an expert in .NET and related technologies, while being able to contribute meaningfully on our Python applications as needed. This is a great opportunity for an experienced C# developer to get experience in big data! 


    What you’ll do:

    For this position the successful candidate will perform design, development, testing, and sustaining engineering tasks within the Microsoft .NET Core framework using C#, as well as in Python and Spark scripts within a Hadoop cluster. The candidate will also work with Kubernetes, Jenkins, Helm, and related technologies in their day-to-day work. This position requires proficiency with unit testing and the related tools (XUnit, Postman, Newman, etc.).

  • Write and maintain Microsoft .NET and .NET Core software running in Kubernetes.
  • Write and maintain Python apps and Spark scripts running in Hadoop.
  • Create and maintain Jenkins-based software build pipelines.
  • Deploy software to our Container infrastructure leveraging Docker, Kubernetes, Helm, AWS, and other cloud-based technologies.
  • Deploy software to our Hadoop cluster leveraging Ansible
  • Consider emerging technologies/innovations when helping to design software and architecture.
  • Participate as a contributor in an Agile Scrum team
  • Participate in a rotating on-call schedule to aid our operations team when needed.
  What you bring to the table:

  • 3+ years with solid foundation in OOP, Distributed Microservice Architecture and/or SOLID principles.
  • Proven experience in C# and .Net Core
  • Experience with API Design and REST principles
  • Experience with agile software development methodology or willingness to learn and adopt it.
  • Experience with CI/CD (continuous integration and continuous delivery) build pipelines and Jenkins.
  • Experience with testing frameworks (like Chaos, API contract, and performance testing) and the critical relationship these have with a stable CI/CD pipeline
  What differentiates you as the best:

  • Experience with the latest software design, testing, and software delivery tools
  • Experience with container-based software delivery and related management and monitoring applications.
  • Experience working in a Software as a Service (SaaS) organization
  • Experience with RabbitMQ or similar MQ services
  • Excitement to work not just on .NET applications but with Python as well
  • Work well in a remote distributed environment
  • Passion for true Continuous Integration and Continuous Delivery. We strive to deliver quality fast
  • Dependable - you own what you develop; our scrum teams own their product and delivery
  • Ultimately we want someone who is creative, flexible and has a passion for delivering cutting edge technology
  • Sponsorship currently not offered

    NICE Systems is an Equal Opportunity/Affirmative Action Employer, M/F/D/V.


    DevOps Engineer (Partial Remote - Git, AWS, SA, Docker, Ansible, Kibana)

    9 days ago
    Remote · GliaCell Technologies

    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud/Big Data (Batch and Streaming Analytics), CNO/CND/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages.  For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a DevOps Engineer for a role on one of our subcontracts.  This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract. This position allows for remote work up to 50%.

    Location: Annapolis Junction, MD

    Qualifications:

    • 20+ years of experience and a BS Degree in Computer Science (or related)

    Desired:

    • Sourcing for a flexible, motivated candidate with experience in GitLab (highly desirable), AWS (highly desirable), some SA skill sets (IAVA, STE), PostgreSQL, Pratetor, Ruby, Git and GitLab-CI
    • Applications: Elasticsearch with Machine Learning, GitLab (eVo and Cyber Forge soon, 3 GitLabs in "Production"), Mattermost, custom authentication code, NGINX, Prometheus, Grafana, Kibana, pwick, Terraform, Ansible and Docker
    • Architecture: Multiple Auto Scale groups, S3 curation, Snapshots across VPC, RDS, Elasticache, Network Load Balancers, Custom AMI, AWS

     

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

     

    To Apply for this Position: Respond to this job posting and attach an updated resume.

     

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR