28 remote Big Data contracts

Big Data Engineer - Inside IR35

16 days ago
£300 - £335/day · Remote · Inside IR35 · Spring Technology
Big Data Engineer
6 months (Inside IR35)
Ipswich (remote working and onsite)
£300 - £335 per day

The role:
Performing a multi-skilled role, you'll have experience building, deploying and managing data ingestion pipelines using Cloudera (HDFS, Spark, Kafka, Hive, Impala, HBase) and Elasticsearch technologies.

Skills required:
- Experience with Puppet, Jenkins and with ETL tools such as NiFi
- Knowledge of version control (e.g. GitLab) and Linux essential
- Cloudera - HDFS, Spark, Kafka, Hive, Impala, HBase
- Elasticsearch/Kibana
- Pipeline release automation using Puppet, Jenkins, GitLab or similar
- Experience working in a Linux environment (including system administration fundamentals)

Spring Technology acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. The Spring Group UK is an Equal Opportunities Employer.

By applying for this role your details will be submitted to Spring. Our Candidate Privacy Information Statement explaining how we will use your information is available on our website.

Big Data Engineer (Java or Scala) – REMOTE WORKING

1 month ago
£500 - £610/day (wellpaid.io estimate) · Remote · Staff Worx
Big Data Engineer (Java Scala Azure Spark). New digital disruptor looking for experienced Java Big Data Engineers for greenfield programme.

Hadoop/Big Data Developer (W2, OPT and CPT's can apply) Remote...

21 days ago
$44 - $68/hour · Remote · Ford Motor Company
Job details
Salary: $44 - $68 an hour
Job Type: Full-time, Contract
Number of hires for this role: 10+

Full Job Description

Description:

As a Systems Engineer on the Mach1 Ops team, you will be at the forefront of delivering truly valuable machine learning insights and analytics solutions. We are looking for someone with a passion for supporting and enabling our customers (Data Scientists, Data Engineers and Software Developers) to create solutions: helping them find ways to reduce the toil of data science development, evolving the automation that maintains our systems, and building new and more effective monitoring capabilities. In this role, you will work closely with multiple organizations (Business Customers, Product Owners, Global Data and Analytics, and IT) to convert business goals into tangible solutions, prioritizing them through Agile user stories on our product backlog.

Skills Required:

  • Strong communication and customer relationship skills
  • Strong skills in Python, Kubernetes, Hadoop, Docker and Linux
  • Passion for teaching and enabling Data Science professionals to understand and apply your technical knowledge
  • Strong organizational skills and experience managing activities across multiple diverse teams
  • Knowledge of Predictive Analytics and/or Machine Learning concepts and applying them to Manufacturing, Finance, Quality Control, and Marketing and Sales situations
  • Knowledge of monitoring tools and techniques (Dynatrace, ELK Stack, developing synthetic monitoring)

Skills Preferred:


Education Required:

Bachelor's degree or related experience

Job Types: Full-time, Contract

Pay: $44.00 - $68.00 per hour

Schedule:

  • 8-hour shift

Contract Length:

  • More than 1 year

Work Location:

  • One location

Work Remotely:

  • No

COVID-19 Precaution(s):

  • Remote interview process

Speak with the employer
+91 248-239-8004

Hiring: Data Engineer: RI/Remote

22 days ago
Remote · FlairTech Solutions
Experience performing data analysis and data exploration. Experience building automated data pipelines. Experience with big data frameworks (i.e. Hadoop and…

Software Engineer - .Net

8 days ago
Remote · Brooksource

Software Engineer (.NET – Azure)

Contract (2 years+)

Fully Remote

 

As a Software Engineer, you will build and enhance an exciting new customer loyalty platform to help drive additional consumer sales, customer stickiness, and loyalty within their stores. This mobile application will leverage cutting-edge technology to improve the processing speed of Big Data, improving the overall customer experience of using the mobile application. Technologies and processes required to build the solution include: the .NET framework, C#, SQL, use of APIM through Azure, conceptualization of Big Data architecture, deployment through a DevOps methodology, and code builds through Agile sprints. If you are interested in working in a booming industry with long-term investment in building modern solutions to improve the customer journey, please apply today!

 

Requirements:

  • Experience programming within a .NET environment, C#, APIM tools, SQL Server (2012+)
  • Experience building and deploying applications in a cloud environment (Azure)
  • Experience working with JavaScript, CSS, HTML, and other UI languages and tools
  • Experience delivering applications in an Agile environment
  • Experience working in the digital space is a plus


Responsibilities

  • Will manage, build, deploy, and unit test a modern dynamic mobile application that will be put into production to be used in business store environment by consumers
  • Expected to work hand in hand with the BA’s on project to help architect and define technical requirements
  • Will work in collaboration with other vendor partners to help build the solution
  • Expected to publicly communicate and guide strategy of build to leadership, peers, and relevant stakeholders in formal/informal settings
  • Expected to work with core project team to develop a defined POC by year end 2020 to move forward with full scale production build
  • Will have to develop and foster relationships with other teams to help promote milestone achievement upstream/downstream of the build



Brooksource provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws

Senior Software Developer, Cloud & Linux experience, full time position

17 hours ago
Remote · Dovel Technologies

We are searching for a Senior Software Developer with Linux and Cloud experience to support the National Center for Biotechnology Information (NCBI), part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH) in North Bethesda, MD. NCBI is the world's premier biomedical center, hosting over six million daily users who seek research, clinical, genetic, and other information that directly impacts biomedical research and public health – at NCBI you can literally help to accelerate cures for diseases! NCBI's wide range of applications, platforms (Node, Python, Django, C++, you name it) and environments (big data [petabytes], machine learning, multiple clouds) serve more users than almost any other US Government Agency according to https://analytics.usa.gov/.

 

This is a full-time position with full benefits. You will start out working remotely until we beat Covid. We are looking for IT professionals who have a passion for what they do and want to work for an organization that gives back to the community. A position at the National Institutes of Health (NIH) gives you an opportunity to contribute to the betterment of America's public health system. Be on the cutting edge as discoveries are made to improve your country's health!


The Senior Software Developer with Linux and Cloud platform experience will work on solutions to support continued development of NCBI's dbGaP database. This database is the world's premier archive of assembled and annotated sequence data and is part of an international collaboration that includes archives in Europe and Japan. SeqDB makes biological sequence data submitted by the scientific community available to researchers worldwide, enhancing reproducibility and allowing for new discoveries. SeqDB is a large resource, comprising over 1.6 billion records and 6 trillion DNA basepairs, and handles requests at rates of up to 50,000/second. Future development of SeqDB will involve re-architecting the backend sequence databases, including exploration of Cloud-based strategies for sequence access. NCBI (the National Center for Biotechnology Information) is part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH). NCBI serves over 4 million daily users in search of clinical, genetic and other information that directly impacts biomedical research and public health, and is among the world's top 3 most-visited sites in the science category.

 

The Senior Software Developer with Linux and Cloud experience will be responsible for the development, implementation, testing, and continued maintenance of NCBI's dbGaP database and bioinformatics pipelines:

  • Work with a diverse group of Scientists, Bioinformaticians and other Developers across the center to implement efficient bioinformatics algorithms and solutions for data storage and delivery
  • Facilitate development of cloud-ready tools and pipelines to support sustainable scalability and enable use by collaborators and scientists in the field
  • Learn new technologies, keep up with the internal systems, and share your knowledge with the team
  • Provide expertise to assist other developers in design and development of new solutions

Qualifications:

  • 5+ years of experience handling large amounts of data or working in a distributed computing environment
  • Programming experience in a Linux environment
  • Experience with Cloud technologies:
    • AWS: EC2, S3, Lambda
    • GCP: GKE, Google Store, Cloud functions
  • Proficiency in at least one of the following programming languages: Java, C, C++
  • Fluency in scripting languages such as Bash, csh, Perl, or Python
  • Ability to work with common structured documents (at least one of XML, JSON)
  • Experience with CI/CD pipelines, unit tests, integration and regression testing
  • Solid communication skills both oral and written
  • Please be aware that samples of code will be asked for

Desired qualifications:

  • Experience with working with genetic and biological data
  • Experience with relational databases
  • Experience with NoSQL databases such as Cassandra
  • Experience with open source projects and involvement in open source communities such as GitHub, etc.
  • Experience managing production workflow of an online public database
  • Experience with RESTful API design
  • Adept at Agile techniques and practices




We are a trusted government partner that blends deep domain expertise with advanced technologies to help our customers solve complex problems that improve, protect, and save lives. As a rapidly growing company, we combine entrepreneurial spirit, customer focus, and an outcomes-based approach to support agency missions in health IT, life sciences, public safety, and grants management.

 

The Dovel Family of Companies offers employees an opportunity to advance beyond a specific role or contract, we offer a path to develop an enriching career. We believe in empowering a culture of innovation, customer success, and employee growth. 

 

What you’ll get…

  • Time Off! Flexible schedules and company paid holidays allow you to take the time you need.
  • Investment in YOU! 401(K) company contributions are yours to keep with no waiting period.
  • Choices! Unique healthcare plans to choose from with options like fertility and orthodontia benefits.
  • Discovery! With our tuition assistance and training programs, we support your career advancement.
  • Tax Savings! Enroll in pre-tax Health or Dependent Care Flexible Spending, HSA with company contributions, parking, and/or transit commuter benefits.
  • Support! Working parents and busy professionals – we’ve got you covered with a supportive culture, confidential Employee Assistance Program, and membership to Care.com.
  • Perks! Employee discounts, peer recognition programs, company-wide wellness challenges, and fun community events.
  • A Voice! A unique culture where you can influence decisions and have your voice heard.

 

We are an Equal Opportunity Employer with a commitment to diversity. All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, gender identity, disability, or veteran status.

CLEARED Systems Engineer 1 - UE/UX (50% TELEWORK!)

15 days ago
Remote · EITR Technologies, LLC
Systems Engineer 1 - UE/UX (50% TELEWORK!)
 
EITR Technologies, LLC provides exceptional engineering and consulting services across industries. Our team has a wealth of experience cultivating an outstanding reputation in the Intelligence Community, Commercial Sector, and Open Source Community. We boast a large breadth of expertise in cloud technologies, automation, big data, analytics, networking, and security. These skills combine to provide comprehensive solutions to our client's most complex problems.
 
Location: Annapolis Junction, MD
 
Position Type: Full-Time
 
Minimum Clearance: TS/SCI with full-scope polygraph

This position allows for remote work up to 50% (mission dependent)!

This role supports a set of complementary, internally developed software products that optimize common analyst workflows related to target management and their identifiers. This TTO is seeking a systems engineer with solid UE/UX design skills and experience who is also familiar with analyst-facing tools and the intelligence production chain.

Seeking individuals that have:
  • Experience working with mission customers to collect requirements and representing them as visual mockups.
  • Experience working with design tools such as Axure, DX, Balsamiq, Adobe Illustrator or similar tools to create mockups or wireframes
  • Experience with Agile development and tools that support development processes: Confluence, JIRA, Teams or other related tools.
  • Experience working with developers to implement designs and turning mockups into product capabilities for users
  • Experience supporting a production application and collecting feedback from users on production capabilities post deployment

Description:
  • Analyzes user's requirements, the concept of operations documents, and high-level system architectures to develop system requirements specifications.
  • Analyzes system requirements and leads design and development activities.
  • Guides users in formulating requirements, advises on alternative approaches, and conducts feasibility studies.
  • Provides technical leadership for the integration of requirements, design, and technology.
  • Incorporates new plans, designs, and systems into ongoing operations.
  • Develops technical documentation.
  • Develops system architecture and system design documentation.
  • Guides system development and implementation planning through assessment or preparation of system engineering management plans and system integration and test plans.
  • Interacts with the Government regarding Systems Engineering technical considerations and for associated problems
  • Ultimately responsible for the technical integrity of work performed and deliverables associated with the Systems Engineering area of responsibility.
  • Communicates with other program personnel, government overseers, and senior executives.

 
 
Position Functions:
  • Contribute to the development of sections of systems engineering documentation such as System Engineering Plans, Initial Capabilities Documents, Requirements Specifications, and Interface Control Documents
  • Manage system requirements and derived requirements to ensure the delivery of production systems that are compatible with the defined system architecture(s) - Department of Defense Architecture Framework (DoDAF), Service-oriented Architecture (SOA), etc
  • Assist with the development of system requirements, functional requirements, and allocation of the same to individual hardware, software, facility, and personnel components
  • Coordinate the resolution of action items from Configuration Control Board (CCB) meetings, design reviews, program reviews, and test reviews that require cross-discipline coordination
  • Participate in an Integrated Product Team to design new capabilities based upon the evaluation of all necessary development and operational considerations
  • Participate in the development of system engineering documentation, such as System Engineering Plans, Initial Capabilities Documents, Requirements Specifications, and Interface Control Documents
  • Participate in interface definition, design, and changes to the configuration between affected groups and individuals throughout the life cycle
  • Allocate real-time process budgets and error budgets to systems and subsystem components
  • Derive from the system requirements an understanding of stakeholder needs, functions that may be logically inferred and implied as essential to system effectiveness
  • Derive lower-level requirements from higher-level allocated requirements that describe in detail the functions that a system component must fulfill, and ensure these requirements are complete, correct, unique, unambiguous, realizable, and verifiable
  • Generate alternative system concepts, physical architectures, and design solutions
  • Participate in establishing and gaining approval of the definition of a system or component under development (requirements, designs, interfaces, test procedures, etc.) that provides a common reference point for hardware and software developers
  • Define the methods, processes, and evaluation criteria by which the systems, subsystems and work products are verified against their requirements in a written plan
  • Develop system design solution that satisfies the system requirements and fulfills the functional analysis
  • Develop derived requirements for Information Assurance Services (Confidentiality, Integrity, Nonrepudiation, and Availability); Basic Information Assurance Mechanisms (e.g., Identification, Authentication, Access Control, Accountability); and Security Mechanism Technology (Passwords, cryptography, discretionary access control, mandatory access control, hashing, key management, etc.)
  • Review and provide input to program and contract work breakdown structure (WBS), work packages, and the integrated master plan (IMP)
     
 
 
Qualifications: 
Seven (7) years' experience as an SE in programs and contracts of similar scope, type, and complexity is required. Bachelor's degree in System Engineering, Computer Science, Information Systems, Engineering Science, Engineering Management, or related discipline from an accredited college or university is required. Five (5) years of additional SE experience may be substituted for a bachelor's degree.
 


Benefits:
  • Premium medical benefits program through Carefirst BlueChoice as well as 100% company payment of medical deductible through HSA contribution
  • Dental, Vision, Short Term Disability, and Long Term Disability, as well as Life Insurance benefits paid in full by company
  • Extensive Paid Time Off (PTO) program consisting of 30 days (6 weeks). This covers sick, vacation, and holiday pay
  • 6% 401(k) contribution with immediate vesting
  • Continuing to add perks as the company grows

We are proud to be an Affirmative Action/Equal Opportunity Employer. EITR provides equal employment opportunity for all persons, in all facets of employment.

Data Analyst (SAS & Python) - Charlotte, NC - 100% Remote

18 days ago
Remote · eTeam


Job Title: Application Development - Data Analyst (SAS & Python)
Location: Charlotte, NC
Duration: 3 Months Contract Role
General Information
Job Description: 100% remote; Local candidates only; resource will pick up equipment onsite
Shift Start/End: 9am - 5pm ET
US Bank is looking for a successful Data Quality & Testing Analyst to support our Enterprise Data Governance (EDG) initiatives focused on our team's capabilities around data quality, data analytics and end-to-end and user acceptance testing. This role will collaborate with the business line and technology groups.
Primary responsibilities include:
• Effectively work in a matrixed team of Data Quality Analysts/Data Scientists in support of the development, end-to-end testing and delivery of the EDG's capabilities to address a specific business and/or regulatory need.
• Research and build a detailed understanding of the problem and related data assets in order to code the data processing and analysis.
• Light development coding
• Ensure that the data used follows the compliance, access management and control policies of the company while meeting the data governance requirements of the business.
• Work with technical groups to support the collection, integration and retention of the data sources.
• Apply data visualization and summarization techniques to the analytical results.
• Interpret and communicate the results in a manner that is understood by the business.
Basic Qualifications:
• Bachelor's or Master's Degree in Mathematics, Engineering, Computer Science or equivalent work experience.
• 5+ years related technical experience and 3+ years analytical experience in the Banking/Financial industry or similar highly regulated industry.
• SQL, end-to-end testing and Python experience
Preferred Skills/Experience
• Proficiency with SQL, SAS and Python with focus on data analysis packages like pandas
• Experience participating in end-to-end and User acceptance testing and/ or Post Production validation of the data
• Knowledge of Spark and relational databases
• Experience with CI/CD tools like Jenkins
• Knowledge of public cloud platforms.
• Intermediate knowledge of application coding and development lifecycle.
• Analytical (e.g. comfortable with data), able to combine an understanding of business objectives, customer needs, and the data required to deliver customer experience and business results.
• Intermediate knowledge of data governance, data management, data architecture, data modelling concepts and data governance tooling and the complexities of data in a large financial and/or highly regulated institution.
• Willingness to continuously develop and acquire new technical skills, learn new tools & programming languages and Big Data techniques.
• Proven ability to adjust quickly to shifting priorities, multiple demands, ambiguity and rapid change.
• Effective oral and written communication, with the ability to analyze, slice, and dice data while deriving significant insights.
• Natural curiosity and self-directed ability to seek out information and meet goals/deadlines.
• Agile experience: Very comfortable following Agile Scrum methodology.
Analysis and ongoing support of internally developed processes to support business initiatives. Automation of processes using the existing tool set, including the Microsoft BI Stack (SSRS, SSIS, SSAS, Excel Services, PerformancePoint, and PowerPivot). Analysis and query writing to assist in supporting key business initiatives.
COMMUNICATION SKILLS:
This position requires good oral and written communication skills to interface with internal business clients - many having minimal technical knowledge. Ability to translate business requests into technical requirements.
TECHNICAL SKILLS:
Ability to write Transact-SQL code to extract the correct data to answer business questions. SQL development experience. Extensive experience creating reports using Microsoft SQL Server Reporting Services.
ADDITIONAL BENEFICIAL EXPERIENCE:
Use and construction of Microsoft SQL Server Analysis Services; SharePoint (using the BI Stack). Strong attention to detail for validation of all queries constructed. Ability to visualize data in a form that effectively answers the business question.
Preference for candidates with SQL, Python & testing experience located in Charlotte, NC. Resource will pick up equipment onsite.

Java Developer - Medical Device Firm - (Full Remote)

1 month ago
Remote · Real Staffing

Real Staffing is currently working on 2 Java Developer roles for a leading Medical Device firm based in Pittsburgh, PA. The role is fully remote, working Eastern Standard Time hours.

They are looking for experienced Java Developers to work on a next-generation data analytics platform for their advanced sleep monitoring devices. The role is project-critical for the firm and the team is looking to fill the opening by the end of February.

Essential Duties and Responsibilities:

  • Minimum 3-5 years core Java development experience
  • Experience writing enterprise Java applications using Microservices/Service-Oriented Architecture and REST
  • Experience with Spring and Spring Boot
  • Hands-on experience with Big Data management and analytics tools such as Elasticsearch & Kibana
  • Experience in database development: SQL, NoSQL, MySQL
  • Worked in an Agile/Scrum development process environment
  • Development experience writing code to create single-threaded, multi-threaded or user interface event-driven applications, either stand-alone or those which access servers or services, is preferred.

    These roles are open to contract-to-hire or long-term contracting, paying market-leading rates (depending on experience).

    If you are interested, please apply with your latest resume and our team will give you a call!

    Thanks,

    Sthree US is acting as an Employment Business in relation to this vacancy.

    Senior Solution Architect - AWS - AI - Remote - Outside IR35

    2 days ago
    £460 - £570/day (wellpaid.io estimate) · Remote · Outside IR35 · ApplyGateway

    This is a fantastic opportunity to work as a Senior Solutions Architect on a remote contract, outside IR35 and working on a high profile project.

    The key skills required for this Senior Solutions Architect position are:

    • AI and MLOps
    • High performance computing
    • Big data management and analysis
    • AWS cloud experience.
    • Imaging informatics (desirable)

    If you do have the relevant experience for this Senior Solution Architect role, please do apply.


    Role: Senior Solution Architect - AWS - AI - Remote - Outside IR35
    Job Type: Contract
    Location: England,

    Apply for this job now.

    Contract AWS Machine Learning Engineer - Remote

    14 days ago
    Remote · ClearScale
    Clients come to us for our deep experience with Big Data, Containerization, Serverless Infrastructure, Microservices, IoT, Machine Learning, DevOps and more.

    Scala Developer. (Functional) 6 months Contract. Remote. £600 a day Inside IR35

    8 hours ago
    £600/day · Remote · Inside IR35 · ApplyGateway

    2 Contract Scala Developers wanted. £600 a day (Inside). 6 months rolling. Remote

    We're looking for 2 Contract Scala Developers for a leading media client in West London paying £600 a day (Inside) on a rolling 6-month contract. Our client is expanding a team of Scala developers working on personalisation and API projects. If interested, please see more details below:

    Location: Fully remote but when things return to normal it will be a couple of days in the office a week (West London) but some flex on this.

    Project: Customisation/APIs

    Experience required:

    • 4+ years of experience developing in Scala (functional - not big data development)
    • Strong TDD/BDD and unit testing skills
    • Experience building highly scalable systems and/or exposure to high TPS (transactions per second)
    • Familiarity with continuous integration and delivery practices

    Desirable:

    • Experience with Play or Akka
    • Docker
    • Gatling

    Interview process: 1 stage 1.5 hour Technical Q&A (no pairing exercise or take home test!)

    If interested, please get in touch with your CV

    Thanks

    Julian

    Xpertise Recruitment


    Role: Scala Developer. (Functional) 6 months Contract. Remote. £600 a day Inside IR35
    Job Type: Contract
    Location: London,

    Apply for this job now.

    DevOps Engineer

    7 days ago
    Remote · Apex Life Sciences

    Apex Systems, the 2nd largest IT Staffing firm in the nation, is seeking an experienced DevOps Engineer to join our client's team. This is a W2 contract position slated for 6 months with possibility for extension/conversion and is FULLY REMOTE (PST hours).

    **Must be comfortable sitting on Apex System's W2**

    If you are interested send all qualified resumes to Nathan Castillo (Professional Recruiter with Apex Systems) at Ncastillo@apexsystems.com! 

    Job Description:

    We are on a mission to connect every member of the global workforce with economic opportunity, and that starts right here. Talent is our number one priority, and we make sure to apply that philosophy both to our customers and to our own employees as well. Explore cutting-edge technology and flex your creativity. Work and learn from the best. Push your skills higher. Tackle big problems. Innovate. Create. Write code that makes a difference in professionals’ lives.

    Gobblin is a distributed data integration framework that was born at client and was later released as an open-source project under the Apache Foundation. Gobblin is a critical component in client's data ecosystem, and is the main bridge between the different data platforms, allowing efficient data movement between our AI, analytics, and member-facing services. Gobblin utilizes and integrates with the latest open source big data technologies, including Hadoop, Spark, Presto, Iceberg, Pinot, ORC, Avro, and Kubernetes. Gobblin is a key piece in client's data lake, operating at a massive scale of hundreds of petabytes.

    Our latest work involves integrations with cutting-edge technologies such as Apache Iceberg to allow near-real-time ingestion of data from various sources onto our persistent datasets, which allow complex and highly scalable query processing for various business logic applications, serving machine-learning and data-science engineers. Furthermore, we play an instrumental role in client's transformation from on-prem oriented deployment to Azure cloud-based environments. This transformation prompted a massive modernization and rebuilding effort for Gobblin, transforming it from a managed set of Hadoop batch jobs to an agile, auto-scalable, real-time streaming-oriented PaaS, with user-friendly self-management capabilities that will boost productivity across our customers. This is an exciting opportunity to take part in shaping the next generation of the platform.

    What is the Job

    You will be working closely with development and site reliability teams to better understand their challenges in aspects like:

    Increasing development velocity of data management pipelines by automating testing and deployment processes,

    Improving the quality of data management software without compromising agility.

    You will create and maintain fully-automated CI/CD processes across multiple environments and make them reproducible, measurable, and controllable for data pipelines that deal with PBs every day. With your abundant skills as a DevOps engineer, you will also be able to influence the broad teams and cultivate DevOps culture across the organization.

    Why it matters

    CI/CD for big data management pipelines have been a traditional challenge for the industry. This is becoming more critical as we evolve our tech stack into the cloud age (Azure). With infrastructure shifts and data lake features being developed/deployed at an ever fast pace, our integration and deployment processes must evolve to ensure the highest-quality and fulfill customer commitments. The reliability of our software greatly influences the analytical workload and decision-making processes across many company-wide business units, the velocity of our delivery plays a critical role to transform the process of mining insights from massive-scale Data Lake into an easier and more efficient developer productivity paradigm.

    What You’ll Be Doing

  • Work collaboratively in an agile, CI/CD environment
  • Analyze, document, and implement and maintain CI/CD pipelines/workflows in cooperation with the data lake development and SRE teams
  • Build, improve, and maintain CI/CD tooling for data management pipelines
  • Identify areas for improvement for the development processes in data management teams
  • Evangelize CI/CD best practices and principles
  • Technical Skills

  • Experienced in building and maintaining successful CI/CD pipelines
  • Self-driven and independent
  • Has experience with Java, Scala, Python or other programming language
  • Great communication skills
  • Master of automation
  • Years of Experience

  • 5+
  • Preferred Skills

  • Proficient in Java/Scala
  • Proficient in Python
  • Experienced in working with:
  • Big Data environments: Hadoop, Kafka, Hive, Yarn, HDFS, K8S
  • ETL pipelines and distributed systems
  • DevOps Engineer

    8 days ago
    RemoteApex Systems

    Apex Systems, the 2nd largest IT staffing firm in the nation, is seeking an experienced DevOps Engineer to join our client’s team. This W2 contract position is slated for 6 months with the possibility of extension/conversion and is FULLY REMOTE (PST hours).

    **Must be comfortable sitting on Apex System's W2**

    Job Description:

    We are on a mission to connect every member of the global workforce with economic opportunity, and that starts right here. Talent is our number one priority, and we make sure to apply that philosophy both to our customers and to our own employees as well. Explore cutting-edge technology and flex your creativity. Work and learn from the best. Push your skills higher. Tackle big problems. Innovate. Create. Write code that makes a difference in professionals’ lives.

    Gobblin is a distributed data integration framework that was born at client and was later released as an open-source project under the Apache foundation. Gobblin is a critical component in client's data ecosystem, and is the main bridge between the different data platforms, allowing efficient data movement between our AI, analytics, and member-facing services. Gobblin utilizes and integrates with the latest open source big data technologies, including Hadoop, Spark, Presto, Iceberg, Pinot, ORC, Avro, and Kubernetes. Gobblin is a key piece in client's data lake, operating at a massive scale of hundreds of petabytes.

    Our latest work involves integrations with cutting-edge technologies such as Apache Iceberg to allow near-real-time ingestion of data from various sources into our persistent datasets, enabling complex and highly scalable query processing for various business-logic applications serving machine-learning and data-science engineers. Furthermore, we play an instrumental role in the client's transformation from on-prem deployments to Azure cloud-based environments. This transformation prompted a massive modernization and rebuilding effort for Gobblin, transforming it from a managed set of Hadoop batch jobs into an agile, auto-scalable, real-time streaming-oriented PaaS with user-friendly self-management capabilities that will boost productivity across our customers. This is an exciting opportunity to take part in shaping the next generation of the platform.

    What is the Job

    You will work closely with development and site reliability teams to better understand their challenges, such as:

    Increasing development velocity of data management pipelines by automating testing and deployment processes,

    Improving the quality of data management software without compromising agility.

    You will create and maintain fully automated CI/CD processes across multiple environments and make them reproducible, measurable, and controllable for data pipelines that deal with petabytes every day. With your skills as a DevOps engineer, you will also influence the broader teams and cultivate a DevOps culture across the organization.
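    Automated testing of data pipelines of the kind described above often starts with a lightweight quality gate that a CI job runs before a release is promoted. A minimal sketch, assuming a batch of dict-shaped records; the field names and the `check_batch` helper are illustrative, not part of any specific product:

```python
# Hypothetical CI quality gate for a data pipeline batch.
# REQUIRED_FIELDS and the record shape are illustrative assumptions.
REQUIRED_FIELDS = {"id", "timestamp", "payload"}

def check_batch(records):
    """Return a list of human-readable problems found in a batch of records."""
    problems = []
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            problems.append(f"record {i}: missing fields {sorted(missing)}")
        if "timestamp" in rec and not isinstance(rec["timestamp"], (int, float)):
            problems.append(f"record {i}: non-numeric timestamp")
    return problems

batch = [
    {"id": 1, "timestamp": 1700000000, "payload": "ok"},
    {"id": 2, "payload": "missing timestamp"},
]
issues = check_batch(batch)  # a CI job would fail the build if this is non-empty
```

    In a real pipeline the same check would run against a sampled slice of production-like data, with the CI job exiting non-zero when `issues` is non-empty.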

    Why it matters

    CI/CD for big data management pipelines has been a long-standing challenge for the industry, and it is becoming more critical as we evolve our tech stack into the cloud age (Azure). With infrastructure shifts and data lake features being developed and deployed at an ever-faster pace, our integration and deployment processes must evolve to ensure the highest quality and fulfill customer commitments. The reliability of our software greatly influences the analytical workloads and decision-making processes across many company-wide business units, and the velocity of our delivery plays a critical role in transforming the mining of insights from a massive-scale data lake into an easier and more efficient developer-productivity paradigm.

    What You’ll Be Doing

  • Work collaboratively in an agile, CI/CD environment
  • Analyze, document, implement, and maintain CI/CD pipelines/workflows in cooperation with the data lake development and SRE teams
  • Build, improve, and maintain CI/CD tooling for data management pipelines
  • Identify areas for improvement for the development processes in data management teams
  • Evangelize CI/CD best practices and principles
  • Technical Skills

  • Experienced in building and maintaining successful CI/CD pipelines
  • Self-driven and independent
  • Experience with Java, Scala, Python, or another programming language
  • Great communication skills
  • Master of automation
  • Years of Experience

  • 5+
  • Preferred Skills

  • Proficient in Java/Scala
  • Proficient in Python
  • Experienced in working with:
  • Big Data environments: Hadoop, Kafka, Hive, Yarn, HDFS, K8S
  • ETL pipelines and distributed systems
  • Full Stack Software Engineer - 179891

    8 days ago
    $52/hourRemoteHKA Enterprises

    PLEASE NOTE: THIS IS A W2 CONTRACT; NO 3RD PARTY RECRUITER SUBMITTALS PLEASE. THANK YOU IN ADVANCE FOR HONORING THE REQUIREMENTS.

    Job Order: 179891

    Functional Title: Full Stack Software Engineer - 179891

    Client Title: Full Stack Software Engineer

    Location: Greenville, SC, and Remote (See information below)

    PAY: $42.00 to $52.00 PER HOUR

    1st Shift: 40 hours per week-8 AM until 5 PM, Monday through Friday

    Contract: Long term

    Interview Type: Onsite or Video

    Hiring Bonus: $2000.00

    The Hiring Bonus is broken down as follows:

    You will receive $500 on the 1st paycheck

    $500 at 60 days

    $500 at 90 days

    $500 at 180 days.

    Interview Type: Video or In-Person (No travel expense offered by the client for an in-person interview. No relocation offered)

    90% REMOTE WORK. THE 1ST 2 TO 3 WEEKS WILL BE ONSITE DURING TRAINING. AFTER TRAINING, YOU WOULD BE REQUIRED TO REPORT TO THE SITE 1 DAY EVERY 2 WEEKS. IF THERE IS AN EMERGENCY, YOU WOULD NEED TO BE ABLE TO ARRIVE AT THE SITE WITHIN 2 HOURS.

    The position is an ongoing W2 contract. There is no end date projected.

    The position would be working for HKA at the client site Greenville, SC facility.

    HKA offers employee paid medical and 401k on the 1st of the month after your 1st day.

    40 hours of prorated vacation after six months.

    Six paid holidays after 30 days.

    FULL STACK SOFTWARE ENGINEER

    Required Skills:

    Frontend Technology: 1 of the 3 Frontend Technologies is required

    React=Front-end library that runs on a browser

    Angular=Angular is a platform and framework for building single-page client applications using HTML and TypeScript. Angular is written in TypeScript. It implements core and optional functionality as a set of TypeScript libraries that you import into your apps.

    Vue=Vue.js is a robust but simple JavaScript framework. It has one of the lowest barriers to entry of any modern framework while providing all the features required for high-performance web applications, covering both a frontend web app and a backend REST API server.

    Script Experience: 1 of the 4 scripting experiences is required.

    Node JS=Node.js is a server-side platform built on Google Chrome's V8 JavaScript engine.

    TypeScript=Used to develop JavaScript applications for both client-side and server-side execution, as with Node.js or Deno.

    Java=Java is used as the server-side language for most back-end development projects, including those involving big data.

    .Net=Software development framework and ecosystem designed and supported by Microsoft to allow for easy desktop and web application engineering. It's a popular free platform currently used for many different types of applications as it provides the programming environment for most software development phases.

    Backend Development: 1 of the 4 Backend technology experiences is required.

    REST APIs=A RESTful API is an architectural style for an application program interface (API) that uses HTTP requests to access and use data. That data can be created, read, updated, and deleted via the POST, GET, PUT, and DELETE request types, which correspond to those operations on resources.

    Event-Driven Services= With an event-driven system, the capture, communication, processing, and persistence of events is the solution's core structure. 

    Scheduled Services=Scheduled services are activities that are executed periodically. The schedule for each activity is set using the Task Scheduler within the Windows operating system.

    Serverless Functions=Serverless functions are event-driven, meaning the code is invoked only when triggered by request.
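    The backend definitions above all build on the mapping from HTTP verbs to create/read/update/delete operations. A framework-free sketch of that mapping; the in-memory `store` and the `handle` function are illustrative stand-ins for a real database and router:

```python
# Illustrative sketch of the HTTP-verb-to-CRUD mapping described above.
# 'store' stands in for a real database; 'handle' stands in for a router.
store = {}

def handle(method, resource_id, body=None):
    """Dispatch an HTTP-style verb to the matching CRUD operation."""
    if method == "POST":        # create
        store[resource_id] = body
        return 201, body
    if method == "GET":         # read
        return (200, store[resource_id]) if resource_id in store else (404, None)
    if method == "PUT":         # update (replace)
        store[resource_id] = body
        return 200, body
    if method == "DELETE":      # delete
        return (204, None) if store.pop(resource_id, None) is not None else (404, None)
    return 405, None            # method not allowed

created, _ = handle("POST", "42", {"name": "widget"})
read_status, widget = handle("GET", "42")
deleted, _ = handle("DELETE", "42")
missing, _ = handle("GET", "42")
```

    The status codes follow the usual convention: 201 on create, 200 on read/update, 204 on delete, 404 when the resource is absent.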

    Cloud Hosting Solutions: 1 of the 3 cloud hosting solutions is required.

    AWS=is a secure cloud services platform, offering computing power, database storage, content delivery, and other functionality to help businesses scale and grow. Running web and application servers in the cloud to host dynamic websites.

    Google=is a provider of computing resources for deploying and operating applications on the web. Its specialty is providing a place for individuals and enterprises to build and run the software, and it uses the web to connect to the users of that software.

    Azure=is a serverless solution that allows you to write less code, maintain less infrastructure, and save on costs. Instead of worrying about deploying and maintaining servers, the cloud infrastructure provides all the up-to-date resources needed to keep your applications running.

    Database Engines: 1 of the 3 DBA Engines is required.

    PostgreSQL= stable database management system used as the primary data store or data warehouse for many web, mobile, geospatial, and analytics applications.

    SQLServer=is a proprietary RDBMS tool that executes SQL statements.

    Mongo=Classified as a NoSQL database program, MongoDB uses JSON-like documents with optional schemas and can coordinate and control changes to document structure using schema validation.

    Git: For tracking changes in any set of files, usually used for coordinating work among programmers collaboratively developing source code during software development. Its goals include speed, data integrity, and support for distributed, non-linear workflows. 

    Bonus Skills:

    Experience with CI/CD pipelines and Data Science

    Strong verbal + written communication skills

    Demonstrates ability to work well in team-oriented settings

    Position Scope:

    Designs and develops overall architecture for web application using leading-edge technologies and know-how from Industry, partner companies, local universities, and internal partners into automotive concepts. Integrates these concepts using project development methods and Integration processes. Identifies emerging technologies to build software and/or hardware prototypes and production-ready solutions. Focuses on areas including but not limited to web, DB, ETL, and other enabling technologies to develop and maintain quality, responsive applications. 

    Position Responsibilities:

    Designs and develops web applications.

    Defines and documents functionality through use cases, business process, flows, UI design, and UML modeling as necessary.

    Works on several development initiatives concurrently and provides subject matter expertise on customer implementations and product customization.

    Partners with other developers to develop functionality following existing style and coding standards.

    Reviews designs, demo prototypes and provides application support.

    Defines the visualization and realization of future technologies.

    Supports the complete process from development of concepts and vision to full production-ready solutions, which can be integrated rapidly into the automotive environment.

    Serves as a primary point of contact for other engineers and specialists in the team to provide expert knowledge and troubleshooting skills.

    Serves as an internal consultant to other developers and engineers as needed, assisting in all phases of product life-cycle development.

    Maintains accurate, meaningful, and updated technical and non-technical documentation of all aspects of area(s) of responsibility.

    Analyzes business-critical processes, evaluates and recommends improvements.

    Measures performance of delivered services through a set of agreed metrics and ensures appropriate actions are taken to meet all services agreements.

    Performs other duties as assigned by the Operations Supervisor. 

    Position Competencies:

    Education:

    BA/BS Degree in Business, Computer Science or Electrical Engineering preferred or the equivalent of 4 years professional IT-related experience.

    MS degree (preferred).

    Experience:

    5+ years of technical experience in Information Technology.

    2+ years of experience in web and DB development with demonstrated strengths in software and hardware design, development, integration, and testing.

    5+ years of experience with Object-oriented programming (Java, Objective C, or JavaScript).

    3+ years of experience implementing software design patterns and best practices applicable to web development.

    1+ years of experience developing enterprise or client-facing web applications.

    1+ years of experience deploying and supporting Web applications.

    1+ years of hands-on development experience with back-end programming languages (PHP, Python, Java, .NET, JavaScript, etc.).

    1+ years of experience with MySQL, MongoDB, Oracle.

    1+ years of experience interfacing with REST Web Services and JSON/XML.

    1+ years of experience monitoring and supporting performance of web applications and infrastructure.

    Basic knowledge of Linux Server administration, Version Control Systems.

    Licenses and/or Certifications:

    Process/project management experience or training/certifications (preferred but not required).

    Primary Work Location/Shift:

    90% REMOTE WORK. THE 1ST 2 TO 3 WEEKS WILL BE ONSITE DURING TRAINING. AFTER TRAINING, YOU WOULD BE REQUIRED TO REPORT TO THE SITE 1 DAY EVERY 2 WEEKS. IF THERE IS AN EMERGENCY, YOU WOULD NEED TO BE ABLE TO ARRIVE AT THE SITE WITHIN 2 HOURS.

     

    Senior Full Stack Engineer (Java/Python) - REMOTE

    20 hours ago
    $70 - $80/hourGreat rateRemoteRandstad
    job summary:
    Will help design and develop cross-functional, multi-platform application systems, writing high-quality code with a drive for automated testing and validation based on the You Build, You Own (YBYO) model. Perform complex engineering activities for performance tuning, monitoring, deployment, and production support. Research, influence, and implement vendor dependencies to simplify the architecture. Implement, maintain, and update CI/CD pipelines in a cloud environment. Work with business partners, architects, and other groups to identify the technical and functional needs of systems based on priority. Collaborate with multiple enterprise-wide distributed performing teams to deliver new capabilities in business applications. Design and develop APIs for omni-channel clients.


    location: LOS ANGELES, California
    job type: Contract
    salary: $70 - 80 per hour
    work hours: 8am to 5pm
    education: Bachelors

    responsibilities:
    Skills:

  • Minimum of 6+ years of software development experience
  • Minimum of 4+ years of experience with Java, .NET, Python
  • Minimum of 4+ years' experience with databases and data modeling/design (SQL and NoSQL)
  • Minimum of 3+ years coordinating team efforts in a project or operations environment
  • Minimum of 2 years in full stack development for cloud solutions (Azure or AWS) - Azure preferred
  • 8+ years of experience in JavaScript development frameworks and tooling: Angular, React, VueJS, & Gulp, Grunt, Yarn, etc.
  • 8+ years of experience managing software development projects through complete release cycles
  • 4+ years of experience in big data and event streaming technologies: Spark, Kafka, etc.
  • 4+ years of experience leading software engineering teams
  • 2+ years of experience in cloud technologies: Azure(big plus), AWS, OpenStack, etc.
  • 2+ years of experience with container and orchestration: Docker, Kubernetes, etc.
  • 2+ years of experience in build and CICD technologies: GitHub, BitBucket, Azure DevOps, Maven, Jenkins, Nexus or Sonar
  • 2+ years of Scaled agile experience
  • 4+ years of experience in quality assurance technologies: ATDD, Selenium, Cucumber, JUnit, NUnit, SoapUI or Postman
  • 4+ years of experience in Unix shell scripting
  • Certified in .NET, Java, Spring, or cloud technologies
  • Exposure to data management methodologies
  • Experience with serverless architectures and computing
  • Preferably 2+ years of UI/UX development & design systems experience: CSS, Web Components, Less, Sass
  • Education:

    Bachelor's Degree or equivalent


    qualifications:
  • Experience level: Experienced
  • Minimum 5 years of experience
  • Education: Bachelors
    skills:
  • Python
  • Java
  • SQL
  • Azure
  • AWS
  • API (2 years of experience is required)

  • Data Architect - Remote - Cloud/PySpark/Java or Scala

    2 days ago
    RemoteInside IR35ApplyGateway

    Data Architect with cloud (ideally GCP) and PySpark experience is required for a 6-month contract with a leading financial services organisation based in London. You will architect, design, estimate, develop, and deploy cutting-edge software products and services that leverage large-scale data ingestion, processing, storage, and querying, plus in-stream & batch analytics, for cloud and on-prem environments.

    THIS ROLE IS FULLY REMOTE AND INSIDE IR35

    Experience:

    • Extensive experience with Data related technologies, including knowledge of Big Data Architecture Patterns and Cloud services (AWS/Azure/GCP)
    • GCP experience is desirable (BigQuery, Pub/Sub, Spanner)
    • Experience delivering end to end Big Data solutions on-premise and/or on Cloud
    • Knowledge of the pros and cons of various database technologies like Relational, NoSQL, MPP, Columnar databases
    • Expertise in the Hadoop eco-system with one or more distributions, such as Cloudera and cloud-specific distributions
    • Proficiency in Java and Scala programming languages
    • Python experience
    • Expertise in one or more NoSQL databases (MongoDB, Cassandra, HBase, DynamoDB, Bigtable, etc.)
    • Experience of one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.), distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.)
    • Expertise with at least one distributed data processing framework eg Spark (Core, Streaming, SQL), Storm, Flink etc.
    • Knowledge of flexible, scalable data models addressing a wide variety of consumption patterns including random-access, sequential access including necessary optimisations like bucketing, aggregating, sharding
    • Knowledge of performance tuning, optimization and scaling solutions from a storage/processing standpoint
    • Experience building DevOps pipelines for data solutions, including automated testing

    Desirable:

    • Knowledge of containerization, orchestration and Kubernetes engine
    • An understanding of how to setup Big data cluster security (Authorization/Authentication, Security for data at rest, data in transit)
    • A basic understanding of how to manage and setup Monitoring and alerting for Big data clusters
    • Experience of orchestration tools - Oozie, Airflow, Control-M or similar
    • Experience of MPP style query engines like Impala, Presto, Athena etc.
    • Knowledge of multi-dimensional modelling, like star schema, snowflake, normalized and de-normalized models
    • Exposure to data governance, catalog, lineage and associated tools would be an added advantage
    • A certification in one or more cloud platforms or big data technologies
    • Any active participation in the Data Engineering thought community (eg blogs, key note sessions, POV/POC, hackathon)

    Role: Data Architect - Remote - Cloud/PySpark/Java or Scala
    Job Type: Contract
    Location: Not Specified,

    Apply for this job now.

    Software Integration Engineer (Partial Remote - DevOps, SA, Compliance)

    8 days ago
    RemoteGliaCell Technologies
    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud/Big Data (Batch and Streaming Analytics), CNO/CND/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages. For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Software Integration Engineer for a role on one of our subcontracts. This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract.

    This position allows for remote work up to 50%.

    Location: Annapolis Junction, MD

    Qualifications:

    • 14+ years of Software Engineer experience and a BS Degree in Computer Science (or related)
    • This position is 10% Software Engineering and 90% Software Integration
    Desired:

    • Sourcing for a candidate that has the ability/willingness to work a DevOps position and do parts of system development (development, test, SA, Compliance)
    • Working knowledge of Rancher and Keycloak, as well as experience in a methodical approach to problem-solving and automation
    Salary: Negotiable

    Resumes will be accepted until the position is filled.

    To Apply for this Position: Respond to this job posting and attach an updated resume.

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR


    Principal Software Engineer (Partial Remote - Java, MapReduce)

    8 days ago
    RemoteGliaCell Technologies
    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud / Big Data (Batch and Streaming Analytics), CNO/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages. For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Principal Software Engineer for a role on one of our subcontracts. This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract.

    Location: Annapolis Junction, MD

    Description: The Software Engineer develops, maintains, and enhances complex and diverse software systems (e.g., processing-intensive analytics, novel algorithm development, manipulation of extremely large data sets, real-time systems, and business management information systems) based upon documented requirements. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design to include hardware/software trade-offs, software reuse, use of Commercial Off-the-shelf (COTS)/Government Off-the-shelf (GOTS) in place of new development, and requirements analysis and synthesis from system level to individual software components.

    Requirements:

    • Fourteen (14) years experience as a SWE in programs and contracts of similar scope, type, and complexity is required. Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor’s degree.
    • Cloud Experience: Shall have three (3) years demonstrated work experience with a distributed scalable Big Data store (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc.; Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); Shall have demonstrated work experience with serialization formats such as JSON and/or BSON
    • Analyze user requirements to derive software design and performance requirements
    • Design and code new software or modify existing software to add new features
    • Debug existing software and correct defects
    • Integrate existing software into new or modified systems or operating environments
    • Develop simple data queries for existing or proposed databases or data repositories
    • Provide recommendations for improving documentation and software development process standards
    • Develop or implement algorithms to meet or exceed system performance and functional standards
    • Assist with developing and executing test procedures for software components
    • Write or review software and system documentation
    • Develop software solutions by analyzing system performance standards, confer with users or system engineers; analyze systems flow, data usage and work processes; and investigate problem areas
    • Serve as team lead at the level appropriate to the software development process being used on any particular project
    • Modify existing software to correct errors, to adapt to new hardware, or to improve its performance
    • Design, develop and modify software systems, using scientific analysis and mathematical models to predict and measure outcome and consequences of design
    • Design or implement complex database or data repository interfaces/queries
    • Oversee one or more software development teams and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project
    • Design or implement complex algorithms requiring adherence to strict timing, system resource, or interface constraints; Perform quality control on team products
    • Confer with system engineers and hardware engineers to derive software requirements and to obtain information on project limitations and capabilities, performance requirements and interfaces
    • Coordinate software system installation and monitor equipment functioning to ensure operational specifications are met
    • Implement recommendations for improving documentation and software development process standards
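    The MapReduce experience required above refers to the classic map → shuffle → reduce model. A miniature pure-Python word count, with no Hadoop involved, purely to show the shape of the model; the function names are illustrative:

```python
# Miniature illustration of the MapReduce model: map emits key/value pairs,
# shuffle groups them by key, reduce aggregates each group.
from collections import defaultdict
from itertools import chain

def mapper(line):
    # map phase: emit (word, 1) for each word in a line of input
    return [(w.lower(), 1) for w in line.split()]

def shuffle(pairs):
    # shuffle phase: group values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # reduce phase: sum the counts for one word
    return key, sum(values)

lines = ["big data big pipelines", "data lakes"]
mapped = chain.from_iterable(mapper(l) for l in lines)
counts = dict(reducer(k, v) for k, v in shuffle(mapped).items())
```

    In Hadoop the same mapper and reducer would run across many nodes, with the framework handling the shuffle, partitioning, and fault tolerance.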
    Desired:

    • Java, MapReduce, Pig, Cloud experience, GhostMachine, QTA, and Hadoop

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

    To Apply for this Position: Respond to this job posting and attach an updated resume.

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR


    Senior Software Engineer (Partial Remote - Cloud, MapReduce/Pig)

    12 days ago
    RemoteGliaCell Technologies
    Required Clearance: TS/SCI with Polygraph (TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE OR REINSTATABLE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE) (U.S. CITIZENSHIP REQUIRED)

    GliaCell Technologies specifically focuses on Software and Systems Engineering in the Cloud/Big Data (Batch and Streaming Analytics), CNO/CND/Reverse Engineering/Mobile Development, and we have tons of work involving Java, JavaScript, Python, C/C++, Node.js, React.js, Ruby, Hadoop, Spark, Kafka, Flink, NiFi, Groovy, Kubernetes, Docker, AWS and many more! As a niche company devoted to delivering elite technical support and resources in the Cloud and Cyberspaces, we have the ability to get our hands on some really interesting work, while being able to provide competitive salaries, 401K, and benefits packages. For more information, please visit www.gliacelltechnologies.com.

    GliaCell is currently seeking a Senior Software Engineer for a role on one of our subcontracts. This is a full-time position offering the opportunity to support a U.S. Government customer. The mission is to provide technical expertise that assists in sustaining critical mission-related software and systems to a large government contract.

    Location: Annapolis Junction, MD

    Qualifications:

    • Seven (7) years' experience as a SWE in programs and contracts of similar scope, type, and complexity is required. Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor’s degree.
    Desired:

    • Cloud development experience
    • MapReduce and/or Pig experience.
    • Ability to debug software and correct defects

    Salary: Negotiable

    Resumes will be accepted until the position is filled.

    To Apply for this Position: Respond to this job posting and attach an updated resume.

    GliaCell Technologies, LLC is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

    Powered by JazzHR
