Description:
As a Systems Engineer on the Mach1 Ops team, you will be at the forefront of delivering truly valuable machine learning insights and analytics solutions. We are looking for someone with a passion for supporting and enabling our customers (Data Scientists, Data Engineers, and Software Developers) to create solutions: helping them find ways to reduce the toil of data science development, evolving the automation that maintains our systems, and building new and more effective monitoring capabilities. In this role, you will work closely with multiple organizations (Business Customers, Product Owners, Global Data and Analytics, and IT) to convert business goals into tangible solutions, prioritizing them through Agile user stories on our product backlog.
Skills Required:
Strong communication and customer relationship skills
Strong skills in Python, Kubernetes, Hadoop, Docker, and Linux
Passion for teaching and enabling Data Science professionals to understand and apply your technical knowledge
Strong organizational skills and experience managing activities across multiple diverse teams
Knowledge of Predictive Analytics and/or Machine Learning concepts and applying them to Manufacturing, Finance, Quality Control, and Marketing and Sales situations
Knowledge of monitoring tools and techniques (Dynatrace, ELK Stack, developing synthetic monitoring)
Skills Preferred:
*Submit candidates in Talent Match only
Experience Required:
Education Required:
Bachelor's degree or related experience
Job Types: Full-time, Contract
Pay: $44.00 - $68.00 per hour
Software Engineer (.NET – Azure)
Contract (2 years+)
Fully Remote
As a Software Engineer, you will build and enhance an exciting new customer loyalty platform to help drive additional consumer sales, customer stickiness, and loyalty within their stores. This mobile application will leverage cutting-edge technology to improve the processing speed of Big Data, thus improving the overall customer experience of using the mobile application. Technologies and processes required to build the solution include: the .NET framework, C#, SQL, use of APIM through Azure, conceptualization of Big Data architecture, deployment through DevOps methodology, and code builds through Agile sprints. If you are interested in working in a booming industry with long-term investment in building modern solutions to improve the customer journey, please apply today!
Requirements:
Responsibilities:
Brooksource provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws.
We are searching for a Senior Software Developer with Linux and Cloud experience to support the National Center for Biotechnology Information (NCBI), part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH) in North Bethesda, MD. NCBI is the world's premier biomedical center, hosting over six million daily users who seek research, clinical, genetic, and other information that directly impacts biomedical research and public health. At NCBI you can literally help to accelerate cures for diseases! NCBI's wide range of applications, platforms (Node, Python, Django, C++, you name it) and environments (big data [petabytes], machine learning, multiple clouds) serve more users than almost any other US Government agency, according to https://analytics.usa.gov/.
This is a full-time position with full benefits. You will start out working remotely until we beat Covid. We are looking for IT professionals who have a passion for what they do and want to work for an organization that gives back to the community. A position at the National Institutes of Health (NIH) gives you an opportunity to contribute to the betterment of America's public health system. Be on the cutting edge as discoveries are made to improve your country's health!
The Senior Software Developer with Linux and Cloud platform experience will work on solutions to support continued development of NCBI's dbGaP database. This database is the world's premier archive of assembled and annotated sequence data and is part of an international collaboration that includes archives in Europe and Japan. SeqDB makes biological sequence data submitted by the scientific community available to researchers worldwide, enhancing reproducibility and allowing for new discoveries. SeqDB is a large resource, comprising over 1.6 billion records and 6 trillion DNA basepairs, and handles requests at rates of up to 50,000/second. The future development of SeqDB will involve re-architecting the backend sequence databases, including exploration of Cloud-based strategies for sequence access. NCBI (the National Center for Biotechnology Information) is part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH). NCBI serves over 4 million daily users in search of clinical, genetic, and other information that directly impacts biomedical research and public health, and is among the world's top 3 most-visited sites in the science category.
The Senior Software Developer with Linux and Cloud experience will be responsible for the development, implementation, testing, and continued maintenance of NCBI's dbGaP database and bioinformatics pipelines.
Qualifications:
Desired qualifications:
We are a trusted government partner that blends deep domain expertise with advanced technologies to help our customers solve complex problems that improve, protect, and save lives. As a rapidly growing company, we combine entrepreneurial spirit, customer focus, and an outcomes-based approach to support agency missions in health IT, life sciences, public safety, and grants management.
The Dovel Family of Companies offers employees an opportunity to advance beyond a specific role or contract; we offer a path to develop an enriching career. We believe in empowering a culture of innovation, customer success, and employee growth.
What you’ll get…
We are an Equal Opportunity Employer with a commitment to diversity. All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, gender identity, disability, or veteran status.
Job Title: Application Development - Data Analyst (SAS & Python)
Location: Charlotte, NC
Duration: 3 Months Contract Role
General Information
Job Description: 100% remote; Local candidates only; resource will pick up equipment onsite
Shift Start/End: 9am - 5pm ET
US Bank is looking for a successful Data Quality & Testing Analyst to support our Enterprise Data Governance (EDG) initiatives focused on our team's capabilities around data quality, data analytics and end-to-end and user acceptance testing. This role will collaborate with the business line and technology groups.
Primary responsibilities include:
• Effectively work in a matrixed team of Data Quality Analysts/Data Scientists in support of the development, end-to-end testing, and delivery of the EDG's capabilities to address a specific business and/or regulatory need.
• Research and build a detailed understanding of the problem and related data assets in order to code the data processing and analysis.
• Light development coding
• Ensure that the data used follows the compliance, access management and control policies of the company while meeting the data governance requirements of the business.
• Work with technical groups to support the collection, integration and retention of the data sources.
• Apply data visualization and summarization techniques to the analytical results (a minimal pandas sketch follows this list).
• Interpret and communicate the results in a manner that is understood by the business.
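For context, here is a minimal sketch of the kind of data-quality coding and summarization the bullets above describe, written in Python/pandas purely for illustration; the file name, columns, and rules are invented, not US Bank's actual data assets:

import pandas as pd

# Load a hypothetical extract (invented file and column names)
df = pd.read_csv("accounts_extract.csv")

# Data-quality rules: unique, non-null key; non-negative balances
failures = {
    "null account_id": int(df["account_id"].isna().sum()),
    "duplicate account_id": int(df["account_id"].duplicated().sum()),
    "negative balance": int((df["balance"] < 0).sum()),
}

# Summarize the results in a form the business line can read
summary = pd.DataFrame(
    {"check": list(failures), "failures": list(failures.values())}
)
summary["pass"] = summary["failures"] == 0
print(summary.to_string(index=False))

Each rule's failure count rolls up into a simple pass/fail summary that can be communicated to the business line.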
Basic Qualifications:
• Bachelor's or Master's degree in Mathematics, Engineering, Computer Science, or equivalent work experience.
• 5+ years related technical experience and 3+ years analytical experience in the Banking/Financial industry or similar highly regulated industry.
• SQL, end-to-end testing, and Python experience
Preferred Skills/Experience:
• Proficiency with SQL, SAS, and Python, with a focus on data analysis packages like pandas
• Experience participating in end-to-end and user acceptance testing and/or post-production validation of data
• Knowledge of Spark and relational databases
• Experience with CI/CD tools like Jenkins
• Knowledge of public cloud platforms.
• Intermediate knowledge of application coding and development lifecycle.
• Analytical (e.g., comfortable with data), able to combine an understanding of business objectives, customer needs, and the data required to deliver customer experience and business results.
• Intermediate knowledge of data governance, data management, data architecture, data modelling concepts and data governance tooling and the complexities of data in a large financial and/or highly regulated institution.
• Willingness to continuously develop and acquire new technical skills, learn new tools & programming languages and Big Data techniques.
• Proven ability to adjust quickly to shifting priorities, multiple demands, ambiguity and rapid change.
• Effective oral and written communication, with the ability to analyze, slice, and dice data while deriving significant insights.
• Natural curiosity and self-directed ability to seek out information and meet goals/deadlines.
• Agile experience: Very comfortable following Agile Scrum methodology.
Analysis and ongoing support of internally developed processes to support business initiatives. Automation of processes using the existing tool set, including the Microsoft BI Stack (SSRS, SSIS, SSAS, Excel Services, PerformancePoint, and PowerPivot). Analysis and query writing to assist in supporting key business initiatives.
COMMUNICATION SKILLS:
This position requires good oral and written communication skills to interface with internal business clients, many of whom have minimal technical knowledge. Ability to translate business requests into technical requirements.
TECHNICAL SKILLS:
Ability to write Transact-SQL code to extract the correct data to answer business questions. SQL development experience. Extensive experience creating reports using Microsoft SQL Server Reporting Services.
ADDITIONAL BENEFICIAL EXPERIENCE:
Use and construction of Microsoft SQL Server Analysis Services; SharePoint (using the BI Stack). Strong attention to detail for validation of all queries constructed. Ability to visualize data in a form that effectively answers the business question.
Preference for candidates with SQL, Python & testing experience located in Charlotte, NC. Resource will pick up equipment onsite.
Real Staffing is currently working on 2 Java Developer roles for a leading medical device firm based in Pittsburgh, PA. The roles are fully remote, working Eastern Standard Time hours.
They are looking for experienced Java Developers to work on a next-generation data analytics platform for their advanced sleep-monitoring devices. The role is project-critical for the firm, and the team is looking to fill the openings by the end of February.
Essential Duties and Responsibilities:
Development experience writing code to create single-threaded, multi-threaded, or user-interface event-driven applications, either stand-alone or those that access servers or services, is preferred.
These roles are open to contract-to-hire or long-term contracting, paying market-leading rates (depending on experience).
If you are interested, please apply with your latest resume and our team will give you a call!
Thanks,
Sthree US is acting as an Employment Business in relation to this vacancy.
This is a fantastic opportunity to work as a Senior Solutions Architect on a remote contract, outside IR35, working on a high-profile project.
The key skills required for this Senior Solutions Architect position are:
If you do have the relevant experience for this Senior Solutions Architect role, please do apply.
2 Contract Scala Developers wanted. £600 a day (Inside). 6 months rolling. Remote
We're looking for 2 Contract Scala Developers for a leading media client in West London, paying £600 a day (Inside) on a rolling 6-month contract. Our client is expanding a team of Scala developers working on personalisation and API projects. If interested, please see more details below:
Location: Fully remote but when things return to normal it will be a couple of days in the office a week (West London) but some flex on this.
Project: Customisation/APIs
Experience required:
Desirable:
Interview process: 1 stage 1.5 hour Technical Q&A (no pairing exercise or take home test!)
If interested, please get in touch with your CV
Thanks
Julian
Xpertise Recruitment
Apex Systems, the 2nd largest IT staffing firm in the nation, is seeking an experienced DevOps Engineer to join our client's team. This is a W2 contract position slated for 6 months with possibility of extension/conversion, and is FULLY REMOTE (PST hours).
**Must be comfortable sitting on Apex System's W2**
If you are interested send all qualified resumes to Nathan Castillo (Professional Recruiter with Apex Systems) at Ncastillo@apexsystems.com!
Job Description:
We are on a mission to connect every member of the global workforce with economic opportunity, and that starts right here. Talent is our number one priority, and we make sure to apply that philosophy to our customers and our own employees alike. Explore cutting-edge technology and flex your creativity. Work and learn from the best. Push your skills higher. Tackle big problems. Innovate. Create. Write code that makes a difference in professionals' lives.
Gobblin is a distributed data integration framework that was born at client and was later released as an open-source project under the Apache Foundation. Gobblin is a critical component in client's data ecosystem and the main bridge between the different data platforms, allowing efficient data movement between our AI, analytics, and member-facing services. Gobblin utilizes and integrates with the latest open-source big data technologies, including Hadoop, Spark, Presto, Iceberg, Pinot, ORC, Avro, and Kubernetes. Gobblin is a key piece in client's data lake, operating at a massive scale of hundreds of petabytes.
Our latest work involves integrations with cutting-edge technologies such as Apache Iceberg to allow near-real-time ingestion of data from various sources onto persistent datasets that support complex and highly scalable query processing for various business logic applications, serving machine-learning and data-science engineers. Furthermore, we play an instrumental role in client's transformation from on-prem-oriented deployment to Azure cloud-based environments. This transformation has prompted a massive modernization and rebuilding effort for Gobblin, transforming it from a managed set of Hadoop batch jobs into an agile, auto-scalable, real-time streaming-oriented PaaS with user-friendly self-management capabilities that will boost productivity across our customers. This is an exciting opportunity to take part in shaping the next generation of the platform.
What is the Job
You will be working closely with development and site reliability teams to better understand their challenges in aspects like:
Increasing development velocity of data management pipelines by automating testing and deployment processes,
Improving the quality of data management software without compromising agility.
You will create and maintain fully automated CI/CD processes across multiple environments and make them reproducible, measurable, and controllable for data pipelines that deal with petabytes every day (a toy testing sketch follows below). With your abundant skills as a DevOps engineer, you will also be able to influence teams broadly and cultivate a DevOps culture across the organization.
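As a toy illustration of automated pipeline testing, here is a pytest-style check a CI job could run on every commit; the transform and its rules are invented for the example and are not Gobblin's actual code:

# test_pipeline.py: run with `pytest test_pipeline.py`

def normalize_event(event: dict) -> dict:
    """Example transform: lower-case keys and drop null values."""
    return {k.lower(): v for k, v in event.items() if v is not None}

def test_keys_are_lowercased():
    assert normalize_event({"UserID": 7}) == {"userid": 7}

def test_nulls_are_dropped():
    assert normalize_event({"a": None, "b": 1}) == {"b": 1}

Wiring checks like these into every commit is what lets deployment of data-management code stay fast without compromising quality.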
Why it matters
CI/CD for big data management pipelines has been a long-standing challenge for the industry, and it is becoming more critical as we evolve our tech stack into the cloud age (Azure). With infrastructure shifts and data lake features being developed and deployed at an ever faster pace, our integration and deployment processes must evolve to ensure the highest quality and fulfill customer commitments. The reliability of our software greatly influences analytical workloads and decision-making processes across many company-wide business units, and the velocity of our delivery plays a critical role in transforming the process of mining insights from a massive-scale data lake into an easier and more efficient developer-productivity paradigm.
What You’ll Be Doing
Technical Skills
Years of Experience
Preferred Skills
PLEASE NOTE: THIS IS A W2 CONTRACT; NO 3RD PARTY RECRUITER SUBMITTALS PLEASE. THANK YOU IN ADVANCE FOR HONORING THE REQUIREMENTS.
Job Order: 179891
Functional Title: Full Stack Software Engineer - 179891
Client Title: Full Stack Software Engineer
Location: Greenville, SC, and Remote (See information below)
PAY: $42.00 to $52.00 PER HOUR
1st Shift: 40 hours per week, 8 AM until 5 PM, Monday through Friday
Contract: Long term
Interview Type: Onsite or Video
Hiring Bonus: $2000.00
The Hiring Bonus is broken down as follows:
You will receive $500 on the 1st paycheck
$500 at 60 days
$500 at 90 days
$500 at 180 days.
Interview Type: Video or In-Person (No travel expense offered by the client for an in-person interview. No relocation offered)
The position is an ongoing W2 contract. There is no end date projected.
The position would be working for HKA at the client site Greenville, SC facility.
HKA offers employee paid medical and 401k on the 1st of the month after your 1st day.
40 hours of prorated vacation after six months.
Six paid holidays after 30 days.
FULL STACK SOFTWARE ENGINEER
Required Skills:
Frontend Technology: 1 of the 3 Frontend Technologies is required
React = a front-end library that runs in the browser.
Angular = a platform and framework for building single-page client applications using HTML and TypeScript. Angular is written in TypeScript; it implements core and optional functionality as a set of TypeScript libraries that you import into your apps.
Vue = Vue.js is a robust but simple JavaScript framework. It has one of the lowest barriers to entry of any modern framework while providing all the required features for high-performance web applications.
Script Experience: 1 of the 4 scripting experiences is required.
Node.js = a server-side platform built on Google Chrome's V8 JavaScript engine.
TypeScript = used to develop JavaScript applications for both client-side and server-side execution, as with Node.js or Deno.
Java = used as the server-side language for most back-end development projects, including those involving big data.
.NET = a software development framework and ecosystem designed and supported by Microsoft to allow for easy desktop and web application engineering. It's a popular free platform currently used for many different types of applications, as it provides the programming environment for most software development phases.
Backend Development: 1 of the 4 Backend technology experiences is required.
REST APIs = a RESTful API is an architectural style for an application program interface (API) that uses HTTP requests to access and use data. Requests use the GET, PUT, POST, and DELETE methods, which correspond to the reading, updating, creating, and deleting operations on resources (see the Python sketch after this list).
Event-Driven Services = with an event-driven system, the capture, communication, processing, and persistence of events form the solution's core structure.
Scheduled Services = activities that are executed periodically; the schedule for each activity can be set using, for example, the Task Scheduler within the Windows operating system.
Serverless Functions = event-driven functions, meaning the code is invoked only when triggered by a request.
Cloud Hosting Solutions: 1 of the 3 cloud hosting solutions is required.
AWS = a secure cloud services platform offering computing power, database storage, content delivery, and other functionality to help businesses scale and grow, e.g., running web and application servers in the cloud to host dynamic websites.
Google = a provider of computing resources for deploying and operating applications on the web. Its specialty is providing a place for individuals and enterprises to build and run software, using the web to connect to the users of that software.
Azure = Microsoft's cloud platform; its serverless offerings allow you to write less code, maintain less infrastructure, and save on costs. Instead of worrying about deploying and maintaining servers, the cloud infrastructure provides the up-to-date resources needed to keep your applications running.
Database Engines: 1 of the 3 DBA Engines is required.
PostgreSQL = a stable database management system used as the primary data store or data warehouse for many web, mobile, geospatial, and analytics applications.
SQL Server = a proprietary RDBMS tool that executes SQL statements.
MongoDB = a NoSQL database program that uses JSON-like documents with optional schemas; it can coordinate and control changes to document structure using schema validation.
Git = for tracking changes in any set of files, usually used for coordinating work among programmers collaboratively developing source code during software development. Its goals include speed, data integrity, and support for distributed, non-linear workflows.
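To make the REST APIs item above concrete, here is a minimal sketch of the verb-to-CRUD mapping, written in Python/Flask purely for illustration (Python is among the back-end languages listed later in this posting); the /items resource and in-memory store are invented:

from flask import Flask, jsonify, request

app = Flask(__name__)
items: dict[int, dict] = {}  # in-memory stand-in for a real database
next_id = 1

@app.post("/items")                  # POST = create
def create_item():
    global next_id
    items[next_id] = request.get_json()
    next_id += 1
    return jsonify(id=next_id - 1), 201

@app.get("/items/<int:item_id>")     # GET = read
def read_item(item_id):
    item = items.get(item_id)
    return (jsonify(item), 200) if item is not None else ("", 404)

@app.put("/items/<int:item_id>")     # PUT = update (replace)
def update_item(item_id):
    items[item_id] = request.get_json()
    return jsonify(items[item_id])

@app.delete("/items/<int:item_id>")  # DELETE = delete
def delete_item(item_id):
    items.pop(item_id, None)
    return "", 204

Running the app with flask run and issuing POST, GET, PUT, and DELETE requests against /items exercises each of the create, read, update, and delete operations in turn.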
Bonus Skills:
Experience with CI/CD pipelines and Data Science
Strong verbal + written communication skills
Demonstrates ability to work well in team-oriented settings
Position Scope:
Designs and develops the overall architecture for web applications, integrating leading-edge technologies and know-how from industry, partner companies, local universities, and internal partners into automotive concepts. Integrates these concepts using project development methods and integration processes. Identifies emerging technologies to build software and/or hardware prototypes and production-ready solutions. Focuses on areas including but not limited to web, DB, ETL, and other enabling technologies to develop and maintain quality, responsive applications.
Position Responsibilities:
Designs and develops web applications.
Defines and documents functionality through use cases, business process, flows, UI design, and UML modeling as necessary.
Works on several development initiatives concurrently and provides subject matter expertise on customer implementations and product customization.
Partners with other developers to develop functionality following existing style and coding standards.
Reviews designs, demos prototypes, and provides application support.
Defines the visualization and realization of future technologies.
Supports the complete process from development of concepts and vision to full production-ready solutions, which can be integrated rapidly into the automotive environment.
Serves as a primary point of contact for other engineers and specialists in the team to provide expert knowledge and troubleshooting skills.
Serves as an internal consultant to other developers and engineers as needed, assisting in all phases of product life-cycle development.
Maintains accurate, meaningful, and updated technical and non-technical documentation of all aspects of area(s) of responsibility.
Analyzes business-critical processes, evaluates and recommends improvements.
Measures performance of delivered services through a set of agreed metrics and ensures appropriate actions are taken to meet all services agreements.
Performs other duties as assigned by the Operations Supervisor.
Position Competencies:
Education:
BA/BS degree in Business, Computer Science, or Electrical Engineering preferred, or the equivalent of 4 years of professional IT-related experience.
MS degree (preferred).
Experience:
5+ years of technical experience in Information Technology.
2+ years of experience in web and DB development with demonstrated strengths in software and hardware design, development, integration, and testing.
5+ years of experience with Object-oriented programming (Java, Objective C, or JavaScript).
3+ years of experience implementing software design patterns and best practices applicable to web development.
1+ years of experience developing enterprise or client-facing web applications.
1+ years of experience deploying and supporting Web applications.
1+ years of hands-on development experience with back-end programming languages (PHP, Python, Java, .NET, JavaScript, etc.).
1+ years of experience with MySQL, MongoDB, Oracle.
1+ years of experience interfacing with REST Web Services and JSON/XML.
1+ years of experience monitoring and supporting performance of web applications and infrastructure.
Basic knowledge of Linux Server administration, Version Control Systems.
Licenses and/or Certifications:
Process/project management experience or training/certifications (preferred but not required).
Primary Work Location/Shift:
90% REMOTE WORK. THE 1ST 2 TO 3 WEEKS WILL BE ONSITE DURING TRAINING. AFTER TRAINING, YOU WOULD BE REQUIRED TO REPORT TO THE SITE 1 DAY EVERY 2 WEEKS. IF THERE IS AN EMERGENCY, YOU WOULD NEED TO BE ABLE TO ARRIVE AT THE SITE WITHIN 2 HOURS.
A Data Architect with cloud (ideally GCP) and PySpark experience is required for a 6-month contract with a leading financial services organisation based in London. You will architect, design, estimate, develop, and deploy cutting-edge software products and services that leverage large-scale data ingestion, processing, storage, and querying, plus in-stream and batch analytics, for Cloud and on-prem environments (an illustrative sketch follows below).
THIS ROLE IS FULLY REMOTE AND INSIDE IR35
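For context, a minimal PySpark batch sketch of the kind of ingestion, storage, and batch-analytics work described above; the paths, columns, and rules are invented for illustration and are not the client's actual pipeline:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-ingest-sketch").getOrCreate()

# Ingest raw JSON (hypothetical bucket and path)
raw = spark.read.json("s3a://example-bucket/raw/trades/")

# Basic cleansing before storage (invented columns and rules)
clean = (
    raw.dropDuplicates(["trade_id"])
       .filter(F.col("amount") > 0)
       .withColumn("trade_date", F.to_date("event_ts"))
)

# Store as partitioned Parquet for efficient querying
clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3a://example-bucket/curated/trades/"
)

# Batch analytics: daily totals per instrument
daily = clean.groupBy("trade_date", "instrument").agg(
    F.sum("amount").alias("total_amount")
)
daily.show()

Partitioning by trade_date keeps the daily aggregate query cheap; choosing partition keys like this is the kind of design trade-off the role involves.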
Experience:
Desirable: