16 remote Hadoop contracts

Senior Hadoop Developer w Spark/Scala/ETL (Remote)

1 month ago
Remote · cloudteam
A Hadoop developer is responsible for the design, development and operations of systems that store and manage large amounts of data.
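To make the day-to-day work concrete, here is a minimal batch ETL sketch in PySpark; it is an illustration only, and the file paths, column names, and cleansing rules are hypothetical rather than taken from the listing (the role also names Scala, to which this translates almost line for line):

```python
# Minimal PySpark ETL sketch: read raw CSV files, cleanse them, and write
# partitioned Parquet for downstream Hive/Impala queries.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-sketch")
    .enableHiveSupport()  # allows reading/writing Hive tables if required
    .getOrCreate()
)

# Extract: raw daily files landed on HDFS (could equally be S3 or ADLS).
raw = spark.read.option("header", True).csv("hdfs:///landing/orders/2020-08-01/")

# Transform: de-duplicate, drop bad rows, and derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write Parquet partitioned by date.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("hdfs:///warehouse/orders_clean/"))

spark.stop()
```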

Software Engineer

14 days ago
Remote · Mondo
Role: Software Engineer
Start Date: 09/07 - latest
Location: Open to fully remote
Rate: DOE
Years Exp: 5+ years

Summary:

The main function of a software engineer is to design, develop, implement, test, and maintain business and computer applications software or specialized utility programs, including mainframe and client/server applications, as well as major enhancements to existing systems.

Job Responsibilities:

  • Fine-tune and improve a variety of sophisticated software implementation projects
  • Gather and analyze system requirements, document specifications, and develop software solutions to meet client needs
  • Analyze and review enhancement requests and specifications
  • Implement system software and customize it to client requirements
  • Prepare detailed software specifications and test plans
  • Code new programs to the client's specifications and create test data for testing (see the sketch after this list)
  • Modify existing programs to new standards and conduct unit testing of developed programs
  • Create migration packages for system testing, user testing, and implementation
  • Provide quality assurance reviews
  • Perform post-implementation validation of software and resolve any bugs found during testing
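The testing duties above are stack-agnostic; purely as a hedged illustration (written in Python rather than the Java/TIBCO stack this role uses, with a made-up discount rule), creating test data and unit-testing a small transformation could look like this:

```python
# Illustration only: the role's stack is Java/TIBCO, but the practice of
# creating test data and unit-testing a transformation is the same anywhere.
import unittest


def apply_discount(orders, threshold=100.0, rate=0.1):
    """Apply a discount to orders whose amount exceeds the threshold."""
    discounted = []
    for order in orders:
        amount = order["amount"]
        if amount > threshold:
            amount = round(amount * (1 - rate), 2)
        discounted.append({**order, "amount": amount})
    return discounted


class ApplyDiscountTest(unittest.TestCase):
    def setUp(self):
        # Test data covering both branches and the boundary case.
        self.orders = [
            {"id": 1, "amount": 50.0},   # below threshold: unchanged
            {"id": 2, "amount": 100.0},  # at threshold: unchanged
            {"id": 3, "amount": 200.0},  # above threshold: discounted
        ]

    def test_discount_applied_only_above_threshold(self):
        result = apply_discount(self.orders)
        self.assertEqual([o["amount"] for o in result], [50.0, 100.0, 180.0])


if __name__ == "__main__":
    unittest.main()
```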

Education/Experience: Bachelor's degree in a technical field such as computer science, computer engineering, or a related discipline is required, along with a solid foundation in computer science and strong competencies in data structures, algorithms, and software design.


Required Skills:

  • Experience with Java development.

  • Experience with TIBCO Business Works and Business Events development.

  • Knowledge and previous hands-on experience with relational databases such as Oracle.

  • Experience with AWS cloud development is a plus.

  • Proficiency with open-source Java technologies such as Spring, HBase, Dropwizard, and Solr is a plus.

  • Experience building service-oriented solutions and RESTful microservices is a plus.

  • Experience with Hadoop-based structures, theories, principles, and practices, with data loading tools like Flume, and with workflow schedulers like Oozie is a plus.

  • Knowledge of agile methodologies and the software development lifecycle.

  • Experience in healthcare operations and associated financial transactions such as claims, remittance, eligibility, claim status and authorizations is a plus.

  • Ability to independently complete projects in a timely manner with a high degree of technical accuracy.

  • Ability to prepare, interpret and communicate complex technical information.

  • Possess excellent troubleshooting, code debugging and problem-solving skills.

Java Developer

2 days ago
£630/day · Great rate · Remote · Inside IR35 · Huxley Associates · London

My client, within investment banking, is looking for a Java Developer with experience in end-to-end application development within the front office. The role will focus on the development of new applications used in a trading environment.

Tech Requirements

  • Core Java
  • Spring
  • Front office experience
  • Kafka
  • Cassandra, MySQL, MongoDB
  • Hazelcast, Snowflake, Hadoop


The client is paying up to £630 p/d (inside IR35) on a 6-month rolling contract; initially the role will be fully remote due to Covid-19 restrictions.

Return to the London office will be in line with government advice.

If this role is of interest please apply below.

Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement...... click apply for full job details

Java Developer

3 days ago
£630/day · Great rate · Remote · Inside IR35 · Huxley Associates · London

My client, within investment banking, is looking for a Java Developer with experience in end-to-end application development within the front office. The role will focus on the development of new applications used in a trading environment.

Tech Requirements

  • Core Java
  • Spring
  • Front office experience
  • Kafka
  • Cassandra, MySQL, MongoDB
  • Hazelcast, Snowflake, Hadoop


The client is paying up to £630 p/d (inside IR35) on a 6-month rolling contract; initially the role will be fully remote due to Covid-19 restrictions.

Return to the London office will be in line with government advice.

If this role is of interest please apply below.

Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.

To find out more about Huxley, please visit www.huxley...... click apply for full job details

Big Data Engineer

7 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: London
  • Sector: Data and Analytics
  • Job type: Contract
  • Job functions: Data Scientist
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1596619463
  • Published: 7 days ago
  • Duration: 3 Months
  • Expiry date: 2020-09-04
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either London, Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE. Please do not apply if you do not have active clearance, as we cannot accept applications.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

9 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: London
  • Sector: Data and Analytics
  • Job type: Contract
  • Job functions: Data Scientist
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1596462925
  • Published: 9 days ago
  • Duration: 3 Months
  • Expiry date: 2020-09-02
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either London, Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE. Please do not apply if you do not have active clearance, as we cannot accept applications.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Scala Developer Remote

2 months ago
$60/hour · Remote · Xyant Services

We are looking for Java Developers with Scala experience.

Location: Bentonville, AR. Remote

- Candidates should have hands-on experience with Scala in recent projects.

- Should have at least two years of experience in Scala API development (not in Hadoop).

- Scala with API and microservices development.

Requirements:

  • Minimum of three years of hands-on project experience in Scala & Kafka
  • Must have worked on the Play / Akka framework
  • Should have worked in an Azure, Google Cloud, or AWS environment
  • Five-plus years of Java development experience (Java 8 functional programming is a must)
  • Expertise in toolkits (Akka, SBT) and parsers (Lift JSON)
  • Server-side experience in JDBC, JSP, SAX/DOM, Web Services, SOAP, WSDL, UDDI, JAXB
  • Must be able to write complex SQL statements
  • Should be able to demonstrate experience in:

- SCM: Git, SVN, ClearCase
- Good understanding of the JVM

- Build: NAnt, sbt, FMake, NuGet, gulp

- Application Containers: Apache, Tomcat, Jetty

- Web: WebLogic 5.x/6.x, WebSphere 3.5/4, Play Framework, Spray

  • Experience with NoSQL solutions such as MongoDB or Cassandra is a plus

Job Type: Contract

Salary: $60.00 /hour

ETL Developer

13 days ago
$45 - $50/hour · Remote · Tavant

Role: ETL - 5 years' experience

Location: Columbia SC (100% remote)

W2 only: Cannot work on C2C

Primary Skill: DW & BI

Technical Competencies:

  • Apache Hadoop, Sqoop, Hive SQL (Required) [3+ years of Exp]
  • Oracle Business Intelligence Enterprise Edition 11g (1+ year of Exp)

Major Accountabilities:

  • Develop ETL and database components independently following specifications, standards, and global best practices (a Hive SQL sketch follows after this list)
  • Work with project teams to provide input and assistance in the design, development, and support of ETL and database objects
  • Develop ETL and database components following the standard software development lifecycle
  • Effectively communicate issues and manage the issues to resolution
  • Develop front end OBI reports
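
As a rough sketch of the Hive SQL side of this work (not the project's actual code; the HiveServer2 host, database, table, and column names are invented), a Python job using PyHive might run an aggregation that feeds an OBIEE report:

```python
# Sketch: run a Hive SQL aggregation from Python using PyHive.
# Host, database, table, and column names are hypothetical placeholders.
from pyhive import hive

conn = hive.Connection(
    host="hiveserver2.example.com",  # assumed HiveServer2 endpoint
    port=10000,
    username="etl_user",
    database="staging",
)

cursor = conn.cursor()
cursor.execute(
    """
    SELECT fiscal_month, COUNT(*) AS row_count, SUM(amount) AS total_amount
    FROM sales_raw
    GROUP BY fiscal_month
    ORDER BY fiscal_month
    """
)
for fiscal_month, row_count, total_amount in cursor.fetchall():
    print(fiscal_month, row_count, total_amount)

cursor.close()
conn.close()
```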

Objectives:

  • Work as part of a global development team developing ETLs and database components
  • Follow standard software development lifecycle practices in the development of ETL and database objects
  • Provide quick and accurate error resolution

Additional Skills:

  • Understanding of Business Intelligence dimensional modelling, star schemas, and slowly changing dimensions
  • Understanding of ETL process and development
  • Excellent communication skills
  • Ability to multi-task, work in a fast-paced environment, and manage multiple priorities
  • Ability to learn quickly and retain knowledge
  • Ability to work independently with a strong attention to detail
  • Ability to work well in a global team environment

Requirements:

Requires Bachelor’s degree or foreign equivalent in Computer Science, Information Technology, Computer Engineering, Computer Information Systems, Computer Applications, Math (any), Science (any), Engineering (any), Technology (any), or a related field.

Headquartered in Santa Clara, California, Tavant is a digital products and platforms company that provides impactful results to its customers across North America, Europe, and Asia-Pacific. Founded in 2000, the company employs over 2,000 people and is a recognized top employer. Tavant is creating an AI-powered intelligent lending enterprise by reimagining customer experiences, driving operational efficiencies, and improving collaboration.

It prides itself on its traditions of engineering and process excellence coupled with high employee and customer satisfaction levels. Its unique best-shore delivery model provides close onsite interaction with customers and a strong process-oriented offshore team. At all levels, its employees continually interact to provide a superior outsourcing experience to customers. Tavant's suite of products and services is routinely rated highly by the industry and media and deployed by leading business names such as Disney Streaming Services, CNBC, Grubhub, Experian, TiVo, MLB AM, New York Times, Ingersoll Rand, Land O'Lakes and many more. Tavant is an ISO 27001, SAS 70 Type II, and SEI CMM Level 4 compliant organization.

Find Tavant on LinkedIn and Twitter or visit us at www.tavant.com

Job Type: Contract

Pay: $45.00 - $50.00 per hour

Schedule:

  • 8 Hour Shift
  • Monday to Friday

Experience:

  • ETL: 3 years (Required)

Contract Renewal:

  • Possible

Full Time Opportunity:

  • No

Work Location:

  • Fully Remote

Visa Sponsorship Potentially Available:

  • No: Not providing sponsorship for this job

Genesys/Avaya Data Consultant - 100% Remote

1 month ago
$45 - $50/hour · Remote · eTek IT Service | Savvysol

Contract on W2

Top requirements

  • Genesys Info Mart - do not submit candidates without this
  • SQL database background, so should have SQL, SSIS, and SSRS
  • Strong communication

Nice to have

  • Data lake experience; they have a Hadoop environment
  • Kafka is how they are consuming data
  • Prior Avaya-to-Genesys migration experience - this would be a slam dunk
  • IAX workflow management

IDEAL BACKGROUND: Deep Avaya CMS and Genesys Info Mart background, Kafka development, Sparq, Dremio, Hadoop, Java, and strong SQL development skills.

The position will be responsible for analyzing, designing, configuring, implementing, and supporting Genesys reporting in GI2, Info Mart, and Pulse.

Tools/skills needed:
Genesys, Avaya, SSIS, SSRS, ETL Development, Kafka development
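
Since the team consumes data through Kafka, a minimal consumer sketch (using the kafka-python client; the broker address, topic, group id, and message fields are assumptions, not details from the listing) could look like this:

```python
# Sketch: Kafka consumer feeding contact-centre reporting data into an ETL stage.
# Broker, topic, group id, and field names are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "genesys-interactions",  # assumed topic name
    bootstrap_servers=["kafka01.example.com:9092"],
    group_id="reporting-etl",
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # In practice these records would be staged for an SSIS/ETL load or
    # written to the data lake; here we just show them being read.
    print(record.get("interaction_id"), record.get("queue"), record.get("duration"))
```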

Job Types: Full-time, Contract

Salary: $45.00 - $50.00 per hour

Experience:

  • SQL database: 4 years (Preferred)
  • Avaya: 2 years (Preferred)
  • Genesys Info Mart: 3 years (Preferred)
  • Kafka: 2 years (Preferred)

Application Question:

  • Are you OK with W2?

Work Remotely:

  • Yes

Senior Data Infrastructure Engineer

2 days ago
£84.09 - £88.30/hour · Remote · Harnham

Senior Data Infrastructure Engineer
Stockholm, Sweden
6-month Contract
1000 SEK per hour

As a Senior Data Infrastructure Engineer, you will be responsible for stabilising a Hadoop cluster using Ansible, Terraform and Kubernetes.

THE COMPANY:
This company is a globally established gaming and betting firm that has seen a steady increase in activity since sport started to return. In order to store and secure customer data, the big data team needs to ensure its infrastructure is secure enough to house a Hadoop cluster. You will be automating the development process, introducing CI/CD to the team.

THE ROLE:
As a Senior Data Infrastructure Engineer, you will help advise on how to architect and deploy a big data solution to assist with the vast transformation project the company is undertaking. If you have experience working with the Hadoop ecosystem, this will be the perfect role for you. On the DevOps side of the role, you will be building CI/CD pipelines in Python; you will also get the chance to install Docker containers and work with Kubernetes and Ansible.
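
As a hedged example of the kind of Python step such a CI/CD pipeline might run before a deployment (the NameNode address and the datanode threshold are assumptions, and in practice this would be wired into Jenkins or an Ansible play), a simple cluster health gate could query the NameNode's JMX endpoint:

```python
# Sketch: a CI/CD gate that checks HDFS NameNode health before deploying.
# The NameNode host/port and the minimum-datanode threshold are assumptions.
import sys

import requests

NAMENODE_JMX = "http://namenode.example.com:9870/jmx"  # Hadoop 3.x default UI port
MIN_LIVE_DATANODES = 3

resp = requests.get(
    NAMENODE_JMX,
    params={"qry": "Hadoop:service=NameNode,name=FSNamesystemState"},
    timeout=10,
)
resp.raise_for_status()

state = resp.json()["beans"][0]
live_datanodes = state["NumLiveDataNodes"]
print(f"live datanodes: {live_datanodes}")

# Fail the pipeline stage if the cluster looks unhealthy.
if live_datanodes < MIN_LIVE_DATANODES:
    sys.exit(1)
```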

YOUR SKILLS AND EXPERIENCE:
The ideal Senior Data Infrastructure Engineer will have:

  • Experience working with the Hadoop ecosystem
  • Expertise with Ansible, Docker and Jenkins
  • Reviewed production level code in Java or Python
  • Built CI/CD pipelines

HOW TO APPLY:
Please submit your CV to Henry Rodrigues at Harnham via the Apply Now button.
Please note that our client is currently running a fully remote interview process and is able to onboard and hire remotely as well.

PDI Developer

8 days ago
£490 - £600/day (Estimated) · Remote · Inside IR35 · Whitehall Resources Ltd

PDI Developer

Whitehall Resources are looking for an experienced PDI Developer for an initial 3 month contract.

This role will be remote working initially, and Manchester based when movement restrictions are lifted.

This role has been deemed INSIDE IR35 and will require the use of an umbrella company if successful.

Tasks:
- Ability to work within defined standards and job frameworks.
- Ensure clear understanding of requirements
- Work with Architects and Lead Developers to gain high level understanding of solution architecture
- Should actively participate in stand-ups and sprint meetings
- Experience in troubleshooting Pentaho Data Integrator server including platform and Tools issues
- Responsible for unit testing their own work and peer reviews where required to ensure accurate completion of development tasks
- Familiar with GIT source code repository for code version management and branching
- Experience with using PDI with relational databases

Technologies:
- PDI
- AWS (S3)
- General ETL knowledge
- Cloudera
- Apache Hadoop, Hive, Impala, HDFS, etc.
- SOS Berlin JobScheduler
- Vault
- Jenkins
- Ansible
- General Scripting
- Experience working with Cloudera Hadoop platforms (eg EDH)
- Knowledge of the Data Acquisition Ingestion Pipeline (at least good awareness and understanding of the stages the data goes through, so able to pick up and understand how the spreadsheets work)
- Good knowledge of Pentaho Data Integrator development skills
- They must have Pentaho experience
- Experience with using PDI with relational databases.
- Oracle and MySQL desirable
- Familiar with Git source code repository for code version management and branching
- Operational support of system components
- Software configuration management/Version control
- Software release management/Release management of service improvements

All of our opportunities require that applicants are eligible to work in the specified country/location, unless otherwise stated in the job description.


Role: PDI Developer
Job Type: Contract
Location: Cheshire

Apply for this job now.

Senior Data Engineer

27 days ago
Remote · AARG

Consultants should have experience with traditional and modern technologies such as Apache Spark, NoSQL databases, Python, REST API, relational databases, Snowflake, PostgreSQL, Git, JavaScript, Shell scripting, and AWS.

In this role, consultants will be responsible for creating, maintaining and optimizing their data delivery and extraction from multiple data sources into their data warehouse.

Responsibilities of the role include:

  • Build data pipeline frameworks to automate high-volume and real-time data delivery to our cloud platform (a streaming sketch follows below)
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies
  • Develop and enhance applications using a modern technology stack such as Java, Python, Shell scripting, Scala, Postgres, Angular JS, React, and cloud-based data warehousing services such as Snowflake
  • Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Requirements:

  • Bachelor's degree; accounting/finance business knowledge is a plus
  • 5+ years of experience building data pipelines and using ETL tools to solve complex business problems in an Agile environment
  • 5+ years of experience in at least one scripting language (SQL, Python, Perl, JavaScript, Shell)
  • 3+ years of experience using relational database systems (Snowflake, PostgreSQL, or MySQL)
  • 3+ years of experience working on streaming data applications (Spark Streaming, Kafka, Kinesis, and Flink)
  • 3+ years of experience in big data technologies (MapReduce, Cassandra, Accumulo, HBase, Spark, Hadoop, HDFS, AVRO, MongoDB, or Zookeeper)
  • 3+ years of experience with Amazon Web Services (AWS), Microsoft Azure, or another public cloud service
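
For the streaming side of these pipelines, a minimal Spark Structured Streaming sketch reading from Kafka and landing data in cloud storage might look like the following; the topic, broker, and S3 paths are invented, and the job needs the spark-sql-kafka package matching your Spark version:

```python
# Sketch: Spark Structured Streaming job reading events from Kafka and landing
# them in cloud storage. Topic, brokers, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1.example.com:9092")
    .option("subscribe", "transactions")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the payload as bytes; cast it and keep the event timestamp.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3a://data-lake/raw/transactions/")
    .option("checkpointLocation", "s3a://data-lake/checkpoints/transactions/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```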

Job Types: Temporary, Contract

Schedule:

  • Monday to Friday

Experience:

  • Data: 3 years (Preferred)

Work authorization:

  • United States (Preferred)

Contract Renewal:

  • Likely

Full Time Opportunity:

  • Yes

Work Location:

  • Fully Remote

Solution Architect with SFDC and Azure, middleware backgroun...

19 days ago
Remote · Cloud EPA, LLC

Position: Principal Solutions Architect - Azure Cloud
Location: Wilmington, DE
Duration: 12+ Months
Job Type: W2 Contract/C2C

Primary Responsibilities

  • Experience and passion for service ownership, building reliable/self-healing services.
  • Lead architecture roadmap for IAC (Infrastructure as Code) on Azure
  • Lead architecture and design for public cloud security services especially in Azure
  • Lead architecture roadmap for foundation security services
  • Security and privacy primitives including an understanding of cryptography concepts.
  • Follow trends in technology and apply new security approaches for device-identity, authentication, attestation, key-storage, and management.
  • Prototype new features that will enable and secure both real-time and non-realtime peer-to-peer communication over heterogeneous networks.

Qualifications

  • PhD or Masters in Computer Science, Electrical Engineering, or equivalent
  • 10+ years of hands-on experience working in a security-focused role in the technology industry or another technology-heavy industry.
  • 5+ years of experience developing in public cloud environments, especially Azure
  • Experience with IAC (Infrastructure as Code) on Azure
  • Experience with public cloud security services in Azure
  • Experience with container technologies (e.g., Docker, Kubernetes)
  • Experience with cloud computing architectures and the associated security designs and challenges
  • C#, Go, Java, Python, C/C++
  • Experience with PKI, X509, OWASP, PKCS#11, HSM
  • Knowledge working with relational database MySQL, Postgres
  • Familiar with open source technologies, such as ZooKeeper, MongoDB
  • Experience with big data and pipeline technologies, such as Hadoop, Kafka
  • Good knowledge with operating systems (Linux, Mac, and Windows)
  • Good knowledge with network technologies, such as TCP/IP, DNS or load balancer
  • Experience with Scrum or other agile development methodologies, with attention to code quality and delivering secure code.
  • Skilled in implementing secure modern Identity and Access Management (IAM)
  • In-depth knowledge of common application and infrastructure security vulnerabilities and mitigations
  • Experience implementing zero-trust security models
  • Strong cross-functional leadership and team-building skills
  • Experience engaging with customers regarding security
  • Excellent verbal and written communication skills

Job Type: Contract

Salary: $153,000.00 - $183,000.00 per year

Schedule:

  • Monday to Friday

Experience:

  • DevOps: 4 years (Required)

Contract Length:

  • 1 year
  • More than 1 year

Contract Renewal:

  • Likely

Full Time Opportunity:

  • Yes

Additional Compensation:

  • Commission

Work Location:

  • Fully Remote

This Job Is Ideal for Someone Who Is:

  • Dependable -- more reliable than spontaneous
  • People-oriented -- enjoys interacting with people and working on group projects
  • Adaptable/flexible -- enjoys doing work that requires frequent shifts in direction
  • Detail-oriented -- would rather focus on the details of work than the bigger picture
  • Achievement-oriented -- enjoys taking on challenges, even if they might fail
  • Autonomous/Independent -- enjoys working with little direction
  • Innovative -- prefers working in unconventional ways or on tasks that require creativity
  • High stress tolerance -- thrives in a high-pressure environment

Company's website:

  • cloudepa.com

Work Remotely:

  • Temporarily due to COVID-19

Senior Data Engineer

28 days ago
Remote · Computer Enterprises, Inc.

A Data Engineer II is responsible for coding and continuous testing of complex modules and applications in support of the client’s platform. This role will also be charged with understanding and interpreting requirements to contribute to the technical architecture and the associated design documents.

PRIMARY DUTIES AND RESPONSIBILITIES

  • Writing, debugging, unit testing, and performance testing of code in the data access layer in accordance with the client's standards.
  • As an agile team member, participate in code reviews, design reviews, etc.
  • Utilize domain driven techniques and design patterns to build and contribute to technical design.
  • Develop and maintain strong knowledge of implemented requirements and detailed application behaviors.
  • Assists in the development and training of SE I.

EDUCATION

  • Bachelor's degree in computer information technology, computer science, or management required
  • Master's preferred

EXPERIENCE

  • 5+ years of experience in a cloud computing environment.
  • Strong understanding and familiarity working in the Linux operating environment.
  • Familiarity and experience executing several software development methodologies and life cycles preferred.

SKILLS

  • 5+ years of developing software using object-oriented or functional language experience
  • 5+ years of SQL
  • 3+ years working with open source Big Data technology stacks (Apache Nifi, Spark, Kafka, HBase, Hadoop/HDFS, Hive, Drill, Pig, etc.) or commercial open source Big Data technology stacks (Hortonworks, Cloudera, etc.)
  • 3+ years with document databases (e.g. MongoDB, Accumulo, etc.)
  • 3+ years of experience using Agile development processes (e.g. developing and estimating user stories, sprint planning, sprint retrospectives, etc.)
  • 2+ years of distributed version control system (e.g. git)
  • 3+ years of experience in cloud-based development and delivery
  • Familiarity with distributed computing patterns, techniques, and technologies (e.g. ESB)
  • Familiarity with continuous delivery technologies (e.g. Puppet, Chef, Ansible, Docker, Vagrant, etc.)
  • Familiarity with build automation and continuous integration tools (e.g. Maven, Jenkins, Bamboo, etc.)
  • Familiarity with Agile process management tools (e.g. Atlassian Jira)
  • Familiarity with test automation (Selenium, SoapUI, etc.)
  • Good software development and object-oriented programming skills
  • Strong analytical skills and the ability to work with end users to transform requests into robust solutions
  • Excellent oral and written communication skills
  • Initiative and self-motivation to work independently on projects

Job Type: Contract

Schedule:

  • Monday to Friday

Work Location:

  • Fully Remote

Azure Engineer - 100% Remote

1 month ago
$70 - $100/hour · Remote · eTek IT Service | Savvysol

Contract on W2

Responsibilities:

  • Attend planning and scrum meetings with the DevOps team and Data Lake Product Dev teams.
  • Assess Azure Data Factory data ingestion pipelines and make recommendations for improvement.
  • Assess the Azure DevOps tools and configurations for Continuous Integration/Continuous Delivery.
  • Liaison only. Will not be coding or creating any data pipelines or doing any DevOps hands-on work.

Must haves:

  • Azure Data Factory (ADF)
  • Azure Data Lake Store (ADLS)
  • Azure DevOps (ADO), a Jenkins-like tool

Nice to haves:

  • Azure Databricks
  • Hadoop or Big Data experience
  • GitHub source control experience
  • Continuous Integration / Continuous Delivery (CI/CD)

Job Types: Full-time, Contract

Salary: $70.00 - $100.00 per hour

Experience:

  • Azure DevOps: 5 years (Preferred)
  • Hadoop or Big Data: 4 years (Preferred)
  • Azure Data Lake Store: 4 years (Preferred)
  • Azure Data Factory: 4 years (Preferred)
  • Azure: 7 years (Preferred)

Application Question:

  • Are you comfortable working on W2?

Work Remotely:

  • Yes