
20 remote Hadoop contracts

Hadoop Engineer (100% Remote)

15 days ago
Remote · Gimmko Technologies

Hadoop Engineer
Location: 100% Remote

Duration: 12-18+ months

VISA: Citizen, GC, GC-EAD, H4
Job Description

  • Apache Hadoop and the Cloudera distribution are a must
  • Must be a hard-core engineer, not a developer, architect, or admin: an engineer who can “fix” things and push back on the developers.

Analyzes, designs, creates and implements Big Data infrastructures, including access methods, device allocations, validation checks, organization and security. Designs data models, logical and physical infrastructure designs, etc. Assists in system planning, scheduling, and implementation. Initiates corrective actions to stay on schedule. Installs, upgrades, and tests complex big data deployments. Develops and implements recovery plans and procedures. Disciplines: Hadoop design and analysis.

Involved in the analysis, design, development and implementation of software applications. Determines user requirements, leads application design, plans projects, establishes priorities and monitors progress.

Skills:

  • Solid administrative knowledge of Apache Hadoop and the Cloudera distribution (a must)
  • BI tool integration with Hadoop
  • DBA experience with HBase
  • Experience with database replication and scaling
  • Design, install, and maintain highly available systems (including monitoring, security, backup, and performance tuning)
  • Linux (RHEL) proficiency a must
  • Scripting experience (a minimal health-check sketch follows this list)
  • Automation experience (Chef/Puppet)

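As a rough illustration of the scripting-plus-administration combination this posting asks for, here is a minimal Python health-check sketch. It assumes the "hdfs" CLI is on the PATH of a configured cluster node; the parsing relies on the usual "Live datanodes (N):" line in the dfsadmin report.

import subprocess

def live_datanode_count() -> int:
    """Return the number of live DataNodes reported by the cluster."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],   # standard Hadoop admin command
        capture_output=True, text=True, check=True,
    ).stdout
    for line in report.splitlines():
        if line.startswith("Live datanodes"):
            # The line normally looks like: "Live datanodes (3):"
            return int(line.split("(")[1].rstrip("):"))
    raise RuntimeError("no 'Live datanodes' line in dfsadmin output")

if __name__ == "__main__":
    print(f"live datanodes: {live_datanode_count()}")
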
Must possess good analytical and problem-solving skills

Job Type: Contract

Experience:

  • Hadoop Engineer: 5 years (Preferred)

Application Question:

  • Can you work on W2?

Benefits:

  • None

Schedule:

  • Monday to Friday

Azure DevOps Engineer with Hadoop Data

5 days ago
Remote · Riya Software Consulting
  • Experience of the IT services business and engineering of multi-year managed services
  • 10-15+ years of demonstrable enterprise-level IT delivery experience
  • Cloud (5 years) – an engineering role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms. Experience engineering technical solutions for Microsoft-centric environments based on industry standards, using Azure or other cloud providers' IaaS, PaaS, and SaaS capabilities. Nice to have: experience with any of the following: Azure, Azure Stack, Azure AD
  • Scripting (5 years) – knowledge of and experience with Microsoft PowerShell (required) and Linux shell scripting (preferred). Able to apply best practices for completing tasks, using existing scripts or building new ones to help automate today’s manual infrastructure and application tasks.
  • Functional knowledge of programming, scripting, and data science languages such as JavaScript, PowerShell, Python, Bash, SQL, .NET, Java, PHP, Ruby, Perl, C++, R, etc.
  • Understanding of CI/CD delivery using code management, configuration management, and automation tools such as GitHub, VSTS, Ansible, DSC, Puppet, Ambari, Chef, Salt, Jenkins, Maven, etc.
  • Configuration Management (3 years) – able to develop recipes for new solutions to deploy consistent systems and to enforce configurations and settings through configuration management systems like PowerShell DSC, Chef, or Puppet.
  • Tools & environment: Hadoop Cloudera Manager, Hadoop stack (Hive, Ranger, Atlas, Spark, NiFi, Impala), SQL Server 2012/2016, Power BI suite, Tableau.
  • Experience setting up and administering a Hadoop cluster environment, including adding and removing cluster nodes, cluster capacity planning, and performance tuning. Experience running Hadoop jobs to process large volumes of records, monitoring and sizing clusters, developing Hive queries to perform data analysis on large datasets (a minimal sketch follows this list), and supporting various solution teams with guidance around data quality and integrity.
  • Networking (1-3 years) – a solid understanding of networking capabilities, including load balancers, web application firewalls, network access control lists (NACLs), security groups, routing, tracing, and DNS resolution, is key to building efficient and stable solutions that prevent business downtime and provide high availability.
  • Documentation (2-3 years) – able to create technical documents that give insight into the design and implementation of a solution, so teams can effectively communicate their needs and requirements across the organization. The candidate must produce clear and concise architecture and design documentation that helps their team and peer groups understand the built solutions, and must be able to communicate complex technical issues, with sensitivity, to audiences with different levels of technical understanding, from entry-level support teams to management and technical engineering resources.
  • Operational support awareness (5 years) – a good understanding of what it takes to support the deployed applications and solutions is key to providing great service to end users. The candidate must be able to put themselves in the consumers' position, understand their pain points, and not only find quick workarounds but also analyze the root cause of a problem and propose long-term resolutions.
  • Source code management (3 years) – familiarity with source control tools such as Git, Team Foundation Server (TFVC), and SVN is a big part of automation and compliance. Understanding how these tools are and could be used is key not only to providing proper change control for new versions of products under development, but also to deploying those products/solutions consistently across environments using automation.

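The Hive data-analysis duty above can be pictured with a short PySpark sketch. This is illustrative only: it assumes a Hive-enabled Spark session, and the "web_logs" table is a hypothetical name, not from the posting.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-analysis-sketch")
    .enableHiveSupport()   # requires a configured Hive metastore
    .getOrCreate()
)

# "web_logs" is a made-up Hive table used purely for illustration.
daily_counts = spark.sql("""
    SELECT dt, COUNT(*) AS events
    FROM web_logs
    GROUP BY dt
    ORDER BY dt
""")
daily_counts.show(10)
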
Job Type: Contract

Experience:

  • Microsoft PowerShell: 5 years (Required)
  • Configuration Management: 3 years (Required)
  • Azure Cloud Engineering: 5 years (Required)
  • Hadoop Cloudera Manager, Hadoop Stack: 2 years (Required)

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Data Scientist (Remote)

1 month ago
$55 - $65/hour · Remote · Clairvoyant

Data Scientist - Contract / Contract-to-hire - Remote

Skills:
Required: Python, core SQL, data engineering, hands-on AWS.
Good to have: R, AWS EMR, Looker.

  • Proficient in one or more programming languages such as Python, Java, Scala, and R
  • Familiar with one or more machine learning or statistical modeling tools such as R, scikit-learn, and Spark MLlib (a short example follows this list)
  • Practical experience with distributed data platforms: MapReduce, Hadoop, Spark
  • Knowledge and experience working with relational databases and SQL; demonstrated flexibility in working with large, complex, and ambiguous datasets
  • Enhance our machine learning software with the latest machine learning algorithms
  • Work successfully in a highly cross-functional environment
  • Strong analytical and quantitative problem-solving ability
  • Excellent communication and relationship skills; a strong teammate
  • Engineering skills to put your research into practice
  • Take ownership of a system that is the core of our intelligence products
  • Experience with data visualization tools such as Tableau and Power BI

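For the machine learning tooling listed above, a minimal scikit-learn example might look like the following; it is a generic sketch on a bundled toy dataset, not anything specific to this role.

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy dataset standing in for real project data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
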
Job Types: Full-time, Contract

Salary: $55.00 to $65.00 /hour

Experience:

  • AWS: 5 years (Required)
  • Data Scientist: 8 years (Required)
  • R: 5 years (Required)
  • SQL: 5 years (Required)
  • Python: 5 years (Required)

Work authorization:

  • United States (Required)

Work Remotely:

  • Yes

SAS Python Data Engineer (GC holder or US citizen)

29 days ago
$55 - $65/hour · Remote · CLS Bank International

GC holder or US citizen ONLY

SAS Python Data Engineer

Location: Remote - Pittsburgh, PA

Duration: 12 months plus extensions

Qualifications

  • At least 3 years of experience developing production Python code
  • A strong understanding of SQL
  • Experience with SAS
  • Solid understanding of software design principles

3-4 years’ experience writing production-grade code building data objects for ETL, data pipelines, and APIs against a variety of data sources (SAS, Oracle, Teradata, Hadoop/Cloudera, Hive) using SQL, Python, and SAS DI.

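The ETL work described above might look, in its simplest Python form, like the sketch below. The connection strings, table, and column names are hypothetical, not taken from the posting.

import pandas as pd
import sqlalchemy as sa

# Hypothetical source and target databases
source = sa.create_engine("oracle+oracledb://user:pass@src-host/orcl")
target = sa.create_engine("postgresql+psycopg2://user:pass@dwh-host/dwh")

# Extract
orders = pd.read_sql("SELECT order_id, amount, order_ts FROM orders", source)

# Transform: basic cleaning of the kind a production pipeline would codify
orders = orders.dropna(subset=["order_id"])
orders["amount"] = orders["amount"].astype(float)

# Load into a staging table
orders.to_sql("stg_orders", target, if_exists="replace", index=False)
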
Job Types: Full-time, Contract

Salary: $55.00 to $65.00 /hour

Contract Length:

  • More than 1 year

Contract Renewal:

  • Likely

Work Location:

  • One location
  • Fully Remote

Benefits:

  • None

Visa Sponsorship Potentially Available:

  • No: Not providing sponsorship for this job

Schedule:

  • Monday to Friday

Work Remotely:

  • Temporarily due to COVID-19

AI Engineer

1 month ago
Remote · NUES LLC

Position Overview:

NUES LLC is seeking an Artificial Intelligence / Machine Learning Engineer to join a CRO team based in Maryland to develop a software platform that enhances recruitment in clinical trials.

Open to remote working. Preference for candidates willing to work in the DMV (DC, Maryland, Virginia) area.

Desired Skills/Experience:

  • Healthcare information architecture
  • Big Data/AI algorithms, natural language processing, and/or machine learning
  • Fluency in programming languages such as Java, Scala, or Python
  • Experience transforming raw medical data into actionable patient information
  • Experience deploying machine learning, rule-based, and statistical models
  • Familiarity with developing data-driven insights and machine learning models
  • Familiarity with identifying and extracting data from electronic medical records
  • Demonstrated ability to code in a modern language like Python
  • AI technology in healthcare
  • Experience with Spark or Hadoop
  • Real-World Evidence (RWE) from medical claims, labs, medical records, and prescription data
  • Demonstrated experience with computing systems such as AWS, Hadoop, Azure, Google Cloud
  • Other responsibilities as needed to meet project objectives

Qualifications:

  • Master's degree or higher in Computer Science or another STEM-related field
  • 7+ years of relevant experience in software engineering and machine learning
  • Strong time management, technical, and organizational skills
  • Ability to work independently and within a team environment

Job Types: Part-time, Temporary, Contract

Experience:

  • Programming skills in Python, Java, or Scala: 5 years (Preferred)
  • deploying ML and statistical models into real-world app: 5 years (Preferred)
  • software engineering and Machine Learning: 5 years (Preferred)
  • AWS, Hadoop, Azure, Google Cloud: 5 years (Preferred)

Education:

  • Master's (Preferred)

Work authorization:

  • United States (Required)

Contract Length:

  • 5 - 6 months
  • 7 - 11 months

Data Engineer x 6

23 days ago
Remote · Outside IR35 · Talent International

(Outside IR35) Data Engineers x 6 - MUST HAVE VALID SC CLEARANCE

  • Initial 3 Month Contract
  • 100% Remote Working
  • Hadoop Programming
  • Python, Big Data, Spark tech stack
  • Current SC Clearance required

We're looking for experienced Data Engineers for a brand-new greenfield project and a huge data transformation effort, launched in light of COVID-19.

What you need to know

  • Proven experience of data engineering, including data wrangling, profiling, and preparation
  • Proven experience of big data environments within the Hadoop stack, including data ingestion, processing, and storage using HDFS, Spark, Hive, Python, Impala, and Cloudera
  • Experience developing ETL functionality in cloud or on-premise environments
  • Experience using tools such as Python and Spark SQL

What you will be doing

  • Working with members of the Data Engineering team to develop automated coding solutions for a range of ETL, data cleaning, structuring, and validation processes (a minimal sketch follows this list).
  • Working with large semi-structured datasets to construct linked datasets derived from multiple underlying sources as well as supporting the wider team in delivering a range of data profiles across key strategic administrative data flows.
  • Assisting in a range of ETL and warehousing design projects

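A minimal PySpark sketch of the ingest-validate-store loop described above; the paths, schema, and validation rules here are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Ingest semi-structured landing data (hypothetical path)
raw = spark.read.json("hdfs:///landing/admin_data/*.json")

# Validate: drop records missing a key field, flag out-of-range values
clean = (
    raw.filter(F.col("record_id").isNotNull())
       .withColumn("suspect", F.col("value") < 0)
)

# Store in a columnar format for downstream profiling
clean.write.mode("overwrite").parquet("hdfs:///curated/admin_data/")
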
Next Steps

If this is relevant to you and something you would like to apply for, get in touch with James today.

Big Data Engineer

1 month ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: Wales
  • Sector: Data and Analytics
  • Job type: Contract
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1586960550
  • Published: 28 days ago
  • Duration: 3 Months
  • Expiry date: 15 May 00:00
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

1 month ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: Hampshire
  • Sector: Data and Analytics
  • Job type: Contract
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1587621849
  • Published: 20 days ago
  • Duration: 3 Months
  • Expiry date: 23 May 00:00
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

1 month ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: Hampshire
  • Sector: Data and Analytics
  • Job type: Contract
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1587040829
  • Published: 25 days ago
  • Duration: 3 Months
  • Expiry date: 16 May 00:00
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

1 month ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: Wales
  • Sector: Data and Analytics
  • Job type: Contract
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1587551850
  • Published: 22 days ago
  • Duration: 3 Months
  • Expiry date: 22 May 00:00
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Cloud Geospatial data engineer - Remote (MS2111)

20 days ago
Remote · Akvelon, Inc.

Requirements:

We are looking for highly skilled individuals with experience in big data geoscience and geospatial intelligence to support our work on environmental sustainability and conservation.

The position is a full-time, one-year contract, starting as soon as possible. We are open to any location but prefer Western time zones.

Required Skills:

  • Cloud geospatial data pipeline engineer
  • Responsible for on-boarding data to Azure
  • Fluency with Python required
  • Fluency with a cloud required (it doesn’t have to be Azure)
  • Fluency with big data geospatial platforms strongly preferred
  • Experience in geospatial intelligence/science Big Data
  • Cloud – experience with distributed computing frameworks (Kubernetes, Hadoop, Spark, Dask, Azure Batch, etc.) required, though not necessarily fluency in all of the above (a minimal Dask sketch follows this list)
  • A Bachelor’s degree or higher is required for this role.

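As a hedged illustration of the distributed-computing requirement, here is a minimal Dask sketch. The storage path (an "az://" URL, which requires the adlfs package) and the column names are invented for illustration.

import dask.dataframe as dd

# Lazily read a partitioned dataset of observation records
obs = dd.read_parquet("az://landing/observations/")

# Aggregate per spatial cell; .compute() triggers distributed execution
summary = obs.groupby("grid_cell")["ndvi"].mean().compute()
print(summary.head())
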
Status: 12-month contract
Location: Redmond, WA (can be REMOTE)
Job ID: MS2111

Since 2000, Akvelon has specialized in placing top software engineering talent at Fortune 500 companies and start-ups alike. We were ranked in Comparably’s 2018 list of Top 15 Best Companies in Seattle, and were voted one of the Puget Sound Business Journal’s fastest-growing companies for several years.

Akvelon is an Equal Opportunity Employer - All qualified applicants will receive consideration. We do not discriminate on the basis of race, color, religion, gender, national origin, age, disability, veteran status, or any other factor determined to be unlawful under applicable law.

Job Types: Full-time, Contract

Work Location:

  • Fully Remote

Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Retirement plan

Schedule:

  • Monday to Friday

Big Data Lead Developer

20 days ago
$60 - $65/hour · Remote · Binary tech consulting corp

This is a 100% Remote Role - Long Term Project.

Client is seeking a Big Data Cloud Lead and a Developer for a 12+ month project, working during PST hours.

1. Lead - 12+ years of experience required (US citizens only)

2. Developer - 7 to 9 years of experience required

  • 6-8 years of software development/programming experience in enterprise cloud-based data applications (12+ years for the lead role)
  • 6-8 years of experience in data modelling, data design, and persistence (e.g. warehouses, data marts, data lakes) (12+ years for the lead role)
  • Experience with functional, imperative, and object-oriented languages and methodologies
  • Experience supporting Big Data and Hadoop
  • Experience with Big Data approaches and technologies, including Hadoop, Cloudera utilities, Spark, Kafka, Hive, and Oozie
  • Experience with SQL (SQL Server, MySQL, Postgres) and NoSQL (Cosmos/Mongo/HBase) databases expected
  • Exposure to programming languages/tools including C#, Java, Python, Ruby, Scala, and SQL, plus scripting (Java, Python, Spark, SQL, Hive, JavaScript, shell)
  • Distributed systems experience (4+ years desired)
  • Knowledge of various design patterns and technologies that enable business problem solving at scale
  • Great communication skills to collaborate across groups and work effectively within the team

Job Type: Contract

Salary: $60.00 to $65.00 /hour

Experience:

  • Distributed Systems: 4 years (Preferred)

Contract Length:

  • More than 1 year

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Work Remotely:

  • Yes

Business Intelligence Architect (Onsite / Remote locations)

28 days ago
Remote · Inabia Software & consulting Inc.

Client: Redapt

Location: This position is flexible to working in-office in Woodinville or remotely in Atlanta, Dallas, Denver, Irvine or Cincinnati.

Required Technical Skills:

  • Ability to appropriately architect complex data solutions utilizing SQL Server architecture. This includes storage, replication, server tuning, upgrading, backup/restore, security, etc.
  • Ability to write complex SQL queries
  • TSQL knowledge, DML/DDL, triggers, CTEs, query tuning, etc.
  • Strong knowledge of SSIS: script tasks, checkpoints, recordset objects, package vs. project deployment.
  • Practical knowledge of how to design complex SQL Server Integration Services (SSIS) packages
  • Knowledge of SQL Server Reporting Service and Power BI.
  • Knowledge of Azure SQL, and other Azure offerings centered around Data and Analytics (Azure Machine Learning, Data Lake, Data Factory, Streaming Analytics, Table Storage, Hadoop, etc.)
  • BS or MS in Computer Science, Engineering, or Mathematics preferred; equivalent work experience will be considered

Preferred:

  • Strong understanding of parallel processing utilizing Azure SQL Data Warehousing for both structured and unstructured data sources.
  • Experience writing U-SQL and Azure Data Lake Analytics
  • Experience with HDInsight, Spark, and Hadoop

Job Type: Contract

Experience:

  • TSQL: 2 years (Required)
  • HDInsight, Spark, and Hadoop: 1 year (Preferred)
  • Power BI: 2 years (Required)
  • writing U-SQL and Azure Data Lake Analytics: 1 year (Preferred)
  • SQL Server: 3 years (Required)
  • Azure SQL: 2 years (Preferred)

Work Remotely:

  • Temporarily due to COVID-19

Spark SQL Data Consultant (Python, Teradata)

1 month ago
£500 - £540/day · Remote · IT Talent Solutions Ltd

IT Talent are representing a global Ecommerce organisation in London with over 150 million users worldwide. We are recruiting for an expert Data and Process Automation Engineer on an initial 6 month contract basis.

This is a high impact role within our client's Indirect tax technology team and will report to the Senior Manager for Analytics and Automation.

The post will be remote working initially.

Purpose of your role

  • Develop and maintain complex VAT processes involving checks for seller non-compliance, automated seller actions, and reporting
  • Design and implement process automations
  • Gather business requirements for process automations
  • Implement, review, and optimize reporting processes and data analytics
  • Data visualization and analysis
  • Maintain a close and proactive relationship with other business functions such as the Customer Services, Legal, and Finance teams in respect of issues identified during the reporting and compliance process
  • Participate in accurate and timely data preparation for data reporting obligations and submission to tax authorities, and ensure that our client fulfills the requirements to avoid joint and several liability for VAT

Skills required

  • Excellent technical skills for process development and automation, especially Python programming and Spark SQL (a short sketch follows this list)
  • Proficient in handling big data (Hadoop, Teradata, etc.)
  • Proficient in Tableau and Excel
  • Agile experience
  • Ability to translate commercial requirements into software solutions
  • Capable of working independently while acting as part of a global tax team
  • Able to handle a high number of projects and to prioritize

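A short, hedged sketch of the kind of Spark SQL compliance check this role implies. The "transactions" and "sellers" tables, their columns, and the threshold are invented for illustration, and the tables are assumed to be registered in the session catalog.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("vat-check-sketch").getOrCreate()

flagged = spark.sql("""
    SELECT s.seller_id, SUM(t.net_amount) AS taxable_turnover
    FROM transactions t
    JOIN sellers s ON s.seller_id = t.seller_id
    WHERE s.vat_number IS NULL
    GROUP BY s.seller_id
    HAVING SUM(t.net_amount) > 85000  -- hypothetical registration threshold
""")
flagged.show()
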
Experience required

  • 8-10 years' hands-on experience in process automation and data analytics
  • Proven knowledge in Python programming and Spark SQL

Please send us your CV for immediate consideration.

Keywords

Tax Process Automation Engineer, Spark SQL, Python, process automation, data analytics, tax, Hadoop, Teradata

Data Analytics Engineer

1 month ago
£500 - £600/day · Remote · Harnham

Data Analytics Engineer - Remote Working
6-month Contract
London/Home Working
£550 per day

As a Data Engineer, you will be building an on-prem data platform for a niche online security start-up.

THE COMPANY:
This company has a solid software product that has attracted a large base of subscribing members. They now need to utilise this data to improve their service. They would like to build an analytical platform that will allow them to track and monitor subscriptions, better understand their customers, and attract different market segments. You will go in as the sole Data Engineer, advising on how this strategy could be devised and what they will need to build it.

THE ROLE:
As a Data Engineer, you will be required to build a Hadoop-based solution and process data using Spark. The platform will host big data, with a view to analysts building their reports on it. You will build ELT pipelines using Python and integrate data sets. The vision is eventually a single customer view, but first the platform needs to be robust enough to house such large volumes of data.

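A sketch of the kind of PySpark join that moves toward a single customer view; the dataset locations and column names are hypothetical, not from the posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scv-sketch").getOrCreate()

accounts = spark.read.parquet("hdfs:///curated/accounts/")
subs = spark.read.parquet("hdfs:///curated/subscriptions/")

# One row per customer, summarising their subscription activity
customer_view = (
    accounts.join(subs, "customer_id", "left")
            .groupBy("customer_id")
            .agg(F.count("subscription_id").alias("subscription_count"),
                 F.max("renewed_at").alias("last_renewal"))
)
customer_view.write.mode("overwrite").parquet("hdfs:///marts/customer_view/")
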
YOUR SKILLS AND EXPERIENCE:
The ideal Data Engineer will have:

  • Expert coding skills in Python
  • An understanding of the Hadoop eco-system
  • To take a consultative approach on how to build an Analytics platform
  • A strong understanding of Spark

HOW TO APPLY:
Please submit your CV to Henry Rodrigues at Harnham via the Apply Now button.
Please note that our client is currently running a fully remote interview process, and is able to onboard and hire remotely as well.

Big Data Engineer

1 month ago
£500 - £600/day · Remote · La Fosse Associates
  • Location: Wales
  • Sector: Analytics and Insight, Business Intelligence, Data and Analytics
  • Job type: Contract
  • Salary: £500 - £600 per day
  • Contact: Sophie Faithfull
  • Contact email: sophie.faithfull@lafosse.com
  • Job ref: LFA - 50642_1587141068
  • Published: 24 days ago
  • Duration: 3 Months
  • Expiry date: 17 May 00:00
  • Start date: ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud-based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Java/Spark Developer

27 days ago
Remote · koreminds llc

NOTE:

· Fully remote while COVID-19 impacts last, depending on the situation.

· W2 candidates only (USC and GC).

· No C2C, please.

Role: Java/Spark Developer

Location: Wilmington, DE

Duration: Contract-to-hire, 6+ months (full time)

Years of experience: 5-8 years max

Job description

· 4+ years of experience with Java, Spark, Hadoop, Hive, and SQL

· Proficient in Big Data application development.

· Working proficiency in Big Data tool set to design, develop, test, deploy, maintain and improve software.

· Demonstrates understanding of Agile methodologies, with the ability to work in at least one of the common frameworks.

· Demonstrates understanding of techniques such as Continuous Integration, Continuous Delivery, Test Driven Development, Cloud Development, application resiliency and security.

· Working proficiency in Spark, Java.

· Working proficiency in a portion of software engineering disciplines, with an understanding of overall software skills including business analysis, development, testing, deployment, maintenance, and improvement of software.

I believe you’d be a great addition to our team and would like to know your interest and availability. If you are interested, reply with your latest CV for further review and interview processes.

Looking forward to your immediate response…

Thanks & Regards

Anwar Gilani | KoreMinds Inc | Technical recruiter

250 International Pkwy, Lake Mary FL 32746-5030

Direct: 315-344-1165

Website: https://www.koreminds.com/

Job Types: Full-time, Contract

Salary: $60.00 to $65.00 /hour

Experience:

  • Java/Spark, Hadoop: 4 years (Preferred)

Application Question:

  • W2 role, no C2C; USC and GC visa holders only. Contract-to-hire.

Work Remotely:

  • Temporarily due to COVID-19

Sr. Python Developer

24 days ago
$65 - $85/hour · Remote · Catalyte

Position: Sr. Python Developer

Responsibilities:

The Engineer will actively participate in Scrum development teams and meetings. Additionally, the Engineer will be responsible for working with a highly functional team developing and automating data ingest, optimizing system and search performance, integrating with enterprise authentication services, and establishing/improving system monitoring, while maintaining established security protocols across development, test, and production systems.

· Senior Python Developer with good experience in Python, Pandas/NumPy/SciPy, and RESTful APIs

· Expertise in at least one popular Python framework (like Django, Flask, or Tornado); Spark/Kafka/Hadoop a plus

· Full Stack Engineer capable of designing solutions, writing code, testing code, automating test and deployment

· Overall delivery of software components working in collaboration with product and design teams

· Collaborating with other technology teams to ensure integrated end-to-end design and integration.

· Enforcing existing process guidelines; drives new processes, guidelines, team rules, and best practices.

· Ready, willing, and able to pick up new technologies and pitch in on story tasks (design, code, test, CI/CD, deploy etc.)

· Ensures efficient execution of overall product delivery by prioritizing, planning, and tracking sprint progress (this can include the development of shippable code).

Qualifications:

  • Expert with Python Development
  • 10+ years of Python Development experience
  • Bachelor’s/Master’s degree in Computer Science or a related quantitative field.
  • Knowledgeable in cloud platforms (preferably AWS: both traditional EC2 and serverless Lambda)
  • Deep Experience with micro-services architecture, CI/CD solutions (including Docker), DevOps principles
  • Understanding of the threading limitations of Python, and multi-process architecture
  • Solid foundation and understanding of relational and NoSQL database principles.
  • Experience working with numerical/quantitative systems, e.g., pandas, NumPy, SciPy, and Apache Spark.
  • Experience in developing and using RESTful APIs.
  • Expertise in at least one popular Python framework (like Django, Flask, or Tornado); a minimal example follows this list
  • Experience in writing automated unit, integration, regression, performance, and acceptance tests.
  • Solid understanding of software design principles
  • Proven track record of executing on the full product lifecycle (inception through deprecation) to create highly scalable and flexible RESTful APIs to enable an infinite number of digital products.
  • Self-directed with a start-up/entrepreneur mindset.
  • Ravenous about learning technology and problem-solving.
  • Strong writing and communication skills.

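For the Python framework and REST API requirements above, a minimal Flask sketch (Flask 2+) might look like this; the /health and /items routes are illustrative, not a real product API.

from flask import Flask, jsonify

app = Flask(__name__)
ITEMS = [{"id": 1, "name": "example"}]  # stand-in for a real data store

@app.get("/health")
def health():
    return jsonify(status="ok")

@app.get("/items")
def items():
    return jsonify(ITEMS)

if __name__ == "__main__":
    app.run(port=8000)
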
Job Type: Contract

Salary: $65.00 to $85.00 /hour

Experience:

  • Python: 8 years (Preferred)

Work authorization:

  • United States (Required)

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Work Remotely:

  • Yes

Senior Business Intelligence Architect (Remote)

28 days ago
$60 - $70/hour · Remote · Inabia Software & consulting Inc.

Senior Business Intelligence Architect

If you’re interested in designing modern cloud database solutions on Microsoft Azure and love helping customers solve complex problems related to Business Intelligence, Advanced Analytics, and Data Science, we’d like to hear from you!

We're looking for a versatile BI Data Architect who can design and implement modern architecture for a diverse set of customers and industries. The ideal candidate has deep experience designing enterprise data warehousing solutions using SQL Server, Azure SQL DW, and Big Data platforms (e.g., Azure Data Lake, Azure Data Factory). This position will be responsible for leading customer conversations, creating and presenting project architecture, and leading delivery.

Primary Responsibilities:

  • Interfacing directly with clients to solve broad business goals with database solutions.
  • You will be responsible for gathering requirements, designing solutions, and overseeing the development and execution of projects.
  • This role will be directly involved in the business development process; specifically delivering customer demos to show the value of how data can drive business goals.
  • Be able to build pilot solutions and proof-of-concepts with minimal direction.
  • Provide support to project manager through developing tasks, estimates, and dependencies to meet expectations

Required Technical Skills:

  • Ability to appropriately architect complex data solutions utilizing SQL Server architecture. This includes storage, replication, server tuning, upgrading, backup/restore, security, etc.
  • Ability to write complex SQL queries
  • TSQL knowledge, DML/DDL, triggers, CTEs, query tuning, etc.
  • Strong knowledge of SSIS: script tasks, checkpoints, recordset objects, package vs. project deployment.
  • Practical knowledge of how to design complex SQL Server Integration Services (SSIS) packages
  • Knowledge of SQL Server Reporting Service and Power BI.
  • Knowledge of Azure SQL, and other Azure offerings centered around Data and Analytics (Azure Machine Learning, Data Lake, Data Factory, Streaming Analytics, Table Storage, Hadoop, etc.)
  • BS or MS in Computer Science, Engineering, or Mathematics preferred; equivalent work experience will be considered

Preferred:

  • Strong understanding of parallel processing utilizing Azure SQL Data Warehousing for both structured and unstructured data sources.
  • Experience writing U-SQL and Azure Data Lake Analytics
  • Experience with HDInsight, Spark, and Hadoop

Who you are:

  • Excels with ambiguity and can design a clear path to meet end goals
  • Has a broad understanding of BI and data technologies and how to match the right technologies to solutions
  • Has proven experience with both the customer-facing and solution-engineering skill sets

Job Type: Contract

Salary: $60.00 to $70.00 /hour

Experience:

  • TSQL knowledge, DML/DDL: 5 years (Required)
  • Power BI: 3 years (Required)
  • BI architecture: 8 years (Required)

Schedule:

  • Monday to Friday

Data Warehouse Architect

24 days ago
$85 - $90/hour · Remote · HGS Digital

Join HGS Digital and help build strategies and systems that solve our clients' most complex and interesting business problems. If you are invigorated by new challenges, seek out new learning opportunities, lead from the front, challenge the status quo, and develop the skills of team members through technical and professional mentoring, this role is for you.

RESPONSIBILITIES

  • Lead analysis, architecture, design, and development of cloud data lake, warehouse and business intelligence solutions
  • Actively contribute to the cloud and big data community at HGS Digital and drive new capabilities forward
  • Define cloud data strategy, including designing multi-phased implementation roadmaps
  • Display passion for solving data-oriented problems in an analytical and iterative fashion that meets customers' needs
  • Big data architectures and patterns both on premises and in the cloud
  • Lead in MDM and Governance planning and workshops for data architecture and flow
  • Database design, development & management in relational, Data Lake, EDWs and NoSQL solutions
  • ETL/ELT design and development with tools such as Informatica, Spark, or Airflow (a skeletal DAG sketch follows this list)
  • Data ingestion and management concepts of cataloguing, lifecycle and lineage
  • Working with various kinds of data (streaming, structured, unstructured, metrics, logs, json, xml, parquet, etc.)
  • Working in various agile methodologies (Scrum, Kanban, SAFe)
  • Working with approach, platforms and best practice for reporting and visualization tools
  • Assist business development teams with pre-sales activities, including helping estimate and plan projects

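The ETL/ELT orchestration mentioned above can be pictured as a skeletal Airflow DAG; the task bodies, IDs, and schedule are placeholders, not a real pipeline.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # e.g. pull from source systems

def transform():
    pass  # e.g. clean and conform the data

def load():
    pass  # e.g. write to the warehouse

with DAG(
    dag_id="warehouse_etl_sketch",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
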
QUALIFICATIONS

  • 12+ years of IT experience & 3+ years of AWS cloud experience (additional Azure cloud experience is a plus)
  • Strong decision-making skills in data analysis and the ability to architect large data solutions
  • Deploying and monitoring scalable infrastructure in Amazon web services (AWS), DevOps tools and methodologies
  • Deep understanding and working experience in Big Data technologies such as Hadoop, Hortonworks & AWS Big Data solutions
  • Solid understanding of Big Data ETL
  • Experience in architecting solutions around AWS EMR, Kinesis (Data Streams/Analytics), Kafka, Spark, Hive, HDFS, Hadoop (Hortonworks), S3, RedShift, DynamoDB, RDS, Lambda
  • Experience with Informatica Cloud for ETL
  • Strong understanding of data warehousing technologies and concepts
  • Experience in Cloud DevOps
  • Experience in reporting tools such as PowerBI, Tableau, etc. is a must
  • Experience in ML / AI modeling preferred

This position is a contract opportunity for 10-12 weeks.

Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by HGS Digital.

Job Type: Contract

Salary: $85.00 to $90.00 /hour

Experience:

  • AWS cloud: 3 years (Required)
  • IT: 10 years (Required)

Contract Length:

  • 2 months or less

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Company's website:

  • www.hgsdigital.com