
Remote Big Data contract jobs


19 remote Big Data contracts

Big Data Lead Developer

10 days ago
$60 - $65/hour · Remote · Binary Tech Consulting Corp

This is a 100% Remote Role - Long Term Project.

The client is seeking a Big Data Cloud Lead and a Developer for a 12+ month project, working during PST hours.

1. Lead: 12+ years of experience required (US citizens only)

2. Developer: 7-9 years of experience required

  • 6-8 years of software development/programming experience in enterprise cloud-based data applications (12+ years for the Lead role)
  • 6-8 years of experience in data modelling, data design, and persistence (e.g. warehouses, data marts, data lakes) (12+ years for the Lead role)
  • Experience with functional, imperative, and object-oriented languages and methodologies
  • Experience supporting Big Data and Hadoop
  • Experience with Big Data approaches and technologies, including Hadoop, Cloudera utilities, Spark, Kafka, Hive, and Oozie
  • Experience with SQL (SQL Server, MySQL, Postgres) and NoSQL (Cosmos, Mongo, HBase) databases expected
  • Exposure to programming languages/tools including C#, Java, Python, Ruby, Scala, and SQL, plus scripting (Java, Python, Spark, SQL, Hive, JavaScript, shell)
  • Distributed systems experience (4+ years desired)
  • Knowledge of various design patterns and technologies that enable business problem solving at scale
  • Strong communication skills to collaborate across groups and work effectively within the team

Job Type: Contract

Salary: $60.00 to $65.00 /hour

Experience:

  • Distributed Systems: 4 years (Preferred)

Contract Length:

  • More than 1 year

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Work Remotely:

  • Yes

Big Data Engineer

25 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location:

    Wales

  • Sector:

    Data and Analytics

  • Job type:

    Contract

  • Salary:

    £500 - £600 per day

  • Contact:

    Sophie Faithfull

  • Contact email:

    sophie.faithfull@lafosse.com

  • Job ref:

    LFA - 50642_1586960550

  • Published:

    28 days ago

  • Duration:

    3 Months

  • Expiry date:

    15 May 00:00

  • Startdate:

    ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python

  • Scala

  • Spark

  • Hadoop (Hive, Impala)

  • Extensive knowledge of Data Engineering and Architecture best practices

  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

25 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location:

    Hampshire

  • Sector:

    Data and Analytics

  • Job type:

    Contract

  • Salary:

    £500 - £600 per day

  • Contact:

    Sophie Faithfull

  • Contact email:

    sophie.faithfull@lafosse.com

  • Job ref:

    LFA - 50642_1587621849

  • Published:

    20 days ago

  • Duration:

    3 Months

  • Expiry date:

    23 May 00:00

  • Startdate:

    ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

25 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location:

    Hampshire

  • Sector:

    Data and Analytics

  • Job type:

    Contract

  • Salary:

    £500 - £600 per day

  • Contact:

    Sophie Faithfull

  • Contact email:

    sophie.faithfull@lafosse.com

  • Job ref:

    LFA - 50642_1587040829

  • Published:

    25 days ago

  • Duration:

    3 Months

  • Expiry date:

    16 May 00:00

  • Startdate:

    ASAP

Big Data Engineer - Python / Hadoop / Spark

Based in either Hampshire or Newport

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python

  • Scala

  • Spark

  • Hadoop (Hive, Impala)

  • Extensive knowledge of Data Engineering and Architecture best practices

  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

26 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location:

    Wales

  • Sector:

    Analytics and Insight, Business Intelligence, Data and Analytics

  • Job type:

    Contract

  • Salary:

    £500 - £600 per day

  • Contact:

    Sophie Faithfull

  • Contact email:

    sophie.faithfull@lafosse.com

  • Job ref:

    LFA - 50642_1587141068

  • Published:

    24 days ago

  • Duration:

    3 Months

  • Expiry date:

    17 May 00:00

  • Startdate:

    ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python

  • Scala

  • Spark

  • Hadoop (Hive, Impala)

  • Extensive knowledge of Data Engineering and Architecture best practices

  • Stakeholder Management

Please apply online for more details.

Big Data Engineer

26 days ago
£500 - £600/day · Remote · La Fosse Associates
  • Location:

    Wales

  • Sector:

    Data and Analytics

  • Job type:

    Contract

  • Salary:

    £500 - £600 per day

  • Contact:

    Sophie Faithfull

  • Contact email:

    sophie.faithfull@lafosse.com

  • Job ref:

    LFA - 50642_1587551850

  • Published:

    22 days ago

  • Duration:

    3 Months

  • Expiry date:

    22 May 00:00

  • Startdate:

    ASAP

Big Data Engineer - Python / Hadoop / Spark

CANDIDATES MUST HOLD ACTIVE SC CLEARANCE OR BE ELIGIBLE TO APPLY.

Due to Covid-19, all interviews will be held remotely and the initial period of the contract will be fully remote.

A Big Data Engineer is required for a greenfield programme. You must have experience developing cloud based Big Data platforms within a commercial setting and working alongside a multi-disciplinary team.

Experience Required:

  • Python
  • Scala
  • Spark
  • Hadoop (Hive, Impala)
  • Extensive knowledge of Data Engineering and Architecture best practices
  • Stakeholder Management

Please apply online for more details.

Data Engineer x 6

13 days ago
Remote · Outside IR35 · Talent International

(Outside IR35) Data Engineers x 6 - MUST HAVE VALID SC CLEARANCE

  • Initial 3 Month Contract
  • 100% Remote Working
  • Hadoop Programming
  • Python, Big Data, Spark tech stack
  • Current SC Clearance required

We're looking for experienced Data Engineers for a brand new greenfield project and huge data transformation, in light of COVID-19.

What you need to know

  • Proven experience of data engineering, including data wrangling, profiling, and preparation
  • Proven experience of big data environments within the Hadoop stack, including data ingestion, processing, and storage using HDFS, Spark, Hive, Python, Impala, and Cloudera
  • Experience developing ETL functionality in cloud or on-premises environments
  • Experience using tools such as Python and SQL (Spark)

What you will be doing

  • Working with members of the Data Engineering team to develop automated coding solutions for a range of ETL, data cleaning, structuring and validation processes.
  • Working with large semi-structured datasets to construct linked datasets derived from multiple underlying sources as well as supporting the wider team in delivering a range of data profiles across key strategic administrative data flows.
  • Assisting in a range of ETL and warehousing design projects
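
As an illustration only (the posting's actual stack is Spark/Hive over HDFS; the field names and rules below are invented), the kind of data cleaning, structuring, and validation work described above can be sketched in plain Python:

```python
import re

# Hypothetical raw records, as they might arrive from a semi-structured feed.
raw_records = [
    {"id": "001", "email": " Alice@Example.COM ", "joined": "2020-04-01"},
    {"id": "002", "email": "not-an-email", "joined": "2020-04-02"},
    {"id": "001", "email": "alice@example.com", "joined": "2020-04-01"},  # duplicate
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean(record):
    """Normalise fields: trim whitespace, lower-case the email."""
    return {**record, "email": record["email"].strip().lower()}

def validate(record):
    """Keep only records whose email passes a basic format check."""
    return bool(EMAIL_RE.match(record["email"]))

def dedupe(records, key="id"):
    """Keep the first record seen for each key."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

cleaned = dedupe([r for r in map(clean, raw_records) if validate(r)])
print(cleaned)  # one valid, deduplicated record for id "001"
```

At scale the same clean/validate/dedupe stages would be Spark transformations rather than Python list operations, but the pipeline shape is the same.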

Next Steps

If this is relevant to you and something you would like to apply for, get in touch with James today.

AI Engineer

29 days ago
Remote · NUES LLC

Position Overview:

NUES LLC is seeking an Artificial Intelligence / Machine Learning Engineer to join a CRO team based in Maryland to develop a software platform that enhances recruitment in clinical trials.

Open to remote working. Preference for candidates willing to work in the DMV (DC, Maryland, Virginia) area.

Desired Skills/Experience:

  • Architecture of healthcare information
  • Big Data/AI algorithms, natural language processing, and/or machine learning
  • Fluent programming skills in Java, Scala, or Python
  • Experience transforming raw medical data into actionable patient information
  • Experience deploying Machine Learning, rule-based, and statistical models
  • Familiarity with developing data-driven insights and machine learning models
  • Familiarity with identifying and extracting data from electronic medical records
  • Demonstrated ability to code in a modern language like Python
  • AI technology in healthcare
  • Experience with Spark or Hadoop
  • Real World Evidence (RWE) from medical claims, labs, medical records, and prescription data
  • Demonstrated experience with computing systems such as AWS, Hadoop, Azure, and Google Cloud
  • Other responsibilities to meet project objectives

Qualifications:

  • Master's degree or higher in Computer Science or another STEM-related field
  • 7+ years of relevant experience in software engineering and Machine Learning
  • Strong time management, technical, and organizational skills
  • Ability to work independently and within a team environment

Job Types: Part-time, Temporary, Contract

Experience:

  • Programming skills in Python, Java, or Scala: 5 years (Preferred)
  • Deploying ML and statistical models into real-world apps: 5 years (Preferred)
  • Software engineering and Machine Learning: 5 years (Preferred)
  • AWS, Hadoop, Azure, Google Cloud: 5 years (Preferred)

Education:

  • Master's (Preferred)

Work authorization:

  • United States (Required)

Contract Length:

  • 5 - 6 months
  • 7 - 11 months

AWS Cloud Engineer with Java ( GC Holder or US citizen)

5 days ago
$90 - $95/hour · Remote · CLS Bank International

US citizens or GC Holders ONLY

Java & AWS Cloud Engineer

Location : REMOTE --- Herndon, VA

Duration : 2 plus years ongoing

  • At least 10 years of full-time experience in software development including design, coding, testing, and support
  • At least 3 years of Cloud infrastructure experience working with one or more of the following Amazon Web Services (AWS) Cloud services: EC2, Elastic Beanstalk, EMR, ECS, S3, SNS, SQS, CloudFormation, CloudWatch, Lambda (as many as possible)
  • At least 3 years of experience in: Java, JEE, Spring Boot, Docker, Kubernetes
  • At least 1 year of experience in: Apigee, OAuth 2.0, and OpenID
  • At least 1 year of experience in any big data technologies
  • Hands-on experience of AWS architecture design, Data Management, System to System Data Integration
  • At least 2 years of experience with Agile, Kanban or Scrum methodologies

Preferred:

  • Strong working knowledge and technical competencies of AWS
  • Ability to communicate clearly, effectively, persuasively with technology and business stakeholders
  • Experience with Master Data Management/Data Unification tools like tamr, Profisee, etc.

Job Types: Full-time, Contract

Salary: $90.00 to $95.00 /hour

Contract Length:

  • More than 1 year

Contract Renewal:

  • Likely

Work Location:

  • One location
  • Fully Remote

Benefits:

  • Health insurance

Visa Sponsorship Potentially Available:

  • No: Not providing sponsorship for this job

Schedule:

  • Monday to Friday

Work Remotely:

  • Yes

Cloud Geospatial data engineer - Remote (MS2111)

10 days ago
Remote · Akvelon, Inc.

Requirements:

We are looking for highly skilled individuals with experience in big data geoscience and geospatial intelligence to support our work on environmental sustainability and conservation.

The position is a full-time contract for a year, starting as soon as possible. We are open to any location but prefer Western time zones.

Required Skills:

  • Cloud geospatial data pipeline engineer
  • Responsible for on-boarding data to Azure
  • Fluency with Python required
  • Fluency with a cloud required (it doesn’t have to be Azure)
  • Fluency with big data geospatial platforms strongly preferred
  • Experience in geospatial intelligence/science Big Data
  • Cloud – distributed computing frameworks (Kubernetes, Hadoop, Spark, Dask, Azure Batch, etc.) required, though not necessarily fluency in all of the above
  • A Bachelor’s degree or higher is required for this role.

Status: 12 month contract
Location: Redmond, WA (Can be REMOTE)
Job Id : MS2111

Since 2000, Akvelon has specialized in placing top software engineering talent at Fortune 500 Companies and start-ups alike. We were ranked in Comparably’s 2018 list of Top 15 Best Companies in Seattle, and were voted one of the Puget Sound Business Journal’s fastest growing companies for several years.

Akvelon is an Equal Opportunity Employer - All qualified applicants will receive consideration. We do not discriminate on the basis of race, color, religion, gender, national origin, age, disability, veteran status, or any other factor determined to be unlawful under applicable law.

Job Types: Full-time, Contract

Work Location:

  • Fully Remote

Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Retirement plan

Schedule:

  • Monday to Friday

Java/Spark Developer

16 days ago
Remote · Koreminds LLC

NOTE:

· The role will be fully remote while COVID-19 impacts last, depending on the situation.

· W2 candidates only (US citizens and GC holders).

· No C2C, please.

Role: Java/Spark Developer

Location: Wilmington, DE

Duration: Contract to hire, 6+ months (full-time)

Years of Experience: 5-8 years max

Job description

· 4+ years of experience with Java, Spark, Hadoop, Hive, SQL

· Proficient in Big Data application development.

· Working proficiency in the Big Data tool set to design, develop, test, deploy, maintain, and improve software.

· Demonstrates understanding of Agile methodologies, with the ability to work in at least one of the common frameworks.

· Demonstrates understanding of techniques such as Continuous Integration, Continuous Delivery, Test-Driven Development, Cloud Development, application resiliency, and security.

· Working proficiency in Spark and Java.

· Working proficiency in a portion of software engineering disciplines, and demonstrates understanding of overall software skills including business analysis, development, testing, deployment, maintenance, and improvement of software.

I believe you'd be a great addition to our team and would like to know your interest and availability. If you are interested, please reply with your latest CV for further review and the interview process.

Looking forward to your immediate response.

Thanks & Regards

Anwar Gilani | KoreMinds Inc | Technical recruiter

250 International Pkwy, Lake Mary FL 32746-5030

Direct: - 315-344-1165

Website: https://www.koreminds.com/

Job Types: Full-time, Contract

Salary: $60.00 to $65.00 /year

Experience:

  • Java/Spark, Hadoop: 4 years (Preferred)

Application Question:

  • W2 role, no C2C; US citizens and GC holders only. Contract to hire.

Work Remotely:

  • Temporarily due to COVID-19

Data Warehouse Architect

14 days ago
$85 - $90/hour · Remote · HGS Digital

Join HGS Digital and help build strategies and systems that help our clients solve their most complex and interesting business problems. If you are invigorated by new challenges, seek out new learning opportunities, lead from the front, challenge the status quo, and develop the skills of team members through technical and professional mentoring, this role is for you.

RESPONSIBILITIES

  • Lead analysis, architecture, design, and development of cloud data lake, warehouse and business intelligence solutions
  • Actively contribute to the cloud and big data community at HGS Digital and drive new capabilities forward
  • Define cloud data strategy, including designing multi-phased implementation roadmaps
  • Display passion for solving data-oriented problems in an analytical and iterative fashion that meet customers' needs
  • Big data architectures and patterns both on premises and in the cloud
  • Lead in MDM and Governance planning and workshops for data architecture and flow
  • Database design, development & management in relational, Data Lake, EDWs and NoSQL solutions
  • ETL/ELT design development with tools such as Informatica, Spark or Airflow
  • Data ingestion and management concepts of cataloguing, lifecycle and lineage
  • Working with various kinds of data (streaming, structured, unstructured, metrics, logs, json, xml, parquet, etc.)
  • Working in various agile methodologies (Scrum, Kanban, SAFe)
  • Working with approach, platforms and best practice for reporting and visualization tools
  • Assist business development teams with pre-sales activities, including helping estimate and plan projects

QUALIFICATIONS

  • 12+ years of IT experience & 3+ years of AWS cloud experience (additional Azure cloud experience is a plus)
  • Strong decision-making skills in terms of data analysis and must have the ability to architect large data
  • Deploying and monitoring scalable infrastructure in Amazon web services (AWS), DevOps tools and methodologies
  • Deep understanding and working experience in Big Data technologies such as Hadoop, Hortonworks & AWS Big Data solutions
  • Solid understanding of Big Data ETL
  • Experience in architecting solutions around AWS EMR, Kinesis (Data Streams/Analytics), Kafka, Spark, Hive, HDFS, Hadoop (Hortonworks), S3, RedShift, DynamoDB, RDS, Lambda
  • Experience with Informatica Cloud for ETL
  • Strong understanding of Datawarehousing Technologies & Concepts
  • Experience in Cloud DevOps
  • Experience in reporting tools such as PowerBI, Tableau, etc. is a must
  • Experience in ML / AI modeling preferred

This position is for Contract opportunity for 10-12 weeks.

Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by HGS Digital.

Job Type: Contract

Salary: $85.00 to $90.00 /hour

Experience:

  • AWS cloud: 3 years (Required)
  • IT: 10 years (Required)

Contract Length:

  • 2 months or less

Work Location:

  • Fully Remote

Benefits:

  • None

Schedule:

  • Monday to Friday

Company's website:

  • www.hgsdigital.com

Business Consultant

17 days ago
Remote · DBO Technologies LLC.

Remote Work until covid-19

Description:

NOTE- this is a 6 month contract with possibility of going permanent
*Must be authorized to work in the US and not require sponsorship in the future

As part of the US Commercial organization, this role is responsible for execution of patient services and Market Access data strategy for Immunology. You will work in cross collaboration with Commercial Patient Services, Commercial Analytics, Trade & Distribution, Access & Reimbursement, and external partners, and lead a team of Project Managers and BSAs to support the strategic and tactical needs of the Market Access organization.

Key responsibilities include:

  • Understand T&D and Access and Reimbursement business processes and gaps, identify demand and responsible for defining solutions to address business demand
  • Work collaboratively with HEOR, Medical Affairs and Commercial teams to plan and execute Value Base Engagement initiatives for Immunology portfolio
  • Responsible for Universal data contracting and works with Legal and Purchasing organizations to finalize these contracts with Data suppliers, Specialty Pharmacies, Analytic partners and Research study partners
  • Responsible for review and approval of data specifications, solution, process flow and data sharing from Privacy and Compliance team.
  • Responsible for coordinating implementation of Specialty pharmacy data integration solution with Commercial Data Lake team
  • Experience with data analysis, data profiling and data quality techniques
  • Responsible for successful transition of Data solutions to Commercial Data Lake operations team. Resolves issues in an appropriate and timely manner by working with Operations team
  • Be able to work independently with external stakeholders to manage Third Party agreements, Supplier evaluation and Compliance with AbbVie data policies
  • Establish and maintain high-quality relationships with all levels across the company and with external partners.
  • Support HEOR organization on Real World Evidence studies with external partners and internal analysis for Immunology patient services
  • Effectively manage communications and expectations with internal and external stakeholders.
  • Works directly with Market Access function to understand specific business needs. Collect complete and accurate requirements by means of interviews, workflow analyses, facilitated discussion to deliver data or technology solutions
  • Identifies and communicates needs and opportunities for business process improvements that can be enabled via data or technology.
  • Provide support to T&D and MABI team in preparing for Quarterly business reviews with Trade partners
  • Responsible for learning best practices of Specialty pharmacy data, data integration, aggregation, reporting and data quality monitoring and proposing ideas for Immunology business
  • Responsible for managing MSA, SOW/POCN reviews, coordination with internal stakeholders to get timely approvals.
  • Architects Immunology patient data integration by working with internal and external stakeholders. Responsible for coordinating with Patient Services data strategy and Commercial Data Lake Operations for Immunology brands.

Qualifications: Basic:

  • Bachelor’s Degree or equivalent experience, preferably in software engineering, business, information systems, or a discipline closely related to the client area served.
  • 7 -10 years overall experience including business analysis and project management.

Skills/Experience Required:

  • Exceptional interpersonal and communication and strong analytical skills. Proven ability to build trust, listen and ask effective questions.
  • Interest in technology trends and application of technology to improve end user’s experience.

  • Proven track record of effective project leadership and positive results. Ability to self-manage time and priorities.

  • Experience or familiarity with Data Lake, Big data technologies and tools, Patient level data like SHA PTD, IMS APLD or Truven Mediscan is preferred.
  • Demonstrate deep knowledge of pharmaceutical and healthcare business and utilize the knowledge in the rapid advancement of agile, impactful, and cost-effective solutions
  • Experience working with Market Access, Specialty Pharmacy, Supply Chain/Trade organization is a plus.

INTAKE CALL NOTES:

  • Years of experience/education and/or certifications required: 7+ years’ experience, bachelor’s degree preferred, not required
  • Top 3-5 skills:

1. Business Consultant/Analyst experience
2. Strong communication skills
3. Familiarity with Big Data tools (ability to pull data and look for trends, not data analysis)
4. Experience with Pharma Commercial and/or Market Access would be very beneficial

Job Type: Contract

Salary: $0.00 /hour

Benefits:

  • None

Schedule:

  • Monday to Friday

Company's website:

  • www.designbuildoperate.com

Immediate Hiring for Mid Level / Senior Java Developer in Wilmington, DE

20 days ago
$40 - $75/hour · Remote · Koreminds LLC

We are looking to hire a Mid-Level Java Developer to join our client's team in the Wilmington, DE area.

6+ months contract to hire (Direct Hire), NO C2C

Job Description: -

  • 5+ years of experience with Java/Spring Open Source/SQL
  • 2+ years of experience with JavaScript, and modern JavaScript frameworks, such as Angular/JQuery/Bootstrap
  • Understanding of distributed concepts
  • 1+ years of experience with Restful Services / WebServices
  • Cassandra/ NoSQL Experience a PLUS
  • Messaging Experience a PLUS (MQ/Kafka)
  • Experience with development, deployment, and support of large-scale distributed applications in a mission-critical production environment
  • Excellent analytical, communication, organizational, and problem-solving skills coupled with a strong work ethic

Desired Skills:

  • Test-infected attitude
  • Ability to translate business requirements into functional requirements documentation or work with light requirements in an iterative fashion using JIRA
  • Understanding of distributed systems, big data concepts, patterns
  • Exposure to Open Source technologies Maven, Subversion, GIT, Jenkins
  • Working knowledge of Oracle RDBMS

#JavaDeveloper #Java #Spring #Source/SQL #JavaScript #JavaScriptframeworks #Angular #JQuery #Bootstrap #RestfulServices #WebServices #NoSQL #MQ/Kafka

Job Types: Full-time, Part-time, Temporary, Contract

Salary: $40.00 to $75.00 /hour

Experience:

  • software development: 5 years (Preferred)
  • Java: 5 years (Preferred)

Work authorization:

  • United States (Required)

Required travel:

  • 75% (Preferred)

Contract Renewal:

  • Likely

Full Time Opportunity:

  • Yes

Work Location:

  • Multiple locations
  • Fully Remote

Benefits:

  • Other

Schedule:

  • Monday to Friday

Company's website:

  • https://www.koreminds.com/

Benefit Conditions:

  • Waiting period may apply

Work Remotely:

  • Temporarily due to COVID-19

Spark SQL Data Consultant (Python, Teradata)

21 days ago
£500 - £540/day · Remote · IT Talent Solutions Ltd

IT Talent are representing a global Ecommerce organisation in London with over 150 million users worldwide. We are recruiting for an expert Data and Process Automation Engineer on an initial 6-month contract basis.

This is a high impact role within our client's Indirect tax technology team and will report to the Senior Manager for Analytics and Automation.

The post will be remote working initially.

Purpose of your role

  • Develop and maintain complex VAT processes involving checks for seller non-compliance, automated seller actions, and reporting
  • Design and Implement Process Automations
  • Gather business requirements for Process automations
  • Implement, review and optimize reporting processes and data analytics
  • Data visualization and analysis
  • Maintain a close and pro-active relationship with other business functions like Customer Services, Legal and Finance teams in respect of issues identified during the reporting and compliance process
  • Participate in accurate and timely data preparation for data reporting obligations and submission to tax authorities and ensure that our client fulfills the requirements to avoid joint and several liability for VAT

Skills required

  • Excellent technical skills for process development and automation, especially Python programming and Spark SQL
  • Proficient in handling big data (Hadoop, Teradata, etc.)
  • Proficient in Tableau and Excel
  • Agile experience
  • Ability to translate commercial requirements into software solutions
  • Capable of working independently while acting as part of a global tax team
  • Able to handle a high number of projects and prioritize

Experience required

  • 8-10 years' hands-on experience in process automation and data analytics
  • Proven knowledge in Python programming and Spark SQL

Please send us your CV for immediate consideration.

Keywords

tax Process Automation Engineer, Spark SQL, Python, Process automation, data analytics, tax, Hadoop, Teradata

Data Analytics Engineer

25 days ago
£500 - £600/day · Remote · Harnham

Data Analytics Engineer - Remote Working
6-month Contract
London/Home Working
£550 per day

As a Data Engineer, you will be building an on-prem data platform for a niche online security start-up.

THE COMPANY:
This company have a solid software product that has allowed them to attract lots of members who have been subscribing to their page. They now need to utilise this data to improve their service. They would like to build an analytical platform that will allow them to track and monitor subscriptions to better understand their customers and attract different market segments. You will be going in as the sole Data Engineer to advise on how this strategy could be devised and what they will need to build it.

THE ROLE:
As a Data Engineer, you will be required to build a Hadoop-based solution, processing data using Spark. The platform will host big data with the view for analysts to build their reports. You will be building ELT pipelines using Python and integrating data sets. The vision is eventually to have a single customer view, but first the platform needs to be robust enough to house such large volumes of data.
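
The "single customer view" described above can be sketched as a toy merge. The datasets and field names below are invented, and a real build would do this in Spark over the Hadoop platform rather than with in-memory Python dicts:

```python
from collections import defaultdict

# Hypothetical extracts from two source systems, keyed by customer id.
subscriptions = [
    {"customer_id": "C1", "plan": "pro"},
    {"customer_id": "C2", "plan": "basic"},
]
logins = [
    {"customer_id": "C1", "last_login": "2020-05-01"},
]

def single_customer_view(*datasets):
    """Merge per-customer records from several sources into one dict each."""
    merged = defaultdict(dict)
    for dataset in datasets:
        for row in dataset:
            merged[row["customer_id"]].update(row)
    return dict(merged)

view = single_customer_view(subscriptions, logins)
print(view["C1"])  # {'customer_id': 'C1', 'plan': 'pro', 'last_login': '2020-05-01'}
```

In practice the hard part the posting alludes to is making the join keys and conflict rules robust across large, messy source datasets, not the merge itself.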

YOUR SKILLS AND EXPERIENCE:
The ideal Data Engineer will have:

  • Expert coding skills in Python
  • An understanding of the Hadoop eco-system
  • To take a consultative approach on how to build an Analytics platform
  • A strong understanding of Spark

HOW TO APPLY:
Please submit your CV to Henry Rodrigues at Harnham via the Apply Now button.
Please note that our client is currently running a fully remote interview process, and able to on-board and hire remotely as well.

Hadoop Engineer (100% Remote)

5 days ago
Remote · Gimmko Technologies

Hadoop Engineer
Location: 100% Remote

Duration: 12-18+ months

VISA: Citizen, GC, GC-EAD, H4
Job Description

  • Apache Hadoop and Cloudera distribution a must
  • Must be a hard-core engineer, not a developer, architect, or admin: an engineer who can "fix" issues and push back on the developers.

Analyzes, designs, creates and implements Big Data infrastructures, including access methods, device allocations, validation checks, organization and security. Designs data models, logical and physical infrastructure designs, etc. Assists in system planning, scheduling, and implementation. Initiates corrective actions to stay on schedule. Installs, upgrades, and tests complex big data deployments. Develops and implements recovery plans and procedures. Disciplines: Hadoop design and analysis.

Involved in the analysis, design, development and implementation of software applications. Determines user requirements, leads application design, plans projects, establishes priorities and monitors progress.

Skills:

  • Solid administrative knowledge of Apache Hadoop (Cloudera distribution) a must
  • BI tool integration with Hadoop
  • DBA experience with HBase
  • Experience with database replication and scaling
  • Design, install, and maintain highly available systems (including monitoring, security, backup, and performance tuning)
  • Linux (RHEL) proficiency a must
  • Scripting experience
  • Automation experience (Chef/Puppet)

Must possess good analytical and problem-solving skills.

Job Type: Contract

Experience:

  • Hadoop Engineer: 5 years (Preferred)

Application Question:

  • Can you work on W2?

Benefits:

  • None

Schedule:

  • Monday to Friday

Sr. Google Cloud & AWS Cloud Engineer (GC Holder or US citizen)

25 days ago
$95 - $105/hour · Remote · CLS Bank International

GC holder or US citizens ONLY

Google Cloud and AWS Cloud Architect

Duration: 1+ year, ongoing contract

Location : REMOTE - Herndon, VA

We are looking for a GCP/AWS cloud SME who is passionate about infrastructure/network buildout, security, IAM & automation.

You should have deep knowledge and hands-on experience in GCP network services, VPC, hybrid connectivity using Anthos, application security & IAM. You will bring solid experience with GCP serverless functionality (Cloud Functions), monitoring, Terraform, CFTs (CloudFormation templates, YAML), Docker, Kubernetes, Pub/Sub, storage and compute & Chatbot.
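As a rough illustration of the serverless/Pub/Sub side of this role: a first-generation Python background Cloud Function triggered by Pub/Sub receives the message payload base64-encoded in its event dictionary. The function name and message contents below are invented; this is a sketch, not the client's actual code:

```python
import base64

def handle_message(event, context):
    """Pub/Sub-triggered Cloud Function (1st-gen background style).

    `event` carries the Pub/Sub message; its 'data' field is
    base64-encoded. `context` holds event metadata (unused here).
    """
    payload = base64.b64decode(event["data"]).decode("utf-8")
    # A real function might publish onward, write to Cloud Storage, etc.
    return f"received: {payload}"

# Local simulation of a Pub/Sub delivery (GCP constructs this for you):
fake_event = {"data": base64.b64encode(b"order-created").decode("ascii")}
print(handle_message(fake_event, context=None))  # received: order-created
```

In production this handler would be deployed behind a Pub/Sub topic trigger rather than called directly, with the surrounding VPC, IAM, and monitoring pieces described above.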

Responsibilities & Expectations:

  • Strong experience in configuring and deploying Google Cloud infrastructure consisting of network architecture, application security, identity and access management, logging, monitoring, and more.
  • Strong experience with infrastructure configuration-management tools: Terraform, Ansible & CloudFormation templates.
  • Strong grasp of networking principles and protocols such as IP subnetting, routing, firewall rules, Virtual Private Cloud, Load Balancer, Cloud DNS, Cloud CDN, etc. (as many as possible)
  • Extensive experience hosting an application on GCP using Compute Engine, App Engine, Cloud SQL, Kubernetes Engine, Cloud Storage. (as many as possible)
  • Strong knowledge of application monitoring using Google Stackdriver & Splunk integration

Basic Qualifications:

  • At least 7 years of hands-on cloud infrastructure experience (AWS, GCP)
  • At least 7 years of hands-on software development & application hosting experience (GCP)
  • At least 5 years of hands-on experience with GCP Anthos, Cloud Functions & AWS Lambda
  • At least 3 years of experience with GCP Terraform & AWS CloudFormation templates (YAML)
  • At least 3 years of experience with Kubernetes and microservices implementations
  • At least 3 years of experience with ELK (Elasticsearch, Logstash & Kibana) & Splunk

Preferred Qualifications:

  • Experience with AI chatbots and Big Data on GCP: BigQuery, Pub/Sub, Dataproc, Dataflow.
  • Extensive experience working with container technology such as Docker, version control systems (Git/Bitbucket), build management and CI/CD tools (Jenkins).
  • Proficient in a modern scripting language for infrastructure automation.
  • Ability to design, deploy, and optimize FM cloud network infrastructure for high availability, reliability, performance, and scale.

Job Types: Full-time, Contract

Salary: $95.00 to $105.00 /hour

Contract Length:

  • More than 1 year

Contract Renewal:

  • Likely

Work Location:

  • One location
  • Fully Remote

Benefits:

  • Health insurance

Schedule:

  • Monday to Friday

Work Remotely:

  • Temporarily due to COVID-19

Senior Business Intelligence Architect (Remote)

18 days ago
$60 - $70/hour · Remote · Inabia Software & Consulting Inc.

Senior Business Intelligence Architect

If you’re interested in designing modern cloud database solutions on Microsoft Azure and love helping customers solve complex problems related to Business Intelligence, Advanced Analytics, and Data Science, we’d like to hear from you!

We're looking for a versatile BI Data Architect who can design and implement modern architecture for a diverse set of customers and industries. The ideal candidate has deep experience in designing enterprise data warehousing solutions utilizing SQL Server, Azure SQL DW, and Big Data platforms (e.g. Azure Data Lake, Azure Data Factory). This position will be responsible for leading customer conversations, creating and presenting project architecture, and leading delivery.

Primary Responsibilities:

  • Interface directly with clients to solve broad business goals with database solutions.
  • Gather requirements, design solutions, and oversee the development and execution of projects.
  • Be directly involved in the business development process, specifically delivering customer demos to show how data can drive business goals.
  • Build pilot solutions and proof-of-concepts with minimal direction.
  • Provide support to the project manager by developing tasks, estimates, and dependencies to meet expectations.

Required Technical Skills:

  • Ability to architect complex data solutions on SQL Server, including storage, replication, server tuning, upgrading, backup/restore, security, etc.
  • Ability to write complex SQL queries
  • T-SQL knowledge: DML/DDL, triggers, CTEs, query tuning, etc.
  • Strong knowledge of SSIS: script tasks, checkpoints, recordset objects, package vs. project deployment.
  • Practical knowledge of how to design complex SQL Server Integration Services (SSIS) packages
  • Knowledge of SQL Server Reporting Services and Power BI.
  • Knowledge of Azure SQL and other Azure offerings centered around data and analytics (Azure Machine Learning, Data Lake, Data Factory, Stream Analytics, Table Storage, Hadoop, etc.)
  • BS or MS in Computer Science, Engineering, or Mathematics preferred; equivalent work experience will be considered
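Since complex queries and common table expressions (CTEs) come up repeatedly in the skills list above, here is a minimal CTE example. It uses Python's built-in sqlite3 so it runs standalone; the `WITH ... AS` syntax shown is shared with T-SQL on SQL Server, the actual target here, and the table and column names are invented:

```python
import sqlite3

# Minimal CTE demo: aggregate in a named subquery, then select from it.
# sqlite3 stands in for SQL Server; the CTE syntax is common to both.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('east', 100), ('east', 50), ('west', 75);
""")

query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total FROM region_totals ORDER BY total DESC;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('east', 150.0), ('west', 75.0)]
```

In real T-SQL work the same pattern scales to chained CTEs and recursive CTEs, which is where the query-tuning skills the posting asks for come in.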

Preferred:

  • Strong understanding of parallel processing utilizing Azure SQL Data Warehousing for both structured and unstructured data sources.
  • Experience writing U-SQL and using Azure Data Lake Analytics
  • Experience with HDInsight, Spark, and Hadoop

Who you are:

  • Excel with ambiguity and can design a clear path to meet end goals
  • Have a broad understanding of BI and data technologies and how to match the right technologies to solutions
  • Have proven experience with both the customer-facing and solution-engineering skill sets

Job Type: Contract

Salary: $60.00 to $70.00 /hour

Experience:

  • T-SQL knowledge, DML/DDL: 5 years (Required)
  • Power BI: 3 years (Required)
  • BI architecture: 8 years (Required)

Schedule:

  • Monday to Friday