
Remote Data Lake contract jobs


5 remote Data Lake contracts

Azure Data Engineer

6 days ago
$60 - $75/hour (Estimated) · Remote · Samay Consulting

One of our Fortune 500 clients, based out of Oregon, is looking to hire 3 Azure Data Engineers on a full-time basis. Prior experience working on Azure is a must.

Area of Responsibility

  • Participate in design, implementation, and support of a data warehouse and analytics platform utilizing Azure cloud technology
  • Design and implement data load processes from On Premises sources into Azure Data Lake and subsequent Azure SQL & SQL Data Warehouse (see the sketch after this list)
  • Migrate existing processes and data from our On Premises SQL Server and other environments to Azure Data Lake
  • Design, develop, and support Azure SQL Database Data Marts for functional area data consumers
  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using Microsoft SSIS and/or Azure cloud technology.
  • Troubleshoot and support Microsoft SSIS ETL processes for On Premises SQL Data Warehouse
  • Explore and learn the latest Azure technologies to provide new capabilities and increase efficiency
  • Work with top-notch technical professionals developing complex systems at scale and with a focus on sustained operational excellence
  • Collaborate with DW developers on products that require reporting data, and ensure that datasets are in place and are used consistently internally and externally
  • Collaborate with data governance to ensure that all existing data is created in the right way, and that new data is created according to appropriate standards and with proper documentation
  • Read, write, and configure code for end-to-end service telemetry, alerting and self-healing capabilities
  • Strive for continuous improvement of code quality and development practices
  • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
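
As a rough illustration of the data-load bullets above, here is a minimal Python sketch that copies one table from an on-premises SQL Server into Azure Data Lake Storage Gen2 as Parquet. The server, database, table, container, and file names are hypothetical placeholders, and a production pipeline would more likely run inside Azure Data Factory or SSIS than as a standalone script.

```python
# Minimal sketch: extract a table from on-premises SQL Server and land it in
# ADLS Gen2 as Parquet. All connection details and paths are placeholders.
import pandas as pd
import pyodbc
from azure.storage.filedatalake import DataLakeServiceClient

# Read the source table over ODBC from the on-premises SQL Server.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql.example.local;DATABASE=SalesDW;Trusted_Connection=yes;"
)
df = pd.read_sql("SELECT * FROM dbo.FactSales", conn)

# Upload the extract to the data lake as a Parquet file (to_parquet needs pyarrow).
service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",
    credential="<storage-account-key>",
)
file_client = service.get_file_system_client("raw").get_file_client(
    "sales/factsales.parquet"
)
file_client.upload_data(df.to_parquet(), overwrite=True)
```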

Job Types: Full-time, Contract

Pay: $100,000.00 - $150,000.00 per year

Benefits:

  • 401(k)
  • 401(k) Matching
  • Dental Insurance
  • Disability Insurance
  • Employee Discount
  • Flexible Schedule
  • Flexible Spending Account
  • Health Insurance
  • Paid Time Off
  • Retirement Plan
  • Vision Insurance

Schedule:

  • Monday to Friday

Experience:

  • Azure: 1 year (Preferred)

Application Question:

  • Work is remote, but you may be asked to travel during critical go-lives (less than 10% of the time). Are you open to this limited travel?

Contract Length:

  • More than 1 year

Work Location:

  • Fully Remote

Visa Sponsorship Potentially Available:

  • Yes: Other non-immigrant work authorization (e.g. L-1, TN, E-3, O-1, etc.)
  • No: Not providing sponsorship for this job

Company's website:

  • www.samayconsulting.com

Benefit Conditions:

  • Only full-time employees eligible

Work Remotely:

  • Yes

Azure Engineer - 100% Remote

3 days ago
$70 - $100/hour · Remote · eTek IT Service | Savvysol

Contract on W2

Responsibilities:

  • Attend planning and scrum meetings with the DevOps team and Data Lake Product Dev teams.
  • Assess Azure Data Factory data ingestion pipelines and make recommendations for improvement (see the sketch after this list).
  • Assess the Azure DevOps tools and configurations for Continuous Integration/Continuous Delivery.
  • Liaison role only: you will not be coding, creating data pipelines, or doing hands-on DevOps work.
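
Although this is a hands-off liaison role, the assessment work assumes familiarity with how ADF pipelines and their runs are organized. The sketch below shows the kind of inventory such a review might pull with the azure-mgmt-datafactory SDK; the subscription, resource group, and factory names are placeholders, not details from the posting.

```python
# Sketch: inventory ADF pipelines and their recent runs to support an assessment.
# Subscription, resource group, and factory names are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# List each pipeline and how many activities it contains.
for p in adf.pipelines.list_by_factory("<resource-group>", "<factory-name>"):
    print(p.name, len(p.activities or []))

# Pull the last week of pipeline runs to spot failures and long-running loads.
now = datetime.now(timezone.utc)
runs = adf.pipeline_runs.query_by_factory(
    "<resource-group>",
    "<factory-name>",
    RunFilterParameters(last_updated_after=now - timedelta(days=7), last_updated_before=now),
)
for run in runs.value:
    print(run.pipeline_name, run.status, run.duration_in_ms)
```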

Must haves:

  • Azure Data Factory (ADF)
  • Azure Data Lake Store (ADLS)
  • Azure DevOps (ADO), a Jenkins-like tool

Nice to haves:

  • Azure Data Bricks
  • Hadoop or Big Data experience
  • GitHub source control experience
  • Continuous Integration / Continuous Delivery (CI/CD)

Job Types: Full-time, Contract

Salary: $70.00 to $100.00 /hour

Experience:

  • Azure DevOps: 5 years (Preferred)
  • Hadoop or Big Data: 4 years (Preferred)
  • Azure Data Lake Store: 4 years (Preferred)
  • Azure Data Factory: 4 years (Preferred)
  • Azure: 7 years (Preferred)

Application Question:

  • Are you comfortable working on a W2 basis?

Work Remotely:

  • Yes

Data Engineer

1 month ago
Remote · Georgia IT Inc.

We are looking for strong Data Engineers skilled in Hadoop, Scala, Spark, Kafka, Python, and AWS. The job description is included below.
Here is what we are looking for:

Overall Responsibility:

  • Develop sustainable, data-driven solutions with current and next-gen data technologies to meet the needs of our organization and business customers.
  • Apply domain-driven design practices to build out data applications; experience building conceptual and logical models.
  • Build out data consumption views and provision self-service reporting via demonstrated dimensional modeling skills.
  • Measure data quality and improve data standards, helping application teams publish data in the correct format so it is easy to consume downstream.
  • Build Big Data applications using open-source frameworks such as Apache Spark, Scala, and Kafka on AWS, plus cloud-based data warehousing services such as Snowflake (see the streaming sketch after this list).
  • Build pipelines that provision features for machine learning models; familiarity with data science model-building concepts as well as consuming data from the data lake.
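
As a small illustration of the Spark/Kafka bullet above, here is a PySpark Structured Streaming sketch that reads JSON events from Kafka and lands them in a data lake as Parquet. The broker, topic, schema, and S3 paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Sketch: stream JSON events from Kafka into a Parquet data lake on S3.
# Broker, topic, schema, and paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-to-lake").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "customer-events")
    .load()
)

events = raw.select(
    from_json(col("value").cast("string"), event_schema).alias("e")
).select("e.*")

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-lake/raw/customer_events/")
    .option("checkpointLocation", "s3a://example-lake/checkpoints/customer_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```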

Basic Qualifications:

  • At least 8 years of experience with the Software Development Life Cycle (SDLC)
  • At least 5 years of experience working on a big data platform
  • At least 3 years of experience working with unstructured datasets
  • At least 3 years of experience developing microservices: Python, Java, or Scala
  • At least 1 year of experience building data pipelines, CICD pipelines, and fit for purpose data stores
  • At least 1 year of experience in cloud technologies: AWS, Docker, Ansible, or Terraform
  • At least 1 year of Agile experience
  • At least 1 year of experience with a streaming data platform including Apache Kafka and Spark

Preferred Qualifications:

  • 5+ years of data modeling and data engineering skills
  • 3+ years of microservices architecture & RESTful web service frameworks
  • 3+ years of experience with JSON, Parquet, or Avro formats
  • 2+ years of creating data quality dashboards and establishing data standards
  • 2+ years of experience with RDS, NoSQL, or graph databases
  • 2+ years of experience working with AWS platforms, services, and component technologies, including S3, RDS and Amazon EMR

Job Type: Contract

Schedule:

  • Monday to Friday

Experience:

  • AWS: 1 year (Preferred)
  • Hadoop: 1 year (Required)
  • Spark: 1 year (Required)
  • Big Data: 1 year (Preferred)
  • Scala: 1 year (Preferred)
  • Data Engineering: 1 year (Required)

Contract Renewal:

  • Possible

Full Time Opportunity:

  • Yes

Work Location:

  • Fully Remote

Work Remotely:

  • Yes

Data Architect - Remote

4 days ago
$65 - $66/hour · Remote · Techhire Global
  • Minimum 5 years of experience with Azure data platform including Azure SQL, Azure Data Factory, Azure Databricks, Azure SQL Data Warehouse (Synapse Analytics), Azure Data Lake Storage, Azure Cosmos DB, SQL Server, SSIS, SSRS, etc.
  • Strong Object/relational mapping experience with UML modeling and OO modeling.
  • Demonstrated experience of estimation and planning
  • Excellent written and verbal communication skills
  • Minimum 5 years of experience with T-SQL, Power BI and Azure Analysis Services
  • Experience working with Healthcare data and a proven track record of implementation experience with EDI data formats.
  • Minimum 6 years in job roles involving metadata management, relational/dimensional modeling and big data solution approaches with native Azure Data Platform tools or 1st party services.

Job Type: Contract

Salary: $65.00 to $66.00 /hour

Java/AWS SageMaker Developer

1 month ago
$55 - $60/hour · Remote · United Business Solutions Inc

Job Title: Java/AWS SageMaker Developer

Location: Atlanta, GA (remote until operations return to normal)

Duration: 6 months

The candidate must have the following:

1) 10+ years in Java

2) 5+ years in AWS

3) UNIX scripting experience

1. Self-starter, seasoned Java developer with working UNIX scripting knowledge (10+ years preferred). Groovy knowledge is desired.

2. Working AWS and cloud computing knowledge is required (5+ years preferred). Working knowledge of the SageMaker ML service is required (a brief illustration follows this list).

3. A few years of hands-on experience with virtual assistant/bot and NLP/NLU technology is desired.

4. AI experience related to solving business problems through conversational dialogue, machine learning, and data mining is desired.

5. Excellent critical thinking skills, combined with the ability to present your ideas clearly in both verbal and written form.

6. Must be self-motivated, a self-starter, a quick learner, a team player, and easy to work with.

7. Work is based in the Atlanta office. The candidate can work remotely in the current pandemic situation, with daily check-ins with other members of the team working on the project.
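
The posting is Java-centric, but as a quick illustration of what working knowledge of the SageMaker ML service looks like at the integration level, here is a short boto3 (Python) sketch that invokes an already-deployed inference endpoint. The endpoint name, region, and payload shape are hypothetical.

```python
# Sketch: call a deployed SageMaker inference endpoint. Endpoint name, region,
# and payload are hypothetical placeholders.
import json

import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

payload = {"utterance": "I need to reset my password"}
response = runtime.invoke_endpoint(
    EndpointName="example-nlu-intent-endpoint",
    ContentType="application/json",
    Body=json.dumps(payload),
)
print(json.loads(response["Body"].read()))
```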

Paul(at)ubsols.com

Job Type: Contract

Pay: $55.00 - $60.00 per hour

Experience:

  • SageMaker: 2 years (Required)
  • AWS: 2 years (Required)
  • Java Developer: 10 years (Required)
  • UNIX Scripting: 2 years (Required)

Contract Length:

  • 1 year

Contract Renewal:

  • Likely

Full Time Opportunity:

  • No

Work Location:

  • One location

Benefits:

  • None

Schedule:

  • Monday to Friday

Work Remotely:

  • Temporarily due to COVID-19