This job has expired.
Value Info Tech
Remote - BigData Engineer
Tags: Remote, Amazon AWS, Apache, Big Data, Data Warehouse, ETL, GCP, Google Cloud Platform, Hadoop, Java, JIRA, Kafka, NoSQL, Python, REST, Scala, Scrum, SOAP, SQL, Workflow
Job details
Job Type: Contract
Full Job Description
Work Authorization: US Citizen, GC, H1B, L2 EAD, H4 EAD
Preferred Employment: Corp-to-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Senior
Rate/Salary ($): DOE
Duration: 12 Months
Sp. Area: BigData, NoSQL
Sp. Skills: BigData
Consulting / Contract
H1B OK
Third Party OK
Direct Client Requirement
Remote Work from Home
Required Skills: Big Data, Python, Agile, Apache, Data Warehousing, ETL, Google Cloud Platform, Hadoop, Java, Kafka, Scala, Scrum, SOAP, SQL, Windows Azure
Preferred Skills:
Domain:
Job Description:
Experience in developing Big Data/ETL data warehouses and building cloud-native data pipelines.
Prior experience in developing Agile/Scrum applications using Jira.
Sound knowledge of Hive, Spark, Scala/Java/Python, and SQL.
Prior experience in object-oriented and functional programming using Python.
Good understanding of REST and SOAP-based APIs to extract data for data pipelines.
Expertise in Hadoop and related processing frameworks, such as Spark, Hive, Sqoop, etc.
Prior experience working in a public cloud environment, e.g., GCP, AWS, or Azure.
Hands-on experience in working with real-time data streams and Kafka platform.
Good knowledge of workflow orchestration tools, such as Apache Airflow, to design and deploy Directed Acyclic Graphs (DAGs).
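The orchestration requirement above centers on modeling pipeline steps as a Directed Acyclic Graph. As a minimal sketch (task names are hypothetical, and this uses only the Python standard library rather than Airflow itself), the dependency structure of a typical extract/transform/load pipeline can be declared and resolved into an execution order like so:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL pipeline steps and their upstream dependencies,
# mirroring how an orchestrator such as Airflow wires tasks into a DAG.
deps = {
    "extract_kafka": set(),                                # no upstream tasks
    "extract_rest": set(),                                 # no upstream tasks
    "transform_spark": {"extract_kafka", "extract_rest"},  # waits on both extracts
    "load_warehouse": {"transform_spark"},                 # runs last
}

# Resolve a valid execution order; raises CycleError if the graph has a cycle.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Both extract tasks may appear in either order, since they are independent; the only guarantee is that every task runs after all of its dependencies, which is exactly the property a DAG scheduler enforces.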