HBase contract jobs near you


Big Data Developer

4 days ago
$60 - $65/hour · Ashburn, VA · RPD Systems

Responsibilities and Duties
1. Understand business requirements and help assess them with the development teams.
2. Create high-quality documentation supporting the design/coding tasks.
3. Participate in architecture/design discussions and develop ETL/ELT pipelines using PySpark.
4. Conduct code and design reviews and provide review feedback.
5. Identify areas of improvement in the framework and processes, and strive to improve them.
Key Skills
Desired Skills:
1. Airflow
2. Understanding of object-oriented programming
3. DevOps implementation knowledge
4. Git commands
5. Python modules such as Sphinx, pandas, SQLAlchemy, McCabe, and unittest
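Item 5 names standard Python tooling. As a brief, hedged illustration of how the `unittest` module might be used in this kind of ETL codebase, the sketch below tests a small salary-parsing helper; the function, its name, and the data are invented here purely for illustration.

```python
import unittest

def normalize_salary(raw):
    """Parse a rate string like '$60 - $65/hour' into a tuple of floats.

    Hypothetical helper, invented only to illustrate unittest usage.
    """
    amounts = raw.replace("$", "").split("/")[0].split("-")
    return tuple(float(part.strip()) for part in amounts)

class NormalizeSalaryTest(unittest.TestCase):
    def test_hourly_range(self):
        self.assertEqual(normalize_salary("$60 - $65/hour"), (60.0, 65.0))

    def test_single_rate(self):
        self.assertEqual(normalize_salary("$70/hour"), (70.0,))

# exit=False keeps this runnable inside scripts and notebooks alike
result = unittest.main(argv=["ignored"], exit=False).result
```

Running the module executes both test cases and reports the usual unittest summary.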
Required Experience and Qualifications
1. At least 3 years of working experience in a big data environment
2. Knowledge of design and development best practices in data warehouse environments
3. Experience developing large-scale distributed computing systems
4. Knowledge of the Hadoop ecosystem and its components: HBase, Pig, Hive, Sqoop, Flume, Oozie
5. Experience with PySpark and Spark SQL
6. Experience with integration of data from multiple data sources
7. Experience implementing ETL processes in Hadoop (developing big data ETL jobs that ingest, integrate, and export data), including converting Teradata SQL to PySpark SQL
8. Experience with Presto, Kafka, and NiFi
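The Teradata-to-PySpark-SQL conversion in item 7 often comes down to dialect rewrites. One common case, sketched below under the assumption of a simple `employees` table (hypothetical), is Teradata's `QUALIFY` clause, which Spark SQL does not support: the window function moves into a subquery and the filter becomes an ordinary `WHERE` predicate. Spark itself is not assumed available here, so the rewritten query is checked against SQLite, whose window-function syntax matches the rewritten form for this case.

```python
import sqlite3

# Teradata lets you filter on a window function directly with QUALIFY
# (and abbreviates SELECT as SEL):
teradata_sql = """
SEL emp_id, dept, salary
FROM employees
QUALIFY ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC) = 1
"""

# Spark SQL has no QUALIFY clause, so the window function moves into a
# subquery and the filter becomes a plain WHERE predicate:
spark_sql = """
SELECT emp_id, dept, salary
FROM (
    SELECT emp_id, dept, salary,
           ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC) AS rn
    FROM employees
)
WHERE rn = 1
"""

# Sanity-check the rewritten query with in-memory SQLite sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INT, dept TEXT, salary INT)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [(1, "eng", 100), (2, "eng", 120), (3, "ops", 90)])
top_earners = conn.execute(spark_sql).fetchall()
print(sorted(top_earners))  # highest-paid employee per department
```

In PySpark the same string would be run with `spark.sql(spark_sql)` after registering the table as a view; only the dialect rewrite shown above is the Teradata-specific work.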

Job Type: Contract

Salary: $60.00 to $65.00 /hour


Experience:

  • Data Developer: 4 years (Preferred)


Azure Big Data Developer

20 days ago
$65 - $75/hour (Estimated) · Reston, VA · DAtec Solutions

Essential Job Functions:

  • Production experience with large-scale SQL and NoSQL data infrastructures such as Cosmos DB, Cassandra, MongoDB, HBase, CouchDB, Apache Spark, etc.
  • Application experience with SQL databases such as Azure SQL Data Warehouse, MS SQL, Oracle, PostgreSQL, etc.
  • Proficient understanding of code versioning tools (such as Git, CVS, or SVN)
  • Strong debugging skills with the ability to reach out and work with peers to solve complex problems
  • Ability to quickly learn, adapt, and implement Open Source technologies.
  • Familiarity with continuous integration (DevOps)
  • Proven ability to design, implement and document high-quality code in a timely manner.
  • Excellent interpersonal and communication skills, both written and oral.

Educational Qualifications and Experience:

  • Role Specific Experience: 2+ years of experience in Big Data platform development.

Certification Requirements (desired):

Azure Designing and Implementing Big Data Analytics Solutions

Required Skills/Abilities:

  • Experience with NoSQL databases, such as HBase, Cassandra or MongoDB.
  • Proficient in designing efficient and robust ETL/ELT using Data Factory, workflows, schedulers, and event-based triggers.
  • 1+ years of experience with SQL databases (Oracle, MS SQL, PostgreSQL, etc.).
  • 1+ years of hands-on experience with data lake implementations, core modernization, and data ingestion.
  • 3+ years of experience with Visual Studio C# or core Java.
  • Experience in at least one of the following programming languages: R, Scala, Python, Clojure, F#.
  • 1+ years of experience with Spark systems.
  • Good understanding of multi-temperature data management solutions.
  • Practical knowledge of design patterns.
  • In-depth knowledge of developing large distributed systems.
  • Good understanding of DevOps tools and automation frameworks.

Desired Skills/Abilities (not required but a plus):

  • Experience in designing and implementing scalable, distributed systems leveraging cloud computing technologies.
  • Experience with Data Integration on traditional and Hadoop environments.
  • Experience with Azure Time Series Insights.
  • Some knowledge of machine learning tools and libraries such as TensorFlow, Turi, H2O, Spark MLlib, and caret (R).
  • Understanding of AWS data storage and integration with Azure.
  • Some knowledge of graph databases.

Job Type: Contract

Big Data Architect - Hadoop/Cloudera

1 month ago
$60 - $75/hour (Estimated) · Herndon, VA 20171 · menschForce LLC


- Minimum 7-9 years of total experience, with 2-3 years in Hadoop

- Thorough understanding of Hadoop distributions (Cloudera, Hortonworks, MapR) and ecosystem components

- Thorough understanding of NoSQL databases such as HBase, MongoDB, Cassandra, etc.

- Requirements gathering, and designing and developing scalable big data solutions with Hadoop

- Strong technical skills in Spark, HBase, Hive, Sqoop, Oozie, Flume, Java, Pig, Python, etc.

- Good experience with distributed systems, large scale non-relational data stores, map-reduce systems, performance tuning, and multi-terabyte data warehouses

- Able to work independently and mentor team members

- Hands on development experience in Hadoop

- Expertise in different Unix flavors

- Effective communication

Job Types: Full-time, Contract

Salary: $70,000.00 to $120,000.00 /year


Experience:

  • AWS: 7 years (Preferred)

Work authorization:

  • United States (Preferred)

Required travel:

  • 75% (Preferred)