This job has expired.
Park Computer Systems Inc

Java / Big Data Developer - Junior/Mid Level

Dulles, VA (On-site)


W2 only; junior/mid-level candidates only. A coding test is mandatory. Contract-to-hire role. No visa sponsorship.

The Software Engineer is responsible for the design, development, testing, documentation, and analysis of software applications. Assignments include developing new programs and subprograms as well as enhancements, modifications, and corrections to existing software. Duties include designing applications, writing code, developing and testing software, debugging, and documenting work and results.

Responsibilities:

· Apply strong technical expertise and business skills, and lead the team in delivering data warehouse solutions.
· Warehouse Design - Work with the Architects to understand and implement a solid, extensible warehouse design that supports the new analytics requirements.
· Lead and participate in gathering business requirements, analyzing source systems, defining underlying data sources and transformation requirements, designing a suitable data model, and developing metadata for the Data Warehouse/Analytical Reporting.
· Prepare Technical Design / Specifications; develop routines for data extraction and loading.
· Analyze and determine the impact of technology capabilities and limitations; translate the high-level design into detailed design specifications; develop, test, and deploy code/applications in various environments.
· Effectively communicate with various teams and stakeholders, escalate technical and managerial issues at the right time and resolve conflicts.
· Perform effort estimation for various Data Warehouse activities; meet project deliverables as per requirements, on time and within budget.
· Demonstrate passion for quality and productivity by use of efficient development techniques, standards and guidelines.

Essential skills (must have):

· A Bachelor’s degree with a minimum of 4 years of recent related industry experience, or a Master of Science in Computer Science.
· Good knowledge of and hands-on experience with Java and ETL tools
· Experience with Database technologies (Ex: Oracle, Netezza, MySQL)
· Knowledge of ANSI SQL
· Knowledge and experience of working with large scale databases
· Knowledge and experience of Unix (Linux) Platforms
· Understanding of “Big Data” technologies (Hadoop, MapReduce, Hive, HBase, Pig)
· Effective analytical, troubleshooting and problem-solving skills
· Strong customer focus, ownership, urgency and drive.

Desirable skills (nice to have/optional):

· Open Source and Commercial off the Shelf ETL tools (Ex: Kettle)
· Knowledge of NoSQL databases (MongoDB, CouchDB)
· Knowledge of columnar databases
· Knowledge of development lifecycle process/Agile development
· Knowledge of Quality Assurance (QA) practices and programs

Job Type: Contract

Experience:

  • Java, Hadoop, ETL, Hive, Spark: 5 years (Required)

Work authorization:

  • United States (Required)

Contract Renewal:

  • Likely
