This job has expired.
Park Computer Systems Inc

Software Big Data Engineer

Dulles, VA (On-site)


W2 candidates only; must be local to Dulles, VA. In-person work is mandatory.

The Software Engineer performs the design, development, testing, documentation, and analysis of software applications. Assignments include development of new programs and sub-programs, as well as enhancements, modifications, and corrections to existing software. Duties include designing applications, writing code, developing, testing, and debugging software, and documenting work and results.

Responsibilities:
· Apply strong technical expertise and business skills, and lead the team in delivering data warehouse solutions.
· Warehouse Design - Work with the Architects to understand and implement a solid, extensible warehouse design that supports the new analytics requirements.
· Lead and participate in gathering business requirements, analyzing source systems, defining underlying data sources and transformation requirements, designing a suitable data model, and developing metadata for the Data Warehouse/Analytical Reporting.
· Prepare Technical Design / Specifications; develop routines for data extraction and loading.
· Analyze and determine the impact of technology capabilities and limitations; translate high-level designs into detailed design specifications; develop, test, and deploy code/applications in various environments.
· Effectively communicate with various teams and stakeholders, escalate technical and managerial issues at the right time and resolve conflicts.
· Perform effort estimation for various Data Warehouse activities; meet project deliverables per requirements, on time and within budget.
· Demonstrate a passion for quality and productivity through the use of efficient development techniques, standards, and guidelines.

Essential skills (must have):
· A Bachelor’s degree with a minimum of 2 years of related recent industry experience, or a Master of Science in Computer Science.
· Good knowledge of and hands-on experience with Hadoop, Java, ETL tools, and SQL
· Experience with database technologies (e.g., Oracle, Netezza, MySQL)
· Knowledge of and experience working with large-scale databases
· Knowledge of and experience with Unix (Linux) platforms
· Good knowledge of and hands-on experience with “Big Data” technologies (e.g., MapReduce, Hive, HBase, Pig, Spark)
· Effective analytical, troubleshooting and problem-solving skills
· Strong customer focus, ownership, sense of urgency, and drive.

Job Type: Contract

Experience:

  • Java, Hadoop, Hive, Spark, ETL, SQL, Oracle, MySQL: 7 years (Preferred)

Education:

  • Master's (Preferred)

Location:

  • Dulles, VA (Required)

Work authorization:

  • United States (Required)

Contract Renewal:

  • Likely
