Big Data contract jobs near you / remote

Big Data

Enterprise Solutions Architect F33294

4 hours ago
$65 - $75/hour (Estimated) · Reston, VA 20190 · SyApps LLC

SyApps is a technology and management consulting services firm focused on providing solutions in strategy, processes, technology and management. As a diverse end-to-end IT and management solution provider, SyApps offers a range of technical expertise to help customers compete successfully in an ever-changing marketplace.

Please visit us at www.syapps.com.

Tax Terms: Contract


Specialized Knowledge & Skills

  • Ability to interact and build working relationships with stakeholders to develop an architecture strategy and processes, and to deliver IT assets that meet business objectives
  • 6-8 years of experience with solution architecture and solution experience across multiple architecture domains such as business, data, integration, and application
  • 6-8 years of experience with data integration patterns and integration of data and business architectures
  • 5+ years of experience in leading end-to-end business process design, business event derivation, and associated data object and entity modeling
  • 5+ years of experience in driving the enterprise architecture strategy, methodology, and standards that deliver an enterprise data model, associated meta model, common business vocabulary, and taxonomies
  • Hands on experience in applying advanced enterprise architecture/design concepts such as cloud computing, data lake and big data.
  • Exposure to large-scale/enterprise-level data centric projects is a must
  • Good understanding of NoSQL data stores, Hadoop, MDM, data modeling across relational, Star/Dimensional or other emerging approaches
  • In-depth knowledge of advanced analytics concepts/data science – exposure to applying machine learning/statistical models on large/enterprise-grade data
  • Working knowledge/conceptual understanding of AWS/cloud based managed services such as S3, EMR, RedShift etc. is considered essential
  • Experience reviewing designs for robustness and quality.
  • Strong understanding of SOA concepts, design patterns, tools, techniques and best practices.
  • TOGAF or an equivalent architecture framework certification is preferred
  • AWS or Cloud based architecture/equivalent certification is preferred

Desired Skills

  • 4+ years of design and development (enterprise-grade applications) experience using ETL tools and relational data stores
  • Excellent verbal and written communication skills
  • Ability to communicate across diverse audience of business analysts, architects, business users and technical project staff
  • Demonstrate ability to influence others and advocate point of view when appropriate
  • Ability to work in a teaming environment; strong organization and interpersonal skills

Location: Reston, VA

SyApps LLC is a Winner of the SmartCEO/Grant Thornton Future 50 Award for being recognized as one of 50 fastest growing companies in the Greater Washington Area.

We are proud of our diverse environment and are an Equal Opportunity Employer. SyApps is committed to a policy of equal employment opportunity. SyApps participates in E-Verify.

This Week


Talend Developer

2 days ago
$60 - $70/hour (Estimated) · Reston, VA · Tech Era Inc

Talend ETL Developer

Reston, VA

12 months plus contract

At least 7 years of experience in the architecture, design, and development of Data Warehousing solutions

At least 7 years of experience with databases such as Vertica, Teradata, Sybase, Oracle, or DB2

Good experience with ETL tools such as Informatica and Talend

Working experience in Agile, Scrum and Kanban

Experience working in a CI/CD environment; with Micro Services and Open Source tools such as Spring Boot, Jenkins, and Maven is a plus

Experience working in Cloud environments, AWS, Big data environments

Job Type: Contract

Experience:

  • Informatica: 3 years (Preferred)
  • Talend: 5 years (Required)

Location:

  • Reston, VA (Required)

Work authorization:

  • United States (Required)

Work Location:

  • One location

Big Data Developer

4 days ago
$60 - $65/hour · Ashburn, VA · RPD Systems

Responsibilities and Duties
1. Understand business requirements and help assess them with the development teams.
2. Create high-quality documentation supporting the design/coding tasks.
3. Participate in architecture/design discussions and develop the ETL/ELT using PySpark and Spark SQL.
4. Conduct code and design reviews and provide review feedback.
5. Identify areas of improvement in frameworks and processes, and strive to make them better.
Key Skills
Desired skills:
1. Airflow
2. Understanding of object-oriented programming
3. DevOps implementation knowledge
4. Git commands
5. Python modules such as Sphinx, pandas, SQLAlchemy, mccabe, unittest, etc.
Required Experience and Qualifications
Qualifications:
1. At least 3 years of working experience in a Big Data environment
2. Knowledge of design and development best practices in data warehouse environments
3. Experience developing large-scale distributed computing systems
4. Knowledge of the Hadoop ecosystem and its components – HBase, Pig, Hive, Sqoop, Flume, Oozie, etc.
5. Experience with PySpark and Spark SQL
6. Experience with integration of data from multiple data sources
7. Implement ETL processes in Hadoop (develop big data ETL jobs that ingest, integrate, and export data); convert Teradata SQL to PySpark SQL.
8. Experience with Presto, Kafka, and NiFi.

Job Type: Contract

Salary: $60.00 to $65.00 /hour

Experience:

  • Data Developer: 4 years (Preferred)

Last 90 Days

Hadoop Developer

12 days ago
Reston, VA · DAtec Solutions

Requirements:

5+ years with Java and Python/Scala programming languages

  • Advanced-level experience (3+ years) in Hadoop, YARN, HDFS, MapReduce, Sqoop, Oozie, Hive, Spark, and other related Big Data technologies
  • Experience tuning Hadoop/Spark parameters for optimal performance
  • Experience with Big Data querying tools, including Impala
  • Advanced experience with SQL and at least one major RDBMS (Oracle, DB2)

Job Type: Contract

Full Stack Developer

13 days ago
$80 - $90/hour · Remote · Whitney LLC

Our Client:

Revolutionizing the digital design product space for Capital Markets and Big Data. They believe that great design is fundamentally about problem solving. They create simple, intuitive experiences because they take the time to develop a deep understanding of complex problem spaces. The team is composed of innovators, artists, experimenters, thinker-doers, builders & breakers.

The Mission: Reinventing the mortgage servicing experience with disruptive distributed ledger technology on the backend and modern, sophisticated front ends for originators, servicers and borrowers.

Who you are:

  • An experienced developer who works across both the front and back ends of teams as an integration specialist
  • Team Player who can work across multiple stakeholders in various time zones
  • Open to working on a consultant basis for this project but interested in building on this for opportunities in our client's pipeline both in NYC and Europe

Required Skills: Full-stack JavaScript (Node/TypeScript for the back end & React for the front end), GraphQL, Python, Haskell, and Docker

Compensation is market competitive. Remote work is permissible on occasion and when practical. 9 month contract with a consultancy firm that has a pipeline of opportunities.

Job Type: Contract

Salary: $80.00 to $90.00 /hour

Location:

  • New York, NY (Preferred)

Work authorization:

  • United States (Required)

Contract Length:

  • 7 - 11 months

SENIOR SCALA ENGINEER

14 days ago
£520 - £630/day (Estimated) · Remote · Intec Select

Senior Scala Engineer

Scala, Python, Spark, Kafka, AWS

Overview:
Our client a leading digital distributor of household financial services, requires an experienced Senior Scala Engineer to join a talented team of Big Data Engineers, Subject Matter Experts and Thought-Leaders in this cloud-based Big Data business unit.

As a Senior Engineer in my client’s team you will be driving the development of quality solutions for our customers and be proactive in adopting and championing best practices.

You’ll make a real impact by taking an active role in the team’s agile practices, technical decision making and development, generating value and continuously striving to improve the quality and reliability of our data.

You will be working on complex, challenging and exciting enterprise-level Big Data solutions/programmes which are SUPER cutting-edge.

For this Senior Scala Engineer / Senior Scala Developer role, you must have:

Scala, Spark, Python, AWS

Strong grasp of Big Data, SQL and database technologies

Enthusiasm for agile and lean development

Exposure to operation of business critical data products

AWS, RedShift, Kinesis experience

Git, CI/CD, Test driven approach (TDD/BDD)

Package:

£100,000 package

15% bonus

Dress Down

Remote Working (usually on a Friday/Monday)

Flexible Working (very relaxed/not a clock watch company)

Private HealthCare

Scala, Python, Spark, Kafka, AWS

Please respond to this advert with an up-to-date version of your CV and the leading consultant will be in touch.

Senior Scala Engineer

DATA MODELLING / ARCHITECT

16 days ago
Reston, VA · DAtec

Responsibilities

· Create and maintain optimal data pipeline architecture.
· Assemble large, complex data sets that meet functional/non-functional business requirements.
· Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing code and infrastructure for greater scalability.
· Build re-usable data pipelines to ingest, standardize, and shape data from various zones in the Hadoop data lake.
· Build analytic tools that utilize the data pipeline to provide actionable insights into customer acquisition, revenue management, and digital and marketing areas for operational efficiency and KPIs.
· Design and build BI APIs on established enterprise architecture patterns, for data sharing from various sources.
· Design and integrate data using big data tools – Spark, Scala, Hive, etc.
· Help manage the library of all deployed Application Programming Interfaces (APIs).
· Support API documentation of classes, methods, scenarios, code, design rationales, and contracts.
· Design, build, and maintain a small set of highly flexible and scalable models linked to the client's specific business needs.

Required Qualifications:

· 5+ years of experience in a data engineering/data integration role.
· 5+ years of advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
· Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
· Experience building programs/processes supporting data transformation, data structures, metadata, dependency, and workload management.
· Experience with Data Warehouse and Data Mart ETL implementations using big data technologies.
· Working knowledge of message queuing, stream processing, and scalable data stores.
· Experience with relational SQL and NoSQL databases; Graph databases (preferred).
· Strong experience with object-oriented programming – Java, C++, Scala (preferred).
· Experience with AWS cloud services: Storm, Spark Streaming, etc.
· Experience with API and web services design and development.

Preferred Qualifications:

· Functional experience in hospitality
· End-to-end experience in building data flow processes (from ingestion to the consumption layer)
· Solid working experience with surrounding and supporting disciplines (data quality, data operations, BI/Data WH/Data Lake)
· Effective communicator, collaborator, influencer, and solution seeker across a variety of opinions
· Self-starter, well organized, extremely detail-oriented, an assertive team player, willing to take ownership of responsibilities, and possessing a high level of positive energy and drive
· Excellent time management and organizational skills
· Ability to manage multiple priorities, work well under pressure, and effectively handle concurrent demands to prioritize responsibilities

Job Type: Contract

Big Data Engineer

17 days ago
Chantilly, VA · General Dynamics Mission Systems
Basic Qualifications
Bachelor's degree in software engineering or a related technical field is required (or equivalent experience), plus a minimum of 5 years of relevant experience; or Master's degree plus a minimum of 3 years of relevant experience. Agile experience preferred.

KEY SKILLS
  • Minimum three (3) years' experience designing, developing, building, and implementing Big Data solutions or developing automated solutions to solve complex problems; a demonstrated ability to solve problems may outweigh years of experience.
  • Ability to identify and implement a data solution strategy
  • Demonstrates intellectual curiosity in exploring new technologies and finding creative ways to solve data management problems
  • Experience developing solutions with Python/JavaScript/Perl
  • Experience/knowledge of Spark, Impala, Hadoop, Streamsets, Kafka, Rest APIs
  • Experience in SQL-based and NoSQL-based technologies
  • Experience in Linux administration/troubleshooting
A TS/SCI security clearance with the ability to obtain a Polygraph is required at time of hire. Candidate must be able to obtain the Polygraph within a reasonable amount of time from date of hire. Applicants selected will be subject to a U.S. Government security investigation and must meet eligibility requirements for access to classified information. Due to the nature of work performed within our facilities, U.S. citizenship is required.
Responsibilities for this Position
General Dynamics Mission Systems (GDMS) is seeking motivated candidates to join our insider threat detection, systems integration team. Our mission oriented team is responsible for the design, testing, deployment, maintenance, operation, and evolution of the systems directly supporting the insider threat detection program of a large government customer in the United States Intelligence Community (USIC). GDMS has an immediate opening on the team for a motivated Big Data Engineer with a self-starter mindset who is up to date with the latest tools and techniques. The position will focus on the integration of new data management technologies and software performance tuning and troubleshooting. This is a challenging yet rewarding position that provides an opportunity to leverage cutting edge technologies in pursuit of a vital mission that protects people, sensitive information/technologies, and the national security posture of the USIC.

The majority of work will be performed in Chantilly, Virginia, which is located approximately 25 miles west of Washington D.C., near the Dulles International Airport. The selected Big Data Engineer will support a 6+ year contract that General Dynamics recently secured.

CORE RESPONSIBILITIES:
  • Assist in the development and delivering of large scale data pipelines
  • Develop and implement automated tests for data transformations and data migrations
  • Research and apply big data solution technologies to complex datasets; make recommendations to data science team on new technologies
#CJ3
#CB
Company Overview
General Dynamics Mission Systems (GDMS) engineers a diverse portfolio of high technology solutions, products and services that enable customers to successfully execute missions across all domains of operation. With a global team of 13,000+ top professionals, we partner with the best in industry to expand the bounds of innovation in the defense and scientific arenas. Given the nature of our work and who we are, we value trust, honesty, alignment and transparency. We offer highly competitive benefits and pride ourselves in being a great place to work with a shared sense of purpose. You will also enjoy a flexible work environment where contributions are recognized and rewarded. If who we are and what we do resonates with you, we invite you to join our high performance team!

Azure Big Data Developer

20 days ago
Reston, VA · DAtec Solutions

Essential Job Functions:

  • Production experience in large-scale SQL, NoSQL data infrastructures such as Cosmos DB, Cassandra, MongoDB, HBase, CouchDB, Apache Spark etc.
  • Application experience with SQL databases such as Azure SQL Data Warehouse, MS SQL, Oracle, PostgreSQL, etc.
  • Proficient understanding of code versioning tools (such as Git, CVS, or SVN)
  • Strong debugging skills with the ability to reach out and work with peers to solve complex problems
  • Ability to quickly learn, adapt, and implement Open Source technologies.
  • Familiarity with continuous integration (DevOps)
  • Proven ability to design, implement and document high-quality code in a timely manner.
  • Excellent interpersonal and communication skills, both written and oral.

Educational Qualifications and Experience:

  • Role Specific Experience: 2+ years of experience in Big Data platform development.

Certification Requirements (desired):

Azure Designing and Implementing Big Data Analytics Solutions

Required Skills/Abilities:

  • Experience with NoSQL databases, such as HBase, Cassandra or MongoDB.
  • Proficient in designing efficient and robust ETL/ELT using Data Factory, workflows, schedulers, and event-based triggers.
  • 1+ years of experience with SQL databases (Oracle, MS SQL, PostgreSQL, etc.).
  • 1+ years of hands on experience with data lake implementations, core modernization and data ingestion.
  • 3+ years of Visual Studio C# or core Java.
  • Experience at least in one of the following programming languages: R, Scala, Python, Clojure, F#.
  • 1+ years of experience in Spark systems.
  • Good understanding of multi-temperature data management solution.
  • Practical knowledge in design patterns.
  • In depth knowledge of developing large distributed systems.
  • Good understanding of DevOps tools and automation framework.

Desired Skills/Abilities (not required but a plus):

  • Experience in designing and implementing scalable, distributed systems leveraging cloud computing technologies.
  • Experience with Data Integration on traditional and Hadoop environments.
  • Experience with Azure Time Series Insights.
  • Some knowledge of machine learning tools and libraries such as TensorFlow, Turi, H2O, Spark MLlib, and caret (R).
  • Understanding of AWS data storage and integration with Azure.
  • Some knowledge of graph database.

Job Type: Contract

Application / Solution Architect

20 days ago
Reston, VA · DAtec Solutions

6+ years of experience with solution architecture and solution experience across multiple architecture domains such as business, data, integration, and application
6+ years of experience with system Integration patterns and Integration of Data & Business architectures
5+ years of experience in driving the Enterprise Architecture strategy, methodology and standards that delivers enterprise data model, associated meta model, common business vocabulary and taxonomies
Expertise in AWS/Cloud based managed services such as S3, EMR, Glue, RedShift etc. is considered essential
Strong understanding of SOA concepts, Design Patterns, tools, techniques and best practices
Hands on experience in applying advanced Data Architecture/Design concepts such as Datawarehouse, Datamart, Data Virtualization, Data Lake, Big Data, Logical Datawarehouse
Hands on experience in applying advanced Enterprise Architecture/Design concepts such as Cloud Computing, Microservices, 12 Factor Application development
TOGAF or an equivalent architecture framework certification is preferred
AWS or Cloud based Architecture/equivalent certification is preferred

Desired Skills:
7+ years of Design & Development (Enterprise-grade applications) encompassing backend Design Patterns, APIs, Cloud Adoption, Security framework, ETL, Reporting and Relational data stores
Ability to communicate across diverse audience of business analysts, architects, business users, technical project staff, senior leadership across business and technology
Demonstrate ability to influence others and advocate point of view when appropriate

Qualification:

Ability to apply complex design patterns to defined problems
Ability to manage large development teams
Excellent knowledge in application of design techniques, methodologies, and languages

Job Type: Contract

Big Data Engineer

22 days ago
£600 - £650/day · Remote · Harnham

Big Data Engineer
£600-£650 per day
Initial 3 month contract
London/ Sweden

As a Big Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Big Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Big Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Big Data Engineer you will be coding primarily in Scala and some Java, and will be working both in an on-premise environment as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Big Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.

YOUR SKILLS AND EXPERIENCE:

The successful Big Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £650 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Big Data Architect

25 days ago
£600 - £700/day · Remote · Harnham

Big Data Architect
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Big Data Architect you will be introducing Kafka as a streaming technology for a financial client - from roadmapping to deployment.

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Big Data Architect you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Big Data Architect, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Big Data Architect you will be working on the Kafka roadmap, both in an on-premise environment as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Big Data Architect, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable. Though this will be a client-facing role you must also be prepared to be hands-on when necessary, and therefore experience with Scala and Java is desirable.

YOUR SKILLS AND EXPERIENCE:

The successful Big Data Architect will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Big Data Engineer

25 days ago
£600 - £700/day · Remote · Harnham

Big Data Engineer
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Big Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Big Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Big Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Big Data Engineer you will be coding primarily in Scala and some Java, and will be working both in an on-premise environment as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Big Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.

YOUR SKILLS AND EXPERIENCE:

The successful Big Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Onsite Consulting Engineer

26 days ago
Remote · IronBrick

IronBrick was established more than a decade ago with the vision to reduce the cost, risk, and effort of managing information. Today that vision is still strong, our innovative services and solutions transform IT environments by solving customer’s complex technology and business challenges. IronBrick demonstrates value by focusing on the highest priorities of our customers for IT modernization, from the data center to the cloud and from security to big data & analytics. Our list of federal and enterprise customers spans the globe, and we deliver results by leveraging long-standing partnerships with leading technology companies coupled with our high performing service practice.

Key Responsibilities:

  • Key player in onsite operations & management (O&M) engagement with Federal IT organization encompassing technologies such as NetApp, Commvault, VMware, Cisco UCS, SolarWinds, and Windows Server 2016
  • Perform FlexPod Express configuration and operations, backup management, data migrations, QA testing, and daily activities in the field for Federal customer
  • Assist customer with creation of documentation and scripts for configuration management, asset management, and standard operating procedures
  • Analyze and assess the impact of proposed changes to the environment prior to implementation
  • Work independently to further government initiatives
  • Work remotely with IronBrick Project Manager, Supervisor and/or Solution Architect, and lead operations of IronBrick technical architectures and custom consulting initiatives
  • Quickly adapt to changing scenery and logistics, providing customer assistance and IronBrick knowledge to complement the customer's knowledge base; guide and teach the customer toward the best solution available
  • Adhere to industry and IronBrick best practices for service delivery and assist in the ongoing development and documentation of the internal process and delivery guidelines
  • Manage customer acceptance of the solution and customer expectations within the scope of the engagement
  • Responsible for streamlined communications to Customer and IronBrick engagement team regarding the project status and progress

Basic Qualifications:

  • Bachelor’s degree in Information Technology or related field or the equivalent in experience;
  • Minimum SECRET clearance;
  • Ability to work out of customer site in Washington, DC with occasional local travel
  • Minimum two (2) years of experience with NetApp Clustered Data ONTAP, including backup and disaster recovery technical features such as SnapMirror, SnapVault, and SVM DR;
  • Minimum two (2) years of experience with VMware vSphere;
  • Minimum two (2) years of experience with the Windows Server operating system;
  • Experience with CommVault Backup & Recovery solutions, including some implementation experience
  • Basic Layer 2 switching knowledge
  • Experience with Microsoft Active Directory and understanding of LDAP and Active Directory integration with COTS applications;
  • Understanding of infrastructure services including: DNS, DHCP, and SMTP;
  • Experience creating/modifying MS Visio diagrams and creating as-built documentation in MS Word;
  • Experience with design, implementation, and support of multi-tiered application architecture;
  • Ability to develop documentation for delivery to customer;
  • Ability to handle critical information and sensitive situations;
  • Ability to use sound judgment to effectively solve problems;
  • Ability to work with minimal supervision and efficiently handle multiple projects with shifting priorities;
  • Excellent analytical/problem solving and troubleshooting skills in both structured and unstructured environments; and
  • Strong verbal and written communication, presentation and interpersonal skills.

Additional Qualifications:

  • Understanding of Cisco Nexus switches and features such as vPC
  • Familiarity with Cisco UCS C-Series servers, CIMC features, and VIC functionality
  • Familiarity with PowerShell scripting
  • Deployment and support experience for Orion SolarWinds (NPM, NTA, NCM, and LEM modules);
  • Understanding of Microsoft SQL Server clustering, consisting of both default and named instances;
  • Ability to configure RAID and install an Operating System on a rack mount server
  • Background in Microsoft High Availability Clustering; and
  • Industry Certifications
    • NetApp: NCDA
    • Cisco: CCNA
    • VMware: VCP

Why Work for IronBrick:

Our people are our greatest assets. We offer exciting and challenging career opportunities with a competitive and a comprehensive benefits package that includes the following:

  • Employer subsidized medical, dental and vision coverage;
  • Employer-paid life insurance, short-term and long-term disability;
  • Employer matching 401(K) with immediate vesting;
  • Highly competitive PTO allowance and holidays;
  • Maternity leave and Paternity leave;
  • Tuition Reimbursement Program.

Sr. Informatica Administrator

26 days ago
Reston, VA · Alchemy Software Solutions

Any visa is fine for this role except OPT

Primary Skills:

  • Must have hands-on experience in development, engineering, and operational product platform support in Informatica tools.
  • 8+ years of strong hands-on Informatica PowerCenter and Data Quality administration (versions 9.6/10.x). Solid experience in development, administration, architecting, engineering, and setting up Informatica environments.
  • 7+ years of hands-on experience in configuration, development, performance benchmarks, monitoring, best practices, design patterns, and use cases of PowerCenter, Data Quality, the Informatica Developer component, and Data Virtualization.
  • 7+ years of hands-on experience in planning and upgrading large-scale ETL (Informatica PowerCenter/Data Quality) environments and Enterprise Data Warehouse applications.
  • 7+ years of hands-on Informatica process automation experience using Perl and Unix shell scripting.
  • Experience with Informatica Intelligent Cloud Services
  • 7+ years of hands-on Oracle and SQL experience.
  • 5+ years of experience with XML, CSS, and HTML.
  • 5+ years of experience with web service development.
  • Strong analytical skills and ability to troubleshoot and resolve problems.
  • Must have strong communication skills, both oral and written. Including ability to work with all management levels.
  • Must have proven ability to manage priorities and timelines.
  • Must have high integrity and accountability, willing to do what it takes to make the team and project for the enterprise successful.
  • Must be extremely responsive, able to work under pressure and with a strong sense of urgency.
  • Ability to share technical knowledge and clearly communicate technical concepts.
  • Strong leadership skills with ability to articulate ideas and issues in a team environment.
  • Should have AWS basic awareness, experience with Cloud offered ETL services is a plus
  • Hadoop, Big Data, Python, Ab Initio, Netezza is a plus.

Job Type: Contract

Sr Data Engineer Specialist

26 days ago
Remote | Proactive Focus

Looking for experienced Sr Data Engineer resources for one of our clients, a partner based in San Francisco, CA. For this position, the resource should have a minimum of 10+ years of working experience; for qualified candidates, the client may allow working remotely.

The candidate should have the following skills:

  • Expert at big data processing and building data pipelines in GCP.
  • Well versed in Java at an advanced level and BigQuery at an intermediate level.
  • Hands-on work with Apache Beam, Dataflow, Cloud Dataproc, Pub/Sub, BigQuery, and Cloud SQL.
  • Tech lead experience, including creating their own JIRA tickets.

The following skills are good to have:

  • Google Cloud, APIs, Bigtable, Stackdriver
  • Python (intermediate level)
  • Spark and sprint planning experience.

Job Types: Full-time, Contract

Experience:

  • spark: 1 year (Required)
  • Java: 10 years (Required)
  • Data Engineer: 10 years (Required)
  • Python: 10 years (Required)

Education:

  • Master's (Required)

Location:

  • San Francisco, CA (Required)

Work authorization:

  • United States (Required)

Work Location:

  • One location
  • Remote/Work from home

Full Time Opportunity:

  • No

IT - Applications Development Consultant III

1 month ago
$50 - $65/hour | Remote | Essani International

Title: Data Engineer

Location: Franklin, Tennessee 37067

Duration: 6+ months Contract Role

Mode of Interview: Webex/Skype

Cloud and Big Data related Projects

BIE- Analytics

What are the top 5-10 responsibilities for this position:

  • Designing and building production data pipelines from ingestion to consumption within a big data architecture, using Azure Services, Python, Scala or other custom programming
  • Provide guidance around modern data warehouse design, implementation for migration from on premises to Azure
  • Perform detail assessments of current state data platform and create an appropriate transition path to Cloud technologies
  • Responsible for developing and implementing Big Data platforms using a cloud platform with structured and unstructured data sources.
  • Connecting and automating data sources, along with building visualizations.
  • Configuring, connecting, and setting up the infrastructure.
What software tools/skills are needed to perform these daily responsibilities?

  • Python, Scala, Azure, ELT, ETL

What skills/attributes are a must have?

  • 3 years of Data Warehousing and Big Data tools and technologies.
  • Demonstrated expertise with object-oriented development languages (.NET, Java, etc.) preferred
  • 1+ years of experience with programming languages such as Python, Java, Scala
  • 2+ years of advanced experience with both relational and NoSQL databases
  • 3+ years of advanced experience in Data Modeling, Data Structures, and Algorithms
  • Experience with cloud environments such as AWS or Azure
  • Advanced knowledge of Linux and shell programming

What skills/attributes are nice to have?

  • Strong analytical skills in problem solving, troubleshooting, and issue resolution
  • Ability to communicate effectively (oral and written) across multiple teams, facilitate meetings, and coordinate activities
  • Experience with Kafka
  • Understanding of CI/CD tools and technologies
  • Experience in Data Sciences & Machine Learning is a plus
  • Healthcare industry experience

Job Type: Contract

Salary: $50.00 to $65.00 /hour

Experience:

  • Python ,Java, Scala: 1 year (Preferred)
  • .NET: 1 year (Preferred)
  • AWS: 1 year (Preferred)
  • Scala: 1 year (Preferred)
  • Machine Learning: 1 year (Preferred)
  • Cloud Environments such as AWS OR Azure: 1 year (Preferred)
  • Relational and NoSQL Databases: 2 years (Preferred)
  • Data Modeling, Data Structures, and Algorithms: 3 years (Preferred)
  • Java: 1 year (Preferred)
  • Data Sciences & Machine Learning: 1 year (Preferred)
  • Healthcare industry: 1 year (Preferred)
  • Kafka, CI/CD tools and technologies: 1 year (Preferred)

Work Location:

  • One location

Technical Lead/SW Developer

1 month ago
$55 - $70/hour (Estimated) | Herndon, VA | Praxis Engineering

What you will be doing

The applications we develop aid analysts in solving complex analytical problems. This is a unique opportunity for a developer to design and develop mission applications and lead small project teams. The candidate would have the opportunity to develop applications with varying degrees of scale, risk and complexity. Lead developer duties may include serving as the technical lead for the design, testing and implementation of complex applications, complex web application layouts, content and user interfaces and/or database projects.

Duties may include but are not limited to:

  • The developer consults with clients to develop complex user requirements;
  • Translates user requirements from a formal requirements document into an application and/or database design;
  • Writes interfaces to companion applications or databases;
  • Writes necessary code;
  • Ensures interoperability with other applications;
  • Provides extensive technical advice and guidance to Application Developers and other members of the team;
  • Generates comprehensive test plans to ensure that adequate unit, system, and integration testing is performed;
  • Oversees unit/functional testing and transitioning of the complex application and/or database to production;
  • Directs contract personnel and continually works to optimize resources to better meet Sponsor priorities.
  • Supports internal meetings (e.g. staff meetings, schedule reviews, status meetings), external stakeholder meetings, and working groups. Ensures that appropriate contract personnel attend project meetings where specific expertise is required.
  • Communicates regularly with Sponsor Leadership to provide an integrated view into project status, schedule, and performance.

What you will need

  • Minimum of 6 years demonstrated on-the-job experience with full life-cycle application software development
  • Minimum of 4 years specialized experience with hands on involvement in all aspects of the application software development life-cycle (from requirements analysis through design, programming, testing, and deployment)
  • Demonstrated on-the-job experience working with legacy application conversion (redeployment to alternative platforms and languages) and technology assessment
  • Demonstrated on-the-job experience with at least 2 of the following key technology areas:
    • Object-Oriented analysis, design and development using one of the following relational database technologies:
      • Oracle
      • MySQL
      • PostgreSQL
    • JavaScript application development with strong skills in HTML5, CSS, and JavaScript ES6
    • Ruby on Rails
    • Javascript framework technologies, like Vue.js, with an emphasis on reusable components
  • Demonstrated on-the-job experience developing applications utilizing the model, view, control (MVC) architectural pattern
  • Demonstrated on-the-job experience with version control systems like Subversion or Git
  • Demonstrated on-the-job experience with the product life cycle to include the maintenance of production systems
  • Demonstrated experience leading a development team

Optional Skills:

  • Demonstrated on-the-job experience exhibiting customer service skills
  • Demonstrated on-the-job experience working in a team environment exhibiting problem solving skills
  • Demonstrated on-the-job experience with the Sponsor's Ruby on Rails platform
  • Demonstrated on-the-job experience with the Sponsor or the Sponsor's partners and their current technology issues
  • Demonstrated on-the-job experience exhibiting written skills, to include technical documentation regarding Sun Solaris and Linux
  • Demonstrated experience deploying applications to a cloud infrastructure using Amazon Web Services (AWS)

Clearance

TS/SCI with appropriate Polygraph

What it takes:
  • A 'be the best of the best' attitude
  • Hunger to be at the leading edge of your field
  • Strong work ethic
  • Ability to work effectively in individual and team environments
  • Intellectual curiosity along with an analytical mind
  • Patriotism:
    • willingness to work towards the betterment of your countrymen
    • desire to be the 1st line of defense to protect your nation
    • passion to protect and serve

Who are we?

Praxis Engineering was founded in 2002 and is headquartered in Annapolis Junction, MD, with growing offices in Chantilly, VA and Aberdeen, MD.

Praxis Engineering is a consulting, product, and solutions firm dedicated to the practical application of software and system engineering technologies to solve complex problems.

With over 350 employees supporting more than 50 contracts, Praxis brings together world class engineers with proven engineering best practices, domain expertise, commercial technologies and proven agile management approaches to create high value solutions aimed at helping our customers meet their most critical business and mission objectives.

  • Praxis Engineering is a wholly owned subsidiary of General Dynamics IT.

Why Praxis?

We are focused on continual learning and evolution. We don’t do things because “that’s the way we’ve always done things”; we listen to our employees and adapt to the changing marketplace. We look at the big picture and encourage our engineers to get training and certifications in emerging technologies that will help shape our customer’s mission. We've been profitable year after year. We're always on the lookout for great engineers to join the team and we recognize that our employees are the heart and soul of what we do. We focus on recruiting talented people, treating them right, and then allowing them to do what they do best. No red tape. No micromanagement. Smart people want to work with smart people, and we love people who are passionate about what they do, and finding ways to do it better.

And then there is the...

Benefits

  • Competitive salary
  • Office perks such as free soft drinks and snacks (both healthy and not-so-healthy)
  • Praxis swag (annual gift certificate to purchase top brand Praxis apparel)
  • Comprehensive health insurance plan
  • 401(k) retirement plan with company paid contribution (amount equivalent to 7% of your salary)
  • Annual bonus plan
  • Four weeks Paid Time Off + 10 holidays + comp time eligibility
  • Training is a priority! Take advantage of our endless in-house training opportunities - or seek out vendor offered (paid) training opportunities like conferences, certification courses and seminars.
    • Conferences (recently attended by Praxis employees): AWS Summit, IoT World, Black Hat and DefCon.
    • Training & Certifications: Splunk, AWS, Big Data/Cloudera, VMWare, Scrum Master...the list of certifications goes on and on!
    • Praxis University: Cyber Research, Data Analytics, IoT, AWS and RedHat course offerings and hands-on training.
  • We truly believe the right work-life balance can exist, and it's here at Praxis. Our work is extremely important, but your job is just a part of who you are. When you enjoy your life outside of our walls, you're at your best the next time you walk through our doors. We do all we can to assure that happens every day.

Praxis Engineering provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, or any other protected class.

Kafka Data Engineer

1 month ago
£600 - £700/day | Remote | Harnham

Kafka Data Engineer
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Kafka Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Kafka Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Kafka Data Engineer you will be coding primarily in Scala and some Java, and will be working both in an on-premise environment as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Kafka Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.

YOUR SKILLS AND EXPERIENCE:

The successful Kafka Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Big Data Architect

1 month ago
£650 - £700/day | Remote | Harnham

Big Data Architect
£650-£700 per day
Initial 3 month contract
London/ Sweden

As a Big Data Architect you will be helping to create the Kafka architecture and outline the strategy for migration to the cloud!

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Big Data Architect you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside Data Engineers.

THE ROLE:

As a Big Data Architect, your main responsibility will be creating the Kafka architecture from design to implementation. Therefore it is imperative that you have extensive experience with Kafka for large implementations, ideally Confluent Kafka. As a Big Data Architect it is essential you have a good understanding of technologies such as Spark and Hadoop, as you will be helping to implement these. You will be working in both an on-premise environment as well as cloud environments, and so it is valuable if you have worked in either AWS or Azure as a platform. Though you will be heavily involved in the planning and writing of roadmaps, you must be prepared to be hands-on, and therefore previous experience programming in Scala/Java is valuable. As you will be working for a consultancy it is essential that you are confident speaking with non-technical people, as this role will be very client facing.

YOUR SKILLS AND EXPERIENCE:

The successful Big Data Architect will have the following skills and experience:

  • Extensive experience implementing strategies using Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • Experience speaking to stakeholders
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.

Data Engineer

1 month ago
£600 - £700/day | Remote | Harnham

Data Engineer
£600-£700 per day
Initial 3 month contract
London/ Sweden

As a Data Engineer you will be working heavily with Kafka for streaming purposes alongside Scala for programming

THE COMPANY:

You will be working for a leading consultancy who specialise in Data Engineering, DevOps and Data Science. As a Data Engineer you will be assisting a Scandinavian client in helping them introduce Kafka and therefore your time will be split 50/50 between being based in Sweden and London - with the option to work remotely. You will be working in an agile environment alongside a Data Architect and other Data Engineers.

THE ROLE:

As a Data Engineer, you will be heavily involved in introducing Kafka as a technology. Therefore it is imperative that you have extensive experience with Kafka, ideally Confluent Kafka. As a Data Engineer you will be coding primarily in Scala and some Java, and will be working both in an on-premise environment as well as cloud environments. It is most valuable if you have good exposure to either AWS or Azure as a platform. Though Kafka will be the main technology you will be focusing on introducing as a Data Engineer, experience with Spark and exposure to big data platforms such as Hadoop is highly valuable.

YOUR SKILLS AND EXPERIENCE:

The successful Data Engineer will have the following skills and experience:

  • Extensive experience with Kafka
  • Good knowledge of either AWS or Azure
  • Good experience coding in Scala and Java
  • Previous experience using Big Data technologies such as Spark/ Hadoop
  • The ideal candidate will also have commercial experience in Confluent Kafka

THE BENEFITS:

  • A competitive day rate of up to £700 per day

HOW TO APPLY:

Please register your interest by sending your CV to Anna Greenhill via the Apply link on this page.


Graphic Designer / Web-designer (Junior)

1 month ago
Remote | Yolap Ltd

Hi artists!

I am looking for a Graphic Design / Web-design freelancer with good knowledge of UI/UX to join my organisation as a freelance collaborator or as a part-time intern to work on projects upon request.

What is it about? Let me explain...

I am an entrepreneur working on various digital projects and I often collaborate with freelancers in order to deliver websites, branding materials, advertisement campaigns... My clients can be anybody: a start-up based in Old Street, a South Korean corporation, or a French Big Data company.

I am currently dealing with potential (big) clients and there will be good opportunities coming soon to work on a CRM platform, plus many other small projects (do you fancy designing a cosmetic brand for a French entrepreneur?).

Your profile?

  • someone willing to adapt, commit and work at any moment (evenings after workdays, weekends)
  • someone interested in breaking into the start-up world... a combination of artist and entrepreneurial spirit
  • someone keen on wireframing... desktop and mobile
  • someone expert in the Adobe Suite (Illustrator, InDesign, Photoshop)
  • someone who is not afraid to take risks

There is no seat at a cool hipster office in Shoreditch...

Yep, sorry for that folks! My team and I work remotely and we occasionally catch up in co-working places or even in cool pubs around East London. Most of my collaborators are based abroad (France, the Balkans), thus we are meant to travel.

Benefits and compensations?

  • flexibility
  • a pass to join our huge network
  • opportunity to work with international clients
  • commission on project
  • eligibility to get equity in the company

If that suits you please provide a short description of yourself (3 sentences) and your portfolio. No CV and no cover letter needed.

Cheers,