17 Hadoop contracts

Hadoop Developer

9 days ago
Remote | Diverse Lynx
Role: Hadoop Developer
Remote work allowed for duration of assignment (Yes/No): Yes, for the duration of COVID
Exact job location/studio/work address: Blue Bell, PA / Hartford, CT
Duration: 12+ months contract
Roles/Responsibilities (5-8 day-to-day candidate responsibilities): Position Responsibilities
  • Good understanding of the Hadoop ecosystem and YARN architecture
  • Writing high-performance Hive queries
  • Good experience in Unix and shell scripting
  • Hands-on experience in Spark with Python and Scala
  • Hands-on experience loading and manipulating large data sets using Spark (see the sketch after the qualifications below)
  • SQL and Hive knowledge for debugging and troubleshooting Hadoop jobs
  • Prepare implementation plans as needed and build the in-scope applications using Big Data technologies
  • Responsible for all technical deliveries of the project
  • Manage data-related requests, analyze issues, and provide efficient resolution. Design all program specifications and perform required tests.
  • Prepare code for all modules according to the required specifications.
  • Monitor all production issues and inquiries and provide efficient resolution.
  • Evaluate all functional requirements, map documents, and troubleshoot all development processes.
  • Collaborate with application groups to prepare effective solutions for all programs.
  • Document all technical specifications and associated project deliverables.
  • Design all test cases to provide support to all systems and perform unit tests.
  • Good understanding of Agile and DevOps methodologies
  • Good communication and client-interfacing skills
  • Should have worked in an offshore delivery model


Position Qualifications
  • Must have 7+ years of experience with the Hadoop ecosystem: HDFS, Hive, Sqoop, and Spark
  • Bachelor's degree in Computer Science, Engineering, or other STEM programs preferred
  • Should have experience developing frameworks and ETL processes using Hive, Scala, and Spark
  • Experience designing, building, and managing data marts, with processing in Hive and Spark
  • Experience in Python programming
  • Good experience in Unix and shell scripting
  • Excellent analytical and technical skills, with experience working in an onsite/offshore model
Required Qualifications (5-8 bullet points on must-have skills): Apache Hadoop, Hive, MapReduce, Spark
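
Not part of the posting: a minimal PySpark sketch of the Hive-plus-Spark work described above (loading a large Hive table, aggregating it with Spark, and writing the result back for downstream Hive queries). The database, table, and column names are hypothetical.

# Hypothetical PySpark sketch: load a large Hive table, aggregate it with Spark,
# and write the result back as a partitioned Hive table. All names are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-aggregation-example")
    .enableHiveSupport()   # lets Spark read tables registered in the Hive metastore
    .getOrCreate()
)

# Load a large Hive table and aggregate it with Spark
txns = spark.table("sales_db.transactions")
daily_totals = (
    txns.filter(F.col("amount") > 0)
        .groupBy("region", "txn_date")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
)

# Persist as a partitioned Hive table so downstream Hive queries can use it
(daily_totals.write
             .mode("overwrite")
             .partitionBy("txn_date")
             .saveAsTable("sales_db.daily_totals"))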






Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.




Data Engineer (100% remote)

10 days ago
$50 - $57/hour | Remote | Innovyt
Integration of reporting tools to various databases including SnowFlake, SQL Server, Oracle and Hadoop with an additive focus on user credential/database…

Python Developer

21 days ago
£300 - £350/day | Remote | Outside IR35 | Avanti

Avanti are recruiting for a Python Developer with experience in Spark, Scala & PySpark. The role is a 6-12 month contract (most likely 12 months), outside IR35 and fully remote.

7+ years' development experience

Extensive experience in Spark, Scala, PySpark, Python

Experience in Oracle, Unix

Hadoop, Tableau, JIRA, BitBucket, SonarQube, TeamCity, uDeploy, ServiceNow

Experience in Agile projects

Good understanding of customer data & Payments

Understanding of Transaction banking domain

To be considered please click APPLY NOW below

AWS Engineer - SC Clearance

3 days ago
Remote | AdSwipe
AWS Engineer - Remote - Contract - SC Clearance

Our client is a large IT consulting company currently looking for an AWS Engineer with experience in building CI/CD pipelines to provide technical expertise and hands-on day-to-day support on a number of end-client projects.

Key elements of this role include:

Knowledge of infrastructure as code
Proven networking experience (familiar with routing, peering, DNS)
Service Discovery
Monitoring / Logging
Familiar with PKI
Proven experience building CI/CD pipelines

Essential Skills and Experience:

Amazon ECS/EC2/ EMR/ Route53
Python / Java
Prometheus
Docker / Containerisation
Terraform
Hadoop / Spark

The role is fully remote with an initial contract length of 2 months. Successful candidates will have SC Clearance or be eligible to apply.

If you would like to find out more about this opportunity please send your CV to (url removed)

AWS ENGINEER | SC CLEARANCE | REMOTE CONTRACT | CI/CD PIPELINES | TERRAFORM
Role: AWS Engineer - SC Clearance
Job Type: Contract
Location: London


AWS Engineer - SC Clearance

9 days ago
Remote | Technojobs

AWS Engineer - Remote - Contract - SC Clearance

Our client is a large IT consulting company currently looking for an AWS Engineer with experience in building CI/CD pipelines to provide technical expertise and hands-on day-to-day support on a number of end-client projects.

Key elements of this role include:

  • Knowledge of infrastructure as code
  • Proven networking experience (familiar with routing, peering, DNS)
  • Service Discovery
  • Monitoring / Logging
  • Familiar with PKI
  • Proven experience building CI/CD pipelines

Essential Skills and Experience:

  • Amazon ECS/EC2/ EMR/ Route53
  • Python / Java
  • Prometheus
  • Docker / Containerisation
  • Terraform
  • Hadoop / Spark

The role is fully remote with an initial contract length of 2 months. Successful candidates will have SC Clearance or be eligible to apply.

    If you would like to find out more about this opportunity, please send your CV via the recruiter contact details below.

    AWS ENGINEER | SC CLEARANCE | REMOTE CONTRACT | CI/CD PIPELINES | TERRAFORM

    Contact Name: Celesta Ocansey
    Reference: TJ/3086/0102AWSE21CO_1612170223
    Job ID: 2893161

    AWS ENGINEER

    20 days ago
    Remote | jobswipe
    AWS Engineer | Remote Work | Contract

    My client is an IT consultancy firm currently working on a government programme and is seeking an SC-cleared (or eligible to go through the process) AWS Engineer to help with a number of projects on a contractual basis.

    My client is looking for an AWS Engineer who :

    has a good understanding of infrastructure as code
    is experienced with networking and familiar with routing, peering, DNS
    can establish a service discovery protocol
    is competent with monitoring / logging
    is familiar with PKI
    has experience building CI/CD pipelines
    The specific technologies / products required are :

    Amazon ECS/EC2/EMR/Route53
    Python / Java (if we go down the route of writing a custom encryption material provider for EMR, Java will be necessary)
    Prometheus (see the sketch after this list)
    Docker / Containerisation
    Terraform
    Hadoop / Spark
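
    Not part of the posting: a minimal, hypothetical sketch of the Python-plus-Prometheus combination listed above, exposing a custom gauge for a Prometheus server to scrape. The metric name, port, and measurement are made up, and it assumes the prometheus_client package is installed.

    # Hypothetical sketch: expose a custom metric for Prometheus to scrape.
    # Requires the prometheus_client package; metric name and port are illustrative.
    import random
    import time

    from prometheus_client import Gauge, start_http_server

    QUEUE_DEPTH = Gauge("example_ingest_queue_depth", "Items waiting in the ingest queue")

    if __name__ == "__main__":
        start_http_server(8000)  # metrics served at http://localhost:8000/metrics
        while True:
            QUEUE_DEPTH.set(random.randint(0, 100))  # stand-in for a real measurement
            time.sleep(15)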

    Initially, the contract length will be 2 months, with a highly lucrative per-day rate for the successful candidate. There is scope for further work due to the number of AWS projects my client is consulting on.

    AWS Engineer | Remote Work | Contract

    If this is you, or you know anyone who is looking for a new contract role, please email : or apply below
    Role: AWS ENGINEER
    Job Type: Contract
    Location: City of London, City and County of the City of London


    AWS Engineer - SC Clearance

    20 days ago
    Remote | jobswipe
    AWS Engineer - Remote - Contract - SC Clearance

    Our client is a large IT consulting company currently looking for an AWS Engineer with experience in building CI/CD pipelines to provide technical expertise and hands-on day-to-day support on a number of end-client projects.

    Key elements of this role include:

    Knowledge of infrastructure as code
    Proven networking experience (familiar with routing, peering, DNS)
    Service Discovery
    Monitoring / Logging
    Familiar with PKI
    Proven experience building CI/CD pipelines

    Essential Skills and Experience:

    Amazon ECS/EC2/ EMR/ Route53
    Python / Java
    Prometheus
    Docker / Containerisation
    Terraform
    Hadoop / Spark

    The role is fully remote with an initial contract length of 2 months. Successful candidates will have SC Clearance or be eligible to apply.

    If you would like to find out more about this opportunity please send your CV to (url removed)

    AWS ENGINEER | SC CLEARANCE | REMOTE CONTRACT | CI/CD PIPELINES | TERRAFORM
    Role: AWS Engineer - SC Clearance
    Job Type: Contract
    Location: London


    Spotfire Developer

    10 days ago
    Remote | Next Ventures
  • Practice Development & Integration

  • Technologies Design Development Skills

  • Spotfire Developer – Pharmaceuticals – Contract – 12 months – Remote working – India/Bangalore

    Job Title: Tibco Spotfire Developer

    Next Ventures seeks 2 x skilled Spotfire Developers for a long-term project with a European client, delivered from Bangalore.

    Roles & Responsibilities:

    The candidate will be a critical development and support resource, using Tibco Spotfire to deliver integrated reporting and analytics services. The candidate will also help train end users and the core internal and external development teams, and support building visualizations using Spotfire, Python, SAS, and R.

    • Work with analytics, account, and customer stakeholders to define and assimilate requirements, and to design and architect the client's Spotfire reporting and measurement solution(s). Create and support the application of Spotfire development best practices and assist in troubleshooting complex challenges.
    • Develop reusable, medium-complexity reports and visualizations, reusable data sets for reporting, and analytic integrations, working with the customer, internal teams, and analytics/data scientists where required.
    • Lead training of users in varying roles on the use of Tibco Spotfire and the reporting solution developed. Assist with the development of reusable client training materials to be used in training sessions.
    • Work with internal technology teams to optimize Spotfire and big data environments to support the reporting and analytics solutions.

    Qualification, Skills & Experience:

    • Bachelor's degree in Informatics, Information Systems, Engineering or a related field, with 6+ years' relevant experience
    • Proven specialist knowledge of the pharma development process, e.g. Regulatory Affairs, Biostatistics, Drug Safety, Clinical Operations
    • Extensive knowledge of technologies and trends related to Research and Development, including big data, advanced analytics and the Internet of Things
    • Prior experience completing a full computerized systems validation and testing methodology, with awareness of the risks, issues, complications, and activities involved in these processes
    • Versed in TIBCO Spotfire best practices and able to incorporate and influence their use in reporting solutions
    • Advanced Oracle SQL, data visualization, modelling and wrangling skills using Spotfire are required
    • Strong customer-facing skills and experienced in training end users of all levels in the use of Tibco Spotfire
    • Prior experience integrating R analytics, Python/CSS/HTML and JavaScript, and developing integrated data solutions to support high-performance reporting and analytics
    • Experienced working in iterative and agile reporting environments
    • Versed in Tibco Spotfire Server administration, configuration, and troubleshooting
    • Leading, influencing, and consulting skills
    • Advanced SQL and data blending skills preferred

    Expert-level Tibco Spotfire report and dashboard development experience in the biotech industry, with exposure to Informatica, Hadoop and other related big data technologies.

    Excellent written and verbal communication skills; able to communicate effectively with internal and external contact persons.

    Excellent negotiation and conflict management skills. Analytical skills, quick perception and excellent judgment; able to identify risks and problems, to develop adequate problem-solving strategies even in complex situations, and to take appropriate measures when required. Strategic thinking beyond own function; is familiar with and considers overall business objectives and company strategy.

    Call or email for more info :

    Matt@next-ventures.com
    +442075494034


    Data Scientist - SQL/Python

    3 days ago
    Remote | AdSwipe
    We have a remote-working contract opportunity available for an experienced Data Scientist Consultant to join our global ecommerce client's Global Tax Technology team.

    Your core functions
    • Support AI/ML model development for Tax technology
    • Monitor model performance and update/re-train as needed
    • Develop model success metrics, automate reports for monitoring them
    • Data analysis, visualization, and deriving insights
    • Maintain a close and proactive relationship with the engineering team for successful deployment of the model

    Skills required

    • Excellent technical skills for Machine learning and process development, especially SQL and Python
    • Experienced in handling big data (Hadoop, Teradata, etc.)
    • Experience in ML model development and implementation
    • Proficient in Advanced Excel
    • Experience with Tableau and other Reporting dashboards
    • Agile work experience
    • Ability to translate commercial requirements into software solutions
    • Capable of working independently
    Basic Qualifications:
    • 8-10 years’ hands-on experience in data analytics
    • Proven knowledge in SQL coding and Python programming

    Some details on the day-to-day work:
    1. Monitoring performance of live models (i.e. models already live on the site); see the sketch after this list
    a. Excellent reporting skills; build automated reports that can be refreshed ad hoc or on a schedule
    b. Identify the key metrics needed for model monitoring
    c. Check for variation in accuracy and other key metrics, and find the impact/root cause if accuracy is dropping
    d. Identify next steps and communicate them to the business
    2. Regular updates of existing models using recent data, or re-training when there are business changes
    3. Build new models for new countries
    a. Data preparation
    b. Category mappings, done with help from the BU (but this person is to initiate and coordinate this)
    c. Data upload to the Vertex server (cloud environment; no separate skills or Azure knowledge required for this part, it can be picked up easily)
    d. Train/test models using the Lucy algorithm
    e. Important part: model assessment, distinguishing categories doing well vs. not, based on pre-decided metrics
    f. Create the report and next steps for model improvement where needed, and communicate well to stakeholders
    g. Model improvement could include further data cleansing/manipulation/transformations, or even trying new techniques/algorithms for the low-performing categories
    4. Coordinating with Engineering teams to get the models into production.
    a. Need to be able to communicate well, explain the details, and hand over models to the Engineering team
    Also, ensure the Engineering process of data warehouse updates etc. (on live model results) works for us, as we need that data for Reporting
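
    Not part of the posting: a minimal sketch of the model-monitoring step in item 1, assuming a pandas DataFrame of recent predictions with hypothetical column names (y_true, y_pred, category), a hypothetical baseline accuracy, and a made-up alert threshold.

    # Hypothetical sketch of item 1: monitor a live model's accuracy and flag drift.
    # Column names, baseline, threshold, and file path are illustrative only.
    import pandas as pd
    from sklearn.metrics import accuracy_score

    BASELINE_ACCURACY = 0.92   # accuracy measured at model sign-off (made up)
    ALERT_THRESHOLD = 0.05     # alert if accuracy drops by more than 5 points

    def check_model_drift(scored: pd.DataFrame) -> dict:
        """Compare recent accuracy against the baseline and summarise per category."""
        overall = accuracy_score(scored["y_true"], scored["y_pred"])
        per_category = (
            scored.groupby("category")
                  .apply(lambda g: accuracy_score(g["y_true"], g["y_pred"]))
                  .sort_values()
        )
        return {
            "overall_accuracy": overall,
            "accuracy_drop": BASELINE_ACCURACY - overall,
            "needs_alert": (BASELINE_ACCURACY - overall) > ALERT_THRESHOLD,
            "worst_categories": per_category.head(5).to_dict(),
        }

    if __name__ == "__main__":
        recent = pd.read_csv("recent_predictions.csv")  # columns: y_true, y_pred, category
        print(check_model_drift(recent))
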
    Role: Data Scientist - SQL/Python
    Job Type: Contract
    Location: London


    Machine Learning Data Scientist (SQL/Python)

    6 days ago
    £550 - £580/day | Great rate | Remote | IT Talent Solutions Ltd

    We have a remote-working contract opportunity available for an experienced Data Scientist Consultant to join our global ecommerce client's Global Tax Technology team.


    Your core functions
    • Support AI/ML model development for Tax technology
    • Monitor model performance and update/re-train as needed
    • Develop model success metrics, automate reports for monitoring them
    • Data analysis, visualization, and deriving insights
    • Maintain a close and proactive relationship with the engineering team for successful deployment of the model

    Skills required

    • Excellent technical skills for Machine learning and process development, especially SQL and Python
    • Experienced in handling big data (Hadoop, Teradata, etc.)
    • Experience in ML model development and implementation
    • Proficient in Advanced Excel
    • Experience with Tableau and other Reporting dashboards
    • Agile work experience
    • Ability to translate commercial requirements into software solutions
    • Capable of working independently
    Basic Qualifications:
    • 8-10 years’ hands-on experience in data analytics
    • Proven knowledge in SQL coding and Python programming

    Some details on the day-to-day work:
    1. Monitoring performance of live models (i.e. models already live on the site)
    a. Excellent reporting skills; build automated reports that can be refreshed ad hoc or on a schedule
    b. Identify the key metrics needed for model monitoring
    c. Check for variation in accuracy and other key metrics, and find the impact/root cause if accuracy is dropping
    d. Identify next steps and communicate them to the business
    2. Regular updates of existing models using recent data, or re-training when there are business changes
    3. Build new models for new countries
    a. Data preparation
    b. Category mappings, done with help from the BU (but this person is to initiate and coordinate this)
    c. Data upload to the Vertex server (cloud environment; no separate skills or Azure knowledge required for this part, it can be picked up easily)
    d. Train/test models using the Lucy algorithm
    e. Important part: model assessment, distinguishing categories doing well vs. not, based on pre-decided metrics
    f. Create the report and next steps for model improvement where needed, and communicate well to stakeholders
    g. Model improvement could include further data cleansing/manipulation/transformations, or even trying new techniques/algorithms for the low-performing categories
    4. Coordinating with Engineering teams to get the models into production.
    a. Need to be able to communicate well, explain the details, and hand over models to the Engineering team
    Also, ensure the Engineering process of data warehouse updates etc. (on live model results) works for us, as we need that data for Reporting

    Big Data Engineer

    15 days ago
    $40 - $59/hour | Remote | big data llc
    Job details
    Salary: $40 - $59 an hour
    Job Type: Full-time, Contract
    Number of hires for this role: 2 to 4
    Qualifications
      • Spark: 2 years (Required)
      • US work authorization (Required)
      • Bachelor's (Preferred)
      • Data Warehouse: 2 years (Preferred)
      • Java: 3 years (Preferred)
      • Python: 3 years (Preferred)

    Full Job Description

    At Big Data LLC, we work on data engineering platforms and distributed computing, on various challenging and interesting problems.

    What we’re looking for:

    You’re a talented, creative, and motivated engineer who loves developing powerful, stable, and intuitive apps – and you’re excited to work with a team of individuals with that same passion. You’ve accumulated years of experience, and you’re excited about taking your mastery of Big Data and Scala/Java to a new level. You enjoy challenging projects involving big data sets and are cool under pressure. You’re no stranger to fast-paced environments and agile development methodologies – in fact, you embrace them. With your strong analytical skills, your unwavering commitment to quality, your excellent technical skills, and your collaborative work ethic, you’ll do great things here at BigData.

    What you’ll do:

    As a Senior Big Data Engineer, you’ll be responsible for designing and building high performance, scalable data solutions that meet streaming and batch platforms. You’ll design, develop, and test robust, scalable data platform components.

    Skills, accomplishments, interests you should have:

    BS in Computer Science, Engineering, or related technical discipline or equivalent combination of training and experience

    3+ years core Java experience: building business logic layers and back-end systems for high-volume pipelines

    Experience with Spark Streaming and Scala

    Current experience in Spark, Hadoop, MapReduce and HDFS, Cassandra / HBase, Scala

    Understanding of data flows, data architecture, ETL and processing of structured and unstructured data

    Current experience using Java development, SQL Database systems, and Apache products

    Experience with high-speed messaging frameworks and streaming (Kafka, Akka, Reactive); see the sketch after this list

    Current experience developing and deploying applications to a public cloud (AWS, GCE, Azure)

    Experience with DevOps tools (GitHub, TravisCI, Jira) and methodologies (Lean, Agile, Scrum, Test Driven Development)

    Experience with data science and machine/deep learning a plus

    Ability to work quickly with an eye towards writing clean code that is efficient and reusable

    Ability to build prototypes for new features that will delight our users and are consistent with business goals

    Ability to iterate quickly in an agile development process

    Ability to learn new technologies and evaluate multiple technologies to solve a problem

    Good written English

    Strong work ethic and entrepreneurial spirit
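
    Not part of the posting: a minimal sketch of the Spark Structured Streaming plus Kafka combination mentioned above, written with PySpark for brevity (the role itself leans Scala/Java). The broker, topic, and S3 paths are hypothetical, and it assumes the spark-sql-kafka connector is available to Spark.

    # Hypothetical sketch: read a Kafka topic with Spark Structured Streaming and
    # land the raw events as Parquet. Broker, topic, and paths are made up.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("kafka-ingest-example").getOrCreate()

    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker-1:9092")
             .option("subscribe", "clickstream")
             .option("startingOffsets", "latest")
             .load()
    )

    # Kafka rows carry key/value as binary; cast value to string for downstream parsing
    decoded = events.select(
        F.col("timestamp"),
        F.col("value").cast("string").alias("payload"),
    )

    query = (
        decoded.writeStream
               .format("parquet")
               .option("path", "s3a://example-bucket/raw/clickstream/")
               .option("checkpointLocation", "s3a://example-bucket/checkpoints/clickstream/")
               .trigger(processingTime="1 minute")
               .start()
    )
    query.awaitTermination()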

    Nice to haves:

    Experience mentoring or acting in a lead capacity

    Job Types: Full-time, Contract

    Pay: $40.00 - $59.00 per hour

    Schedule:

    • Monday to Friday

    Education:

    • Bachelor's (Preferred)

    Experience:

    • Data Warehouse: 2 years (Preferred)
    • Java: 3 years (Preferred)
    • Python: 3 years (Preferred)
    • Spark: 2 years (Required)

    Work Location:

    • Fully Remote

    Company's website:

    • big-datai.com

    COVID-19 Precaution(s):

    • Remote interview process
    • Virtual meetings

    Remote Full-Stack Developer (C#, Python, SQL)

    18 days ago
    $55 - $60/hour | Remote | Sparkfish
    Job details
    Salary: $55 - $60 an hour
    Job Type: Full-time, Contract
    Number of hires for this role: 1
    Qualifications
      • Microsoft SQL Server: 4 years (Required)
      • C#: 4 years (Required)
      • Bachelor's (Preferred)

    Full Job Description

    Full-Stack Software Developer - C#, Python, SQL

    Sparkfish is seeking a hands-on, dedicated Full-Stack Software Developer - C#, Python, SQL - who is driven to push their technical skills and knowledge; someone who is not looking for a typical desk job. The closer you are to being a full-stack developer, the better. We also need you to have cloud platform experience, preferably with Azure. We don't expect that you are the master of all things Azure or C#; however, we need you to have the desire and confidence to figure it out, mostly by your lonesome.

    We want someone who enjoys being involved in multiple layers of the application stack. If you had to choose, you'd rather be coding. But ... your experience has led you to realize that SQL Server performance and maintenance is key. So, you've mastered what you need to know to keep the SQL side of things humming along.

    We work with different clients who serve a variety of industries, so one month you might be helping to migrate a company’s systems to the cloud, the next month you might be developing a mobile app, and still the next month you may be developing a web application. There is a ton of opportunity for a variety of innovative projects!

    We're a small company, but we've landed gigs serving some big brands. In general, our company is on two tracks. We run a consulting company that helps other companies solve their problems, whatever those might be. And, we are building a war chest to self-fund a series of startups. If you like a startup environment, then this might be a good fit.

    Please send your resume and ask questions. We will provide feedback within a few days.
    Benefits

    • Work from anywhere (work hours should roughly align with rest of US-based team)
    • Paid vacation (after 6 months)
    • Laptop allowance (after 6 months)
    • Work with other good people

    Skills Needed

    • Strong C# proficiency and moderate SQL skills
    • Moderate Python programming skills
    • Experience deploying code to at least one cloud provider, preferably with Azure
    • Experience working with and developing REST API and Web API architecture
    • Some level of “full-stack” experience including JS, CSS, HTML (with the emphasis on JS)
    • Source control, git, GitHub
    • Exposure to unit testing and test-driven development (see the sketch after this list)
    • Knowledge of client/server application development and Agile and Kanban methodologies
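
    Not part of the posting: a small sketch of the unit-testing item above, shown in Python (which the skills list also calls for) using the standard library's unittest. The function under test is made up.

    # Hypothetical illustration of unit testing / TDD; the function under test is made up.
    import unittest


    def apply_discount(price: float, percent: float) -> float:
        """Return the price after a percentage discount."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)


    class ApplyDiscountTests(unittest.TestCase):
        def test_typical_discount(self):
            self.assertEqual(apply_discount(200.0, 25), 150.0)

        def test_zero_discount_returns_original_price(self):
            self.assertEqual(apply_discount(99.99, 0), 99.99)

        def test_invalid_percent_raises(self):
            with self.assertRaises(ValueError):
                apply_discount(100.0, 150)


    if __name__ == "__main__":
        unittest.main()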

    Keywords: C#, node.js, azure, git, github, visual studio, rest api, web api, mongodb, sql, sql toolbelt, trello, slack, dapper

    Requirements

    Must be a good person. Humble, yet confident in your abilities, and able to work through both merge conflicts and people conflicts.

    Sure, it'd be great if you have all the enterprise-grade training in every possible technology already. But, all we really need is a smart person with just enough of an experience match, as long as you also have the grit and ability to google until you figure out whatever problem is in front of you.

    Interview Questions

    Answer as many of these as you like and post them in a gist to share your responses with us to review … this really saves everyone a LOT of time:

    • What are examples of the types of projects you have deployed to Azure?
    • How do you classify your senior-ness as a developer? What are your development strengths?
    • What are your top 1-2 favorite editor(s)?
    • Have you ever done a pull request?
    • Have you ever had a chance to use TDD? Have you found it to be helpful? If so, when does it work best for you?
    • Have you been given the opportunity to use tools like ReSharper? What did you like about it?
    • Do you prefer working directly with the business people? Or do you find it's best to have a go-between run interference?
    • Have you ever used web.config transformations? If not, do you know what they are good for?
    • Can you tell me anything about 'slots' in terms of Azure? If not, based on what you just googled, how would you use them in your next Azure project?
    • Have you worked with any of the cloud providers (Heroku, AWS, Azure, Google)?
    • Can you explain what DevOps and/or continuous integration is, and some of the benefits?
    • Are you technology agnostic? Or are you pretty strict when it comes to things like Mac vs Windows, or AWS vs Azure?
    • What are some of your more advanced SQL querying abilities (i.e., like what keywords or commands, etc)?
    • Do you happen to know what TypeScript is and why it’s gotten popular? How does it differ from JS?
    • How comfortable are you in working a project that involves touching some CSS and HTML?
    • Can you explain the diff b/t relational databases and NoSQL? What are some NoSQL platforms that you have familiarity with?
    • Can you explain GPG, SSL or public/private key encryption?
    • Do you know anything about "secure coding" practices?
    • Can _you_ authenticate to GitHub using SSH?
    • Can you explain what REST sorta kinda means?
    • How enthusiastic would you be if we asked to pay you to take some training courses on Udemy?
    • How many of these platforms / languages / frameworks do you have experience with?

    Node Typescript Angular / React / Vue Spark Hadoop SQL Excel Python Pandas R Linux Mac Windows Jasmine Docker Kubernetes .NET C# NUnit Go PostgreSQL MySQL SQL Server bash shell scripts PowerShell MongoDB Azure AWS Google Cloud Platform

    Job Types: Full-time, Contract

    Pay: $55.00 - $60.00 per hour

    Schedule:

    • Monday to Friday

    Education:

    • Bachelor's (Preferred)

    Experience:

    • Microsoft SQL Server: 4 years (Required)
    • C#: 4 years (Required)

    Contract Length:

    • 3 - 4 months

    Full Time Opportunity:

    • Yes

    Work Location:

    • Fully Remote

    Company's website:

    • www.sparkfish.com

    Benefit Conditions:

    • Only full-time employees eligible

    COVID-19 Precaution(s):

    • Remote interview process
    • Virtual meetings

    Data Engineer (AWS & Big Data)

    3 days ago
    $500 - $700/day | Remote | VANRATH

    VANRATH is recruiting for a global fintech giant in search of an experienced Data Engineer (AWS & Big Data) for a 12-month contract. The Data Engineer must be able to solve problems creatively, communicate effectively, and possess the ability to lead others to achieve the critical mission of the team.


    Experience with AWS Glue is essential.


    RESPONSIBILITIES

    • Applies expert knowledge of cloud technologies, the Java language, DBMS and middleware technologies in independently designing and developing key services.
    • Hands-on with detailed design and architecture plans for complex, large-scale efforts within a multi-cloud environment.
    • Assists with system design, working with the various teams to build fit-for-purpose platforms.
    • Utilises the expertise of the team to develop architecture through consensus and a team approach.
    • Works with the enterprise architecture team to gain an understanding of the evolving enterprise and to make efficient decisions on application architecture and priorities.
    • Participates in code reviews, proactively identifying and mitigating potential issues and defects.


    IDEAL PERSON

    • Bachelor’s degree (with honours) or equivalent/better strongly preferred, but substantial relevant experience could substitute
    • Experience in AWS pipeline and big data services
    • Experience in AWS Glue (see the sketch after the Desirables list below)
    • Experience in Python, Java, and Linux
    • Experience architecting enterprise software applications
    • Experience in developing and automating solutions directly related to Continuous Integration/ Continuous Delivery and infrastructure automation


    DESIRABLES

    • AWS Data Analytics or AWS Certified Big Data Specialty qualification
    • Experience coding in a story-driven, agile environment
    • Experience working in the Big Data space handling both real-time and batch
    • Experience in the Hadoop ecosystem using EMR, MapReduce and/or Spark
    • Prior experience working in financial services/exchange space
    • Prior experience working with BDD methodologies and automated acceptance criteria
    • Prior experience using Confluence, JIRA, or other Atlassian tools
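
    Not part of the posting: a minimal AWS Glue (PySpark) job skeleton of the sort the Glue requirement above points to, following the standard Glue Python job shape. The database, table, and bucket names are hypothetical.

    # Hypothetical AWS Glue (PySpark) job skeleton: read a catalog table, clean it,
    # and write Parquet to S3. Database, table, and bucket names are made up.
    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    glue_context = GlueContext(SparkContext())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a table registered in the Glue Data Catalog
    trades = glue_context.create_dynamic_frame.from_catalog(
        database="raw_db", table_name="trades"
    )

    # Drop obviously bad rows with plain Spark, then land the result as Parquet
    clean = trades.toDF().filter("trade_id IS NOT NULL")
    clean.write.mode("overwrite").parquet("s3://example-bucket/curated/trades/")

    job.commit()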


    REMUNERATION

    • £500-£700 a day
    • FULLY REMOTE (NI & GB Only)
    • 12m Secure Contract
    • First-time contractors welcome


    About Me:

    I have been recruiting in the Belfast IT market for the past 5 years. I have helped first-time contractors become career contractors, whilst ensuring the best, most secure, and innovative contracts are also available to those experienced contractors looking for their next challenge.

    At VANRATH we partner with you, providing up-to-date information on credible companies and roles that best match your skill-set or aspirations.


    For further information on this vacancy, or any other Contract IT job in Belfast or wider Northern Ireland, please apply via the link below or contact Orla Fitzsimons in the strictest confidence.

    Cloud Engineer/AWS/Azure

    11 days ago
    Remote | Ace Technologies

    Azure Cloud Platform Data Engineer (install BI tools: PowerBI, Tableau, MicroStrategy)

    Remote, full time

    Open to converting to an employee of the client (conversion not required)

    18-month initial contract with long-term extension (multi-year)

    Candidates will need to submit their desired salary for conversion

    Required:

  • Install / migrate / maintain BI tools (PowerBI, Tableau, MicroStrategy) in Azure
  • Azure, Terraform, Ansible, Bash, Python, Splunk
  • Only submit if the candidate uses Azure in their current job
  • More details below.

    This team currently supports PowerBI, Tableau, and MicroStrategy for on-prem solutions.

    They are in the process of selecting one or more cloud reporting vendors to use in the cloud.

    The initial task is to set up new BI solutions in Azure and migrate the on-prem solutions to the cloud.

    Platform Engineer: requires experience installing BI (business intelligence) and analytics tools in the cloud.

    After the install, the team will perform maintenance and set up alerting, autoscaling, notifications, and more.

    Team: there are currently 4 people on the team; the most senior team members will partner with contractors.

    These tools are used by their data science team for reporting.

    Scripting to automate the install and other jobs (Terraform, Ansible).

    Fully remote is agreeable; salary requirements must be sent with the submittal.

    Full Job Details.

    ---------------------------------------------------------------------------------------------------------

    Opportunity Overview

    As part of the Data Management area, the Platform Engineer will be responsible for the analytic and reporting platforms. As a Platform Engineer you will be helping to configure application and platform architecture framework that will act as the foundational system layers for Edward Jones reporting and analytics. The position requires excellent communication skills and ability to articulate system designs and patterns to varying levels of leadership. This role will also bring you into contact with a diverse array of stakeholders ranging from IT to our analytics stakeholders across the firm.

    Responsibilities

    - Install and maintain optimal on-prem and cloud-based reporting and analytical systems…

    - Scripting skills such as Bash & Python, as well as experience with Terraform, Ansible

    - Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

    - Package and deploy software on Windows and Linux, both on-prem and in the Azure cloud.

    - Leverage workflow tools to maintain accurate status of assigned tasks (JIRA, etc.)

    - Configure and Integrate platforms with security protocols in support of SSO through Azure Active Directory via Kerberos/SAML.

    - Integration of reporting tools to various databases including SnowFlake, SQL Server, Oracle and Hadoop with an additive focus on user credential/database passthrough.

    - Monitor systems for performance using tools such as Wily, Grafana, and WireShark

    - Debug complex internal and external infrastructure problems or limitations

    - Experienced in IS processes and methodologies

    - Work with stakeholders including the business analytics teams and IS architecture teams to assist with systems-related technical issues and support their infrastructure needs.

    - Onboards new associates, and mentors associates on established processes and procedures.

    Education/Experience

    • 3+ years of experience in a technical role installing and supporting reporting platforms.

    • Experience packaging, installing and configuring reporting solutions on-prem as well as in the cloud on Azure

    • Bachelor's degree in Computer Science, Information Systems or another applicable field is preferred

    • Advanced working knowledge of cloud-based infrastructure preferred

    • Experience working with ETL tools and understanding of their needs

    Interview preference: live digital interview

    DevOps Engineer

    10 days ago
    Remote | Bluehawk Consulting
    Job details
    Salary: $94,999 - $95,000 a year
    Job Type: Full-time, Contract
    Number of hires for this role: 1
    Qualifications
      • DevOps: 1 year (Required)
      • SQL: 1 year (Required)
      • Bachelor's (Preferred)
      • C# or Java: 2 years (Preferred)
      • Azure or AWS: 1 year (Preferred)

    Full Job Description

    DevOps Engineer

    Humacao, Puerto Rico

    Bluehawk Consulting has a managed service team of consultants in the cyber security space who have been working in Humacao, PR for a global software development company for more than five years. Our team has steadily grown to 20+ team members today.

    We are adding to the team again, now we’re seeking a DevOps Engineer as we continue to evolve and enhance the practice. The team is continually creating advanced analytics tools to combat software piracy for which they’re utilizing business intelligence practices and big data analytics tools.

    In this role, you’ll employ a combination of operations and development experience to build and support tools that help us drop and integrate code rapidly in the cloud using CI/CD pipelines and containerization best practices.

    The experience we're seeking for this role is any cloud-based engineering, especially with the Azure Cloud platform; however, we'd be very curious about your AWS or GCP ecosystem background. We also think any development experience using C# or Java lends itself well to this role, since we're thinking about Infrastructure as Code.

    While we're flexible about your platform-specific experience, we do require that you bring a passion to advance your technical skills, an aptitude for continual learning, and a desire to dive deep into cyber security.

    Required Skills & Experience

    • 5-8 years of experience in a combination of enterprise level operations and/or development roles
    • 1+ years of any DevOps experience highly desired, but not required
    • Experience supporting enterprise cloud technologies, using Azure or AWS ecosystem is preferred, however we are curious about any experience supporting cloud you have
    • Experience with Scaled Agile/Scrum methodologies
    • Troubleshoot and resolve code problems;
    • Escalate issues as required to ensure a timely resolution

    Technology Experience

    • Any experience building and supporting CI/CD pipelines
    • Any experience with container and orchestration technologies: Docker or Kubernetes
    • Some experience coding with C# or Java and T-SQL is required
    • Backend experience will be focused on SQL Server (SSIS, SSRS, SSAS)
    • Experience on public cloud platform: Azure or AWS or GCP
    • Experience using one, or more, shell environments (BASH, sh, tcsh, ksh, etc.)
    • Experience with Linux script-based development environments and tools (Perl, Python, Ruby, PHP, Jenkins, etc.)
    • Understanding of virtualization concepts and virtual system administration
    • Understanding of container specific networking such as CNI
    • Experience testing enterprise scale software products
    • Any experience working with test automation frameworks such as FitNesse, Selenium or VSTS is a plus

    Preferred Experience

    • Any experience with any of the following is highly desired:
    • Development – Visual Studio, Azure Toolkit, 3rd party IDEs such as Eclipse or IntelliJ
    • Build and Test – Azure DevOps / Pipeline, Visual Studio App Center, Azure Test Labs or Jenkins
    • Deployment – Azure Resource Manager, Azure Automations, Terraform or Cloud Formation
    • Monitoring and Operations – Azure Monitor, Azure App Insights, Azure Analytics or CloudWatch
    • Big Data – Azure Data Lakes, Data Factory, Az SQL, Hadoop, Apache Hive or HBase

    Our Culture

    Bluehawk Consulting is a Kirkland, Washington-based management consulting firm helping Fortune 500 clients improve their performance, increase their efficiency and enhance organizational value while reducing costs.

    We are a team who love to accomplish great things together!

    We excel at hiring talented people who are driven to create solutions for our clients' most challenging projects. We employ experienced management consultants with a passion to deliver outcomes, yet our approach includes hiring for attitude while training for skill.

    We Value:

    • Our diverse team, they make us great!
    • Consultative, client-focused and interactive approaches
    • Innovation, actualizing potential, and “can-do-ness”
    • Learning, listening and communicating openly with respect
    • Passion, energy, zeal - we have lots of it
    • Humor, lightness, flexibility, humor and humor

    We work hard to create a supportive environment, where employees enjoy a very competitive pay and benefits package, including an optional, full-benefits package that includes medical, dental, vision, 401K, holidays and paid-time-off (PTO). Bluehawk also provides training opportunities and rewards leadership, diligence and the desire to achieve more. Bluehawk Consulting is an equal opportunity employer, committed to workforce diversity.

    If this sounds like a place you'd like to learn more about, visit us at www.bluehawkconsulting.com

    Job Types: Full-time, Contract

    Pay: $94,999.00 - $95,000.00 per year

    Schedule:

    • 8 hour shift
    • Day shift
    • Monday to Friday

    COVID-19 considerations:
    Everything will be fully remote until it is safe to go back into the office.

    Ability to Commute/Relocate:

    • Humacao, PR (Required)

    Education:

    • Bachelor's (Preferred)

    Experience:

    • DevOps: 1 year (Required)
    • C# or Java: 2 years (Preferred)
    • Development based role: 3 years (Preferred)
    • Azure or AWS: 1 year (Preferred)
    • SQL: 1 year (Required)

    Contract Length:

    • More than 1 year

    Full Time Opportunity:

    • No

    Work Location:

    • One location

    Visa Sponsorship Potentially Available:

    • No: Not providing sponsorship for this job

    Company's website:

    • www.bluehawkconsulting.com

    Work Remotely:

    • Temporarily due to COVID-19

    COVID-19 Precaution(s):

    • Remote interview process
    • Virtual meetings

    Developer - C# and SQL Server

    1 month ago
    $55 - $65/hour | Remote | Sparkfish
    Job details
    Salary: $55 - $65 an hour
    Job Type: Full-time, Contract
    Number of hires for this role: 1
    Qualifications
      • Bachelor's (Preferred)

    Full Job Description

    We are looking for someone who enjoys being involved in multiple layers of the application stack. If you had to choose, you’d rather be coding. But ... your experience has led you to realize that SQL Server performance and maintenance is key. So, you’ve mastered what you need to know to keep the SQL side of things humming along.

    We’re seeking a hands-on, dedicated C# and SQL developer who is driven to push their technical skills and knowledge; someone who is not looking for a typical desk job. The closer you are to being a full-stack developer, the better. We also need you to have cloud platform experience, preferably with Azure. We don't expect that you are the master of all things Azure, SQL or C#; however, we need you to have the desire and confidence to figure it out, mostly by your lonesome.

    We work with different clients who serve a variety of industries, so one month you might be helping to migrate a company’s systems to the cloud, the next month you might be developing a data warehouse, and still the next month you may be developing a web application. There is a ton of opportunity for a variety of innovative projects!

    We're a small company, but we've landed gigs serving some big brands. In general, our company is on two tracks. We run a consulting company that helps other companies solve their problems, whatever those might be. And, we are building a war chest to self-fund a series of startups. If you like a startup environment, then this might be a good fit.

    Please send your resume and ask questions. Feedback will be provided within a day or two.
    Benefits

    • Work from anywhere in the US, with additional perks for those in Dallas
    • Flexible paid vacation, nice health benefit packages
    • High-powered development laptops, large external monitors, standup desks
    • Work with other really good people

    Skills Needed

    • Strong C# proficiency and moderate-to-advanced SQL skills
    • Experience managing SQL Servers (backups, restores, performance tuning, SQL agent jobs)
    • Experience with an exercise from an online class on MongoDB that you didn’t quite complete
    • Experience deploying code to at least one cloud provider, preferably with Azure
    • Experience working with and developing a REST API architecture
    • Some level of “full-stack” experience including JS, CSS, HTML (with the emphasis on JS)
    • Source control, git, GitHub, Azure DevOps
    • Exposure to unit testing, test-driven development
    • Knowledge of development using Agile and Kanban methodologies
    • Ability to read peoples’ emotions (aka Emotional IQ, if you want to get fancy)

    Keywords: dotnet core, node.js, azure, git, github, visual studio, rest api, web api, mongodb, sql, sql toolbelt, trello, slack, dapper
    Requirements
    Must be a good person. Humble, yet confident in your abilities, and able to work through both merge conflicts and people conflicts.

    Interview Questions
    Answer as many of these as you like and post them in a gist on Github to share your responses with me to review … this really saves everyone a LOT of time:

    • What are examples of the types of projects you have deployed to Azure?
    • How do you classify your senior-ness as a developer? What are your development strengths?
    • Have you ever had a chance to use TDD? Have you found it to be helpful? If so, when does it work best for you?
    • Have you been given the opportunity to use tools like ReSharper? What did you like about it?
    • What about Red Gate's SQL Toolbelt or similar? If you’ve been given the opportunity to use it, did you find it valuable?
    • Can you explain what DevOps and/or continuous integration is, and some of the benefits?
    • Are you technology agnostic? Or are you pretty strict when it comes to things like Mac vs Windows, or AWS vs Azure?
    • What sort of administration tasks have you performed on SQL Server? What versions were you managing?
    • What are some of the differences you have found between Azure SQL and a traditional installation of SQL Server?
    • After you’ve added the obvious indexes to a table, what tools do you think of using to try to sort out why a query is not running as fast as you think it should?
    • [ADVANCED] If you’ve had a chance to be exposed to CTEs, how would you describe situations where they can be handy?
    • [ADVANCED] SQL "window functions" are pretty much on the advanced side of SQL writing, so don't freak out if you've never heard of or used them before. Can you attempt to explain a scenario where window functions can be used to solve a problem? (A small sketch follows this list.)
    • How much duress would be needed to pressure you to write a PowerShell script? (I ask because I personally refuse to write in PS, but sometimes it apparently is needed when managing a SQL Server.)
    • What’s the big idea with NoSQL? When is it a good idea to use that versus a traditional RDBMS?
    • You may have heard of ACID (or perhaps just “transactions”). But can you explain what BASE sorta kinda means? It’s a nerdy topic with its own theorem, so feel free to look this up and explain it in your own words.
    • Have you ever had a chance to use SQLite? What are some of the advantages over SQL Server? Or, maybe when is this a bad idea?
    • Python has gained a lot of ground due to the surge in data science. But, we are also finding it useful for easier-to-manage admin scripts and ETL processes compared to the horrid affair that is SSIS. Are you now or will you ever have interest in becoming a pythonista?
    • What problems does MongoDB solve as compared to SQL Server? If you are building out an API back-end, which of these 2 approaches offers a simpler coding path to interact with the DB?
    • Pick from one of these NP-hard questions and give your best-guess response: (A) How do you tell a customer they are wrong without offending them? Or (B) You just went 100 hours over budget ... How do you recover the relationship with your customer (and your boss)?
    • Do you have any interest in being exposed to data science projects? Statistically speaking, 95% of respondents say yes to this question, so if you are going to say yes, maybe tell me what interests you in this topic.
    • How comfortable are you in working on a project that involves touching some CSS and HTML?
    • Can you explain GPG, SSL or public/private key encryption?
    • Have you had the occasion to undergo any training on "secure coding" practices?
    • How enthusiastic would you be if we asked to pay you to take some training courses on Udemy?
    • Which of these platforms / languages / concepts / frameworks do you have experience with (so far)?

    Node Typescript Angular React Vue Spark Hadoop SQL Excel Python Pandas R Linux Mac Windows Jasmine Docker Kubernetes .NET C# NUnit Go PostgreSQL MySQL SQL Server SQLite bash PowerShell MongoDB Azure AWS Google Cloud Platform SQLCLR SQL Cursors CTEs SQL Window Functions Table Valued Parameters Ola Hallengren SSIS SSRS Power BI ETL Encryption AutoMapper SQLite
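
    Not part of the posting: a small Python sketch, using the standard library's sqlite3 (window functions need SQLite 3.25+), of the kind of window-function query the question above is getting at. The table and data are made up.

    # Hypothetical illustration of the window-function question: a per-customer
    # running total, which would otherwise need a self-join or a cursor.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
        INSERT INTO orders VALUES
            ('acme',   '2021-01-05', 120.0),
            ('acme',   '2021-01-20',  80.0),
            ('zenith', '2021-01-07', 200.0),
            ('zenith', '2021-02-02',  50.0);
    """)

    rows = conn.execute("""
        SELECT customer,
               order_date,
               amount,
               SUM(amount) OVER (
                   PARTITION BY customer
                   ORDER BY order_date
               ) AS running_total
        FROM orders
        ORDER BY customer, order_date
    """).fetchall()

    for row in rows:
        print(row)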

    Job Types: Full-time, Contract

    Pay: $55.00 - $65.00 per hour

    Schedule:

    • Monday to Friday

    Education:

    • Bachelor's (Preferred)

    Contract Length:

    • 3 - 4 months

    Full Time Opportunity:

    • Yes

    Work Location:

    • Fully Remote

    This Company Describes Its Culture as:

    • Detail-oriented -- quality and precision-focused
    • Innovative -- innovative and risk-taking
    • Outcome-oriented -- results-focused with strong performance culture

    Company's website:

    • www.sparkfish.com

    Work Remotely:

    • Yes

    COVID-19 Precaution(s):

    • Remote interview process
    • Virtual meetings

    Lead Microservices developer

    9 days ago
    Remote | RIT Solutions, Inc.

    Job Title: Lead Microservices Software Engineer (Node, AWS, Lambdas)
    Location: Fully Remote (work from anywhere for the full project)
    Job Type: Contract
    Visa: GC/USC/TN
    Duration: Long-term (on-going for years)


    Our client is looking for a Lead Microservices Software Engineer (Tech Lead) to help build, customize and support the media asset management applications used by a number of different business units within the company. These systems are used to orchestrate all media distribution workflows, domestically and internationally, and form the core automation framework for a number of different business units.

    Responsibilities: Duties include, but are not limited to:
    • Work closely with product management and the lead architect to translate business requirements into scalable and highly available tools
    • Drive the adoption of new technologies including containers, clustering, cloud computing, serverless and API
    • Integrate various software platforms and external 3rd party systems through vendor APIs.
    • Work with vendor(s) on API implementation and troubleshooting, suggesting necessary features and identifying enhancement opportunities.
    • Implement Cloud and Microservice best practices while adhering to standard architecture patterns
    • Develop microservice applications to support the media asset management systems and implement them using appropriate technologies & frameworks
    • Follow Agile best practices and tools adopted by the team
    • Configure Dev, QA and Production environments with proper packages and dependencies to enable development, working closely with DevOps and QA teams
    • Participate when needed in 24x7 application support schedule; some overnight/off hours shift-schedule support will be required during on-air rollouts, emergencies, and special broadcast events.

    Required Skills & Experience:
    • Solid experience working within a Microservices setting including development experience
    • Must have experience with Amazon Web Services (AWS) including Lambda
    • Languages: Node.JS
    • Solid experience working with Unix/Linux environments for development, including package management and basic system administration.
    • Experience developing software against documented third-party APIs and working with vendors to identify and correct issues and drive enhancements.
    • Experience working with IT systems, with a solid understanding of network protocols and standards (e.g. DNS, TCP, HTTP, FTP, SSH).
    • Plusses/Nice to haves (not required): Java, Docker, GraphQL, Object Storage

    Below is a list of technologies the client uses, but the skills listed above are the key ones for this particular role:
    • Back End Skills: Microservices, REST, GraphQL, Node.js, JVM (Clojure, Scala, Java - Spring, Spring Boot, Hibernate, etc.), Database (Postgres, MySQL, etc.), TensorFlow, PyTorch, scikit-learn, NumPy, Golang
    • Data Streaming: Kafka, NiFi, Storm, Hadoop, Spring XD/Spring Batch
    • DevOps/Platforms: Amazon Web Services (S3, RDS, EC2, ECS, Lambda, SQS, SNS, DynamoDB, CloudFormation, etc), Terraform, Chef, Docker, Ansible.
    • Automation Testing: Selenium, Karma, Mocha, Jest, Cucumber, ATDD, Protractor, Automated Performance and Regression/Functional Testing
    • CI/CD: Jenkins, Artifactory, Nexus
    • Build Tools: maven, gradle, grunt, gulp, git, svn, npm, yarn, lein, boot, Xcode
    • Agile and Scrum methodologies for software development and project execution.
    • Familiarity with project tracking and collaboration tools such as JIRA and Confluence (Wikis)
    • Previous experience working with Cloud Native toolkits.