This job has expired.
Scadea

AWS Cloud Automation Engineer (Junior) (Remote)

Job title: AWS Cloud Automation Engineer (Junior)
Type: Contract / temporary
On-site/Remote: Remote (100% remote; the role remains remote even after COVID-19 restrictions end)
Location: Pittsburgh, Pennsylvania
# of vacancies: 5
Job industry: IT, Computers and High Tech
Start date: 01/25/2021
Job duration: 24 months (expected)
Work permits accepted: US citizen, Green card, Green card-EAD, TN, H-1B, E-3, H4-EAD, L2-EAD, OPT-EAD, CPT-EAD, TPS-EAD, STEM-EAD
W-2 employment at vendor: required
 
 
Requirements
Minimum education: Bachelor
Years of work experience: 4 year(s)
 
Responsibilities
Pay Rate:
Long-term, indefinite contract; 5 open positions in total.
Job Title
AWS Cloud Automation w/ Big Data
 
Description
  1. AWS design, architecture, and engineering experience in data services (EKS, ECS, and other big-data services); should have experience building out these environments from scratch and doing the hands-on scripting
  2. Kubernetes: experience building out Kubernetes environments in AWS from scratch
  3. Ansible: experience building complex playbooks, ideally for big-data tools
  4. Experience writing Terraform scripts is highly desired
  5. Experience with Kafka, Cloudera, or Hadoop is a big plus
  6. ELK is a big plus but not required: experience setting up the network and security aspects, logs, etc.
  7. Experience working in a highly complex and regulated enterprise environment
 
Job Requirements:
• Architecture and engineering experience with Amazon Web Services (AWS) capabilities such as EKS and ECS (AWS Certifications a Plus)
• Working experience with one or more of the following technologies: Kubernetes, Docker, Terraform etc.
• Experience building data pipelines using languages such as Python, Java, Scala and tools like Logstash, Apache Beam, Kafka, Kafka Streams, KSQL, Apache Airflow
• Scripting: Ansible, Perl, PowerShell, Python, Shell
• Experience automating and orchestrating the provisioning of application and infrastructure services and capabilities.
• Capable of writing code to integrate and automate components through the use of APIs.
• Strong working knowledge of quality assurance methodologies, testing techniques and approaches.
• Design, develop, modify, test and evaluate various Kubernetes and Docker implementations that support system requirements of varying levels for technical and business application projects
• Good knowledge of the ELK stack (Elasticsearch, Logstash, Kibana)
• Experience with containers and container orchestration platforms such as OpenShift and Kubernetes
• Experience working with data and a background in automating data ingestion, acquisition, and transformation, and making data available in the cloud or public cloud
• Experience with professional data engineering, building and using data infrastructure, APIs, and integrations
• Experience developing data pipelines and ETL jobs using languages such as Python, Java, or Scala
• Experience with ELK stack and data streaming using Kafka
• Strong knowledge of data structures, schemas and algorithms
• Healthcare experience is preferred but not required
 
Enterprise Req Skills
AWS, EKS, ECS, Terraform, Ansible, Kubernetes, big data
Remote conditions
100% remote; the role remains remote even after COVID-19 restrictions end
 
General skills
Must have
AWS 4 year(s) of experience
Kubernetes (EKS) 1 year(s) of experience
Containers (ECS) 1 year(s) of experience
 
Additional job information
This is a junior/mid-level position with a client of ours in Pittsburgh.
100% Remote work
 
