PruTech Solutions

Cloud Spark Developer (Fully Remote)

Remote

Job Description

You will be responsible for the design and development of highly available, scalable, distributed data platforms that use distributed processing frameworks to handle high-volume, high-velocity structured and unstructured data. This is a client-facing role: the candidate is expected to own solution design, solve key technical challenges, and guide the customer with technical recommendations.
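
To make the kind of work concrete, here is a minimal Scala sketch of the batch side of such a platform: reading a structured dataset and raw text logs with Spark, aggregating each, and writing a curated output. All paths, column names, and the aggregation logic are hypothetical placeholders, not part of this role's actual codebase.

    // Sketch only: minimal Spark batch job over structured and unstructured data.
    // Paths, schema, and business logic are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object PlatformSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("data-platform-sketch")
          .getOrCreate()

        // Structured input: a partitioned Parquet dataset of orders (hypothetical).
        val orders = spark.read.parquet("s3://example-bucket/orders/")
        val dailyRevenue = orders
          .groupBy(col("order_date"))
          .agg(sum(col("amount")).as("revenue"))
        dailyRevenue.write.mode("overwrite")
          .parquet("s3://example-bucket/curated/daily_revenue/")

        // Unstructured input: raw application logs as plain text (hypothetical).
        val logs = spark.read.textFile("s3://example-bucket/logs/")
        val errorLines = logs.filter(_.contains("ERROR")).count()
        println(s"ERROR log lines: $errorLines")

        spark.stop()
      }
    }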

REQUIREMENTS:

  • Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required
  • 4+ years implementing big data solutions in the data analytics space, and a minimum of 2 years of experience developing applications in the cloud (AWS, Azure, Google Cloud)
  • AWS Preferred
  • Experience with software installation, provisioning, and administration
  • Extensive experience in object-oriented programming (Java, Scala, Python), data movement technologies (ETL/ELT), messaging technologies (ActiveMQ, Kafka), relational and NoSQL databases (Cassandra, Elasticsearch, graph databases), stream processing (Spark, Flink, Storm), Java Spring, web APIs, and in-memory technologies (a Kafka-to-Spark streaming sketch follows this list)
  • Strong knowledge of developing highly scalable distributed systems using Open source technologies
  • Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker)
  • Experience in Agile methodologies (Kanban and SCRUM)
  • Strong technical design and analysis skills
  • Ability to handle ambiguity and work in a fast-paced environment
  • Deep experience supporting mission-critical applications and responding quickly to issues
  • Superb communication skills, both written and verbal
  • Excellent collaboration skills to work with multiple teams in the organization
  • Ability to understand and adapt to changing business priorities and technology advancements
  • Strong knowledge of technology trends across the big data ecosystem and experience implementing them
  • Proven understanding of data architecture patterns such as Lambda, Kappa, event-driven architecture, Data as a Service, and microservices
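
As referenced in the messaging and stream-processing requirement above, the sketch below shows a minimal Spark Structured Streaming job in Scala that consumes a Kafka topic and emits windowed counts. The broker address, topic name, window logic, and checkpoint path are hypothetical placeholders, and the job assumes the spark-sql-kafka-0-10 connector is on the classpath.

    // Sketch only: Spark Structured Streaming consuming a Kafka topic.
    // Broker, topic, and checkpoint path are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object KafkaStreamSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-stream-sketch")
          .getOrCreate()
        import spark.implicits._

        // Read events from Kafka as a streaming DataFrame.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()

        // Kafka delivers key/value as binary; cast the value to a string payload.
        val payloads = events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

        // Windowed count as a stand-in for real business logic.
        val counts = payloads
          .withWatermark("timestamp", "10 minutes")
          .groupBy(window($"timestamp", "5 minutes"))
          .count()

        val query = counts.writeStream
          .outputMode("update")
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/kafka-stream-sketch")
          .start()

        query.awaitTermination()
      }
    }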

Must Have:

  • AWS
  • Spark Programming using Scala / Java
  • Docker
  • Kubernetes
  • Access management concepts such as RBAC and ABAC
  • Linux experience including Shell Scripting
