Big Data Engineer
Big Data, Hadoop, Java, Scala, Hbase, Kafka, Hive, ETL, Data Quality, Testing, Automation, Development, Design, Banking
An excellent opportunity has arisen for a Big Data Engineer to work for a Leading Banking client based in London.
As an Engineer on the Big Data platforms you'll be responsible for delivering quality software to tight timelines. You will be working in a dynamic global team with business analysts, project managers, business stakeholders and other technical experts across multiple locations in London and China.
Key skills/experience required:
- Understanding and knowledge of big data technologies with at least 2 years of experience working within a big data team
- Understanding of, and confidence testing with, all or most of the following:
- Full Hadoop stack (e.g. Hive, HDFS, Hbase, Spark, Storm, Kafka, Oozie, Sqoop)
- Configuration management tools, e.g. Git, Gerrit
- Test Automation using Java and JavaScript
- CI tools, e.g. Jenkins
- Ability to develop an automation framework around Continuous Integration using a Behaviour Driven Development (BDD) methodology.
- Experience of working in a feature team within an agile environment.
- Cross-skilled individuals who can occasionally assist the development team are desirable.
- Desirable keywords: automation framework creation, in-sprint automation, functional test automation, extensive Agile experience, progressive automation, Behaviour Driven Development (BDD), TDD, ATDD (Acceptance Test Driven Development), Continuous Integration, Continuous Delivery, CI/CD, JBehave, Java/Selenium, Cucumber, Groovy, Ruby, Gherkin
- Participation in all agile ceremonies, including code reviews, design, triage, estimation, and other testing and script development processes.
Please respond with an up-to-date CV for further information.
McGregor Boyall is an equal opportunities employer and does not discriminate on any grounds.