Big Data Developer/Modeller
A great opportunity for an experienced Big Data Developer/Modeller to join an established financial services organisation in Edinburgh, with the option to work regularly from home. This is a rolling contract, and the client is looking for someone to start ASAP.
Requirements:
In-depth understanding of big data processing technologies: Hadoop, Spark, Hive, MapReduce, Impala, Kafka, etc.
Programming experience: Java, Python, SQL, etc.
Provide project guidance/consultancy on limitations, capabilities, performance, etc.
Undertake reverse engineering of physical data models
Analyse data-related system integration issues and propose appropriate solutions
Undertake query performance analysis, providing guidance and feeding results back into data model changes as necessary
Financial services experience, including work with an aligned standard logical model
Develop best practices in physical modelling
Analyse logical data models and create appropriate physical data models
Structure data on HDFS for optimal performance and efficiency
Select optimal data structures for Hive/Impala processing, including normalising/denormalising approaches
Understand storage options on Hadoop, including compression and file formats
Develop partitioning, bucketing and indexing strategies
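To illustrate the kind of physical-modelling decisions the last few points describe, the sketch below shows a Hive DDL fragment combining partitioning, bucketing, and a columnar file format. The table name, columns, partition key, and bucket count are hypothetical, chosen only for illustration; real choices would follow from the logical model and query workload.

```sql
-- Hypothetical daily-partitioned transactions table:
--  * PARTITIONED BY txn_date enables partition pruning on date predicates
--  * CLUSTERED BY account_id supports efficient joins and sampling
--  * Parquet with Snappy compression balances scan speed and storage
CREATE TABLE transactions (
  txn_id     BIGINT,
  account_id BIGINT,
  amount     DECIMAL(18,2)
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (account_id) INTO 32 BUCKETS
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression' = 'SNAPPY');
```

Whether to bucket at all, and at what granularity to partition, are exactly the trade-offs (file counts, small-file pressure on HDFS, join patterns) this role would be expected to weigh.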