Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required.
10+ years of experience required.
Hands-on experience designing, developing, and successfully deploying large-scale projects end-to-end.
Hands-on experience following an iterative, agile SDLC.
In-depth familiarity with Deep Learning, Machine Learning, Natural Language Processing, and Advanced Statistical Analysis.
Experience with database technologies.
Knowledge of the ETL process.
Experience with data-driven decision making.
Experience with big data analytics environments and tools such as Hadoop, Spark, and Hive.
Hands-on experience with programming languages such as Python and SQL for shaping data, developing models, and deploying visualizations/dashboards.
Experience with Talend, graph databases, etc.
Must be business-centric, data-oriented, analytically minded, and results-driven.
Work with enterprise architects and the business to proactively identify opportunities to apply advanced analytics.
Identify the best statistical or modeling method to use.
Review prediction and optimization factors.
Strong written and oral communication skills.
Strong troubleshooting and problem-solving skills.
Desire to work with data and help businesses make better data-driven decisions.
At least 1 year of experience with AWS technologies: EMR, Glue, Redshift, Athena, S3, etc.