John Snow Labs is an award-winning AI and NLP company, accelerating progress in data science by providing state-of-the-art software, data, and models. Founded in 2015, it helps healthcare and life science companies build, deploy, and operate AI products and services. John Snow Labs is the winner of the 2018 AI Solution Provider of the Year Award, the 2019 AI Platform of the Year Award, the 2019 International Data Science Foundation Technology award, and the 2020 AI Excellence Award.
John Snow Labs is the developer of Spark NLP - the world’s most widely used NLP library in the enterprise - and is the world’s leading provider of state-of-the-art clinical NLP software, powering some of the world’s largest healthcare & pharma companies. John Snow Labs is a global team of specialists, of which 33% hold a Ph.D. or M.D. and 75% hold at least a Master’s degree in disciplines covering data science, medicine, software engineering, pharmacy, DevOps and SecOps.
This is an opportunity for a superstar PySpark developer with proven knowledge of Spark, data science, and big data, along with great communication skills. We are the team developing the Spark NLP and Spark OCR libraries, and we are looking to grow the team building these libraries as well as to help customers use them in their projects.
More details about the project are available here: https://nlp.johnsnowlabs.com/docs/en/ocr
This is a career opportunity that will enable you to expand your knowledge and experience of different tools and techniques, work with an international team of big data and data science experts, and make a positive impact through your work. If you qualify and are interested, please include the words 'John Snow Labs' in your cover letter and explain in detail why you are the best fit for this role.
Data Developer GCP - Remote Working Contract
Posted 1 day ago
London, England, £600 - £650 per day
Data Developer (Google Cloud Platform Scala) *Remote Contract outside IR35* A media/publishing group with international acclaim is seeking a Data Developer with a wealth of GCP expertise to join its team.
You'll take ownership of building data pipelines that have a direct impact on business decision-making, alongside maintaining the associated analytics tools, insights and business performance metrics.
You'll implement security best practices and new controls across the GCP estate, building a new generation of tooling to support data engineering projects, with key focuses on security and the optimal extraction, transformation and loading of data.
The modern cloud-centric tech stack includes Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, GCP and AWS, improving access to data and data insights as well as optimising huge datasets.
This company can offer a remote interview and onboarding process, as well as 100% work from home throughout the term of the contract (occasional visits to the London office may be required).
*You have extensive GCP knowledge and hands-on experience of securing GCP projects and applications, implementing security best practices including networking, VPC security controls, security logging/auditing and IAM
*Strong Scala and Python development skills
*Apache Spark experience
*Data warehousing experience; BigQuery, DBT, Apache Airflow
*Comfortable handling large datasets / data lakes
*Excellent communication skills and the ability to work well in a team
If you're interested in this Data Developer (Google Cloud Platform Scala) contract opportunity, apply now or call today to find out more.
Job Role: Data Engineer (GCP); Location: Remote WfH; Rate: £600 to £650 p/day (Outside IR35); Term: 6 months; Start: Immediate / ASAP
Big Data Developer Scala - Remote Contract
Posted 2 days ago
London, England, £600 - £650 per day
Big Data Engineer (Scala Spark GCP Big Data) *Remote Interview WfH*. A data-savvy Software Developer is sought to work on new innovations for a technology-savvy media/publishing group.
As a Big Data Developer you will create and maintain the data pipeline and associated analytics tools, insights and business performance metrics; you'll be responsible for building the infrastructure for optimal extraction, transformation and loading of data.
You'll be working with the latest technology on a modern cloud-based technology stack encompassing Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, AWS and GCP, improving access to data and data insights as well as optimising huge datasets.
The company can currently offer a remote interview and onboarding process as well as the ability to work from home throughout the term of the contract (although occasional visits to the London office could be required).
*Strong Scala development skills, ideally also Python
*Strong Apache Spark experience
*Strong knowledge of Cloud technologies: GCP and AWS primarily
*Experience with Data warehouse technologies including BigQuery, DBT and Apache Airflow
*Used to dealing with large datasets / Data Lake
*Excellent communication / collaboration skills