4 Apache Spark contracts

PySpark Developer with OCR experience

1 month ago
$60 - $75/hour (wellpaid.io estimate) · Remote · John Snow Labs
Job details
Job Type: Full-time / Part-time / Contract
Full Job Description
Company Description

John Snow Labs is an award-winning AI and NLP company, accelerating progress in data science by providing state-of-the-art software, data, and models. Founded in 2015, it helps healthcare and life science companies build, deploy, and operate AI products and services. John Snow Labs is the winner of the 2018 AI Solution Provider of the Year Award, the 2019 AI Platform of the Year Award, the 2019 International Data Science Foundation Technology award, and the 2020 AI Excellence Award.

John Snow Labs is the developer of Spark NLP - the world’s most widely used NLP library in the enterprise - and is the world’s leading provider of state-of-the-art clinical NLP software, powering some of the world’s largest healthcare & pharma companies. John Snow Labs is a global team of specialists, of which 33% hold a Ph.D. or M.D. and 75% hold at least a Master’s degree in disciplines covering data science, medicine, software engineering, pharmacy, DevOps and SecOps.


Job Description

This is an opportunity for a superstar PySpark developer with proven knowledge of Spark, Data Science, and Big Data, plus great communication skills. We are the team developing the Spark NLP and Spark OCR libraries, and we are looking to grow both the team building these libraries and the team helping customers use them in their projects.

More details about the project are available here: https://nlp.johnsnowlabs.com/docs/en/ocr

This is a career opportunity that will enable you to expand your knowledge and experience of different tools and techniques, work with an international team of big data and data science experts, and make a positive impact through your work. If you qualify and are interested, please include the words 'John Snow Labs' in your cover letter and explain in detail why you are the best fit for this role.


Qualifications
  • Python (3+ years)
  • Apache Spark (3+ years)
  • Data Science (Python, Jupyter, TensorFlow)
  • OCR (using open-source tools such as Tesseract is a plus)
  • Image processing expertise is a plus
  • Scala/Java experience would be a big plus
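
Since the role pairs PySpark with open-source OCR tools such as Tesseract, here is a minimal sketch of how the two typically combine, fanning Tesseract out over image paths with Spark. This is an illustrative sketch, not the posting's actual codebase: it assumes pyspark, Pillow, and pytesseract are installed, and all function names and paths are made up for the example.

```python
# Minimal sketch: distributing Tesseract OCR over image paths with PySpark.
# Assumes pyspark, Pillow, and pytesseract are installed and the tesseract
# binary is on PATH. All names here are illustrative.

def normalize_ocr_text(raw: str) -> str:
    """Collapse runs of whitespace and drop empty lines from raw OCR output."""
    lines = (" ".join(line.split()) for line in raw.splitlines())
    return "\n".join(line for line in lines if line)

def ocr_image(path: str) -> str:
    # Imports live inside the worker function so only Spark executors need them.
    from PIL import Image
    import pytesseract
    return normalize_ocr_text(pytesseract.image_to_string(Image.open(path)))

def run_ocr_job(image_paths):
    """Fan the OCR work out across a Spark cluster and collect the text."""
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.appName("ocr-sketch").getOrCreate()
    try:
        return spark.sparkContext.parallelize(image_paths).map(ocr_image).collect()
    finally:
        spark.stop()
```

Note that this hand-rolled RDD approach is only a baseline; Spark OCR, the John Snow Labs product this role works on, provides DataFrame-native OCR transformers rather than raw Tesseract calls.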


Additional Information
  • We are a fully virtual company, collaborating across 22 countries.
  • Open to candidates worldwide - work remotely from anywhere.
  • This is a contract opportunity, not a full-time employment role.
  • This role requires the availability of at least 30 hours per week.

Data Developer GCP - Remote Working Contract

11 days ago
£600 - £650/day · Remote · Outside IR35 · Client Server


London, England, £600 - £650 per day

Data Developer (Google Cloud Platform, Scala) *Remote Contract outside IR35*. A media/publishing group with international acclaim is seeking a Data Developer with a wealth of GCP expertise to join their team.

You'll take ownership of building data pipelines that have a direct impact on business decision-making, while also maintaining the associated analytics tools, insights and business performance metrics.

You'll implement security best practices and new controls across the GCP estate, building a new generation of tooling to support data engineering projects, with key focuses on security and the optimal extraction, transformation and loading of data.

The modern cloud-centric tech stack includes Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, GCP and AWS, improving access to data and data insights as well as optimising huge datasets.
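
To make the shape of such a pipeline concrete, here is a hedged sketch of a transform-and-load step using PySpark with the spark-bigquery connector. The table and bucket names are placeholders, and the rate parser is an illustrative transform invented for the example, not anything from the posting.

```python
import re

def parse_day_rate(text: str):
    """Illustrative transform: pull the low/high day rate out of a string
    like '£600 - £650 per day'. Returns None when no numbers are present."""
    nums = [int(n) for n in re.findall(r"\d+", text)]
    return (nums[0], nums[-1]) if nums else None

def write_to_bigquery(df, table: str, staging_bucket: str) -> None:
    """Append a Spark DataFrame to BigQuery via the spark-bigquery connector.
    Assumes the connector jar is on the classpath (e.g. on Dataproc)."""
    (df.write.format("bigquery")
       .option("table", table)                        # e.g. "analytics.rates"
       .option("temporaryGcsBucket", staging_bucket)  # GCS staging for the load
       .mode("append")
       .save())
```

In practice, orchestration of steps like these would sit in Apache Airflow DAGs, with DBT handling in-warehouse transformations once the data lands in BigQuery.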

This company can offer a remote interview/onboarding process as well as 100% work from home throughout the term of the contract (occasional visits to the London office may be required).

Requirements:
  • Extensive GCP knowledge and hands-on experience securing GCP projects and applications, implementing security best practice including networking, VPC security controls, security logging/auditing and IAM
  • Strong Scala and Python development skills
  • Apache Spark experience
  • Data warehousing experience: BigQuery, DBT, Apache Airflow
  • Comfortable handling large datasets / Data Lakes
  • Excellent communication skills; works well in a team

If you're interested in this Data Developer (Google Cloud Platform Scala) contract opportunity, apply now or call today to find out more.

Job Role: Data Engineer (GCP); Location: Remote WfH; Rate: £600 to £650 p/day (Outside IR35); Term: 6 months; Start: Immediate / ASAP

REF: BB/15976/D/JR/220221_1614012928


Big Data Developer Scala - Remote Contract

28 days ago
£600 - £650/day · Remote · Client Server


London, England, £600 - £650 per day


Big Data Engineer (Scala, Spark, GCP, Big Data) *Remote Interview, WfH*. A data-savvy Software Developer is sought to work on new innovations for a technology-savvy media/publishing group.

As a Big Data Developer you will create and maintain the data pipeline and associated analytics tools, insights and business performance metrics; you'll be responsible for building the infrastructure for optimal extraction, transformation and loading of data.

You'll be working with the latest technology on a modern cloud-based technology stack encompassing Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, AWS and GCP, improving access to data and data insights as well as optimising huge datasets.

The company can currently offer a remote interview and onboarding process as well as the ability to work from home throughout the term of the contract (although occasional visits to the London office could be required).

Requirements:
  • Strong Scala development skills, ideally also Python
  • Strong Apache Spark experience
  • Strong knowledge of cloud technologies: GCP and AWS primarily
  • Experience with data warehouse technologies including BigQuery, DBT and Apache Airflow
  • Used to dealing with large datasets / Data Lakes
  • Excellent communication / collaboration skills