Remote Data Modelling contract jobs

5 remote Data Modelling contracts

Java AWS Developer: W2 only

16 days ago
$60 - $80/hour · Remote · Neev Systems LLC

Title: Java Developer

Location: Atlanta, GA
Duration: 24+ months
Start date: ASAP

Key skills required for the job are:

  • Self-starting, seasoned Java developer (10+ years) with working experience in microservices, Docker, and Kubernetes.
  • DevOps experience (UNIX scripting tools; Ansible, Chef, Puppet, etc.) is required. Groovy knowledge is desired.
  • Working experience with various AWS components and cloud-based development is required (5+ years).
  • Hands-on experience with virtual assistant/bot and NLP/NLU technology is desired but not required. Machine learning, data modelling, and data mining experience is also desired.
  • Excellent critical thinking skills, combined with the ability to present your ideas clearly in both verbal and written form.
  • Must be self-motivated, a self-starter, a quick learner, a team player, and easy to work with.
  • Work is based in the Atlanta office. Candidates can work remotely during the current pandemic, with daily check-ins with the other members of the project team.

Job Types: Full-time, Contract

Pay: $60.00 - $80.00 per hour

Experience:

  • AWS: 1 year (Preferred)
  • Spring: 1 year (Preferred)
  • Java: 1 year (Preferred)

Remote Analytics Engineer

1 month ago
$20 - $75/hour · Remote · FluentU

Are you an analytics engineer looking to take the next step in your career?

Are you looking for a company where dbt is central to the data workflow?

Are you looking for a company where remote work is built into the company DNA, not coronavirus-induced?

Do you want to work at a company where the leadership is engaged with BI and sees it as a strategic priority?

Do you want a central role in crafting the BI strategy, and building a data team?

Do you want to reach and make a positive impact on millions of people?

This job might be for you if you:

  • like being able to set your own hours and work from home
  • like exercising your creativity and experimenting
  • like having responsibility
  • like working collaboratively
  • like having a dependable, reliable stream of work
  • want to make the world a better place
  • are comfortable with a fast-changing environment.

You should NOT take this job if you:

  • have a strong need/desire for in-person social interaction at work
  • dislike asynchronous communication
  • like following instructions and being told what to do
  • don't like needing to come up with ideas
  • don't have a real interest in, or experience with, online education.

ABOUT US

FluentU is an online education company that helps people learn languages with real-world videos, including movie trailers, music videos, news and inspiring talks. We have a website, iOS app (usually among the top 50 grossing iOS education apps), and Android app. Founded in 2011, we’re a profitable, stable company with long-term focus, and we’re proudly self-funded. And we've been remote/distributed since day one.

We get 5 million visitors per month on our blogs, 100,000+ people on our email list, and many more receiving web and mobile notifications.

This is a unique opportunity to play a pivotal role in our business intelligence strategy, which is still in the early stages. We've worked with a reputable consultancy until now, and you'll be our first dedicated hire focused specifically on BI data engineering, empowered to build a program from the foundation up.

JOB DESCRIPTION

As our first analytics engineer, you will be responsible for:

  • working in an agile/kanban-style methodology while balancing long-term data modeling & infrastructure planning
  • interpreting & executing analytics feature requests from senior stakeholders
  • planning & documenting work to be done, with regular feedback to stakeholders to minimize unnecessary work
  • maintaining & improving the entire data pipeline from start to finish, from extracting data from SaaS APIs to configuring display options and creating charts for end users in Looker, e.g.:
  • using data-extractor-as-a-service tools such as Stitch to extract data on a scheduled basis & overseeing automated loading behavior
  • creating & maintaining proprietary extraction tools, such as Bash scripts on Google Cloud instances (see the sketch after this list)
  • managing the dbt ETL pipeline
  • acting as Looker admin (development, security & administration)
  • using software engineering best practices (such as version control, component-based software/architecture, DRY code, etc.)
  • implementing testing frameworks & subjecting all work to QA
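
(Illustrative aside, not part of the listing: a "proprietary extraction tool" of the kind mentioned above might look roughly like the sketch below, written in Python rather than Bash for readability. The endpoint, token handling, and response shape are hypothetical placeholders, not FluentU's actual stack.)

```python
# Hypothetical sketch: pull records from a SaaS API on a schedule and land
# them as newline-delimited JSON for loading into the warehouse.
import json
import requests

API_URL = "https://api.example-saas.com/v1/events"  # placeholder endpoint
TOKEN = "..."  # placeholder; injected via env var or secret manager in practice

def extract(since: str, out_path: str) -> int:
    """Fetch pages of events created after `since` and write them as NDJSON."""
    written, page = 0, 1
    with open(out_path, "w") as f:
        while True:
            resp = requests.get(
                API_URL,
                headers={"Authorization": f"Bearer {TOKEN}"},
                params={"since": since, "page": page},
                timeout=30,
            )
            resp.raise_for_status()
            rows = resp.json()["data"]  # assumed response shape
            if not rows:
                return written
            for row in rows:
                f.write(json.dumps(row) + "\n")
                written += 1
            page += 1

if __name__ == "__main__":
    print(extract("2020-01-01", "events.ndjson"), "rows extracted")
```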

You would also work directly with the founder (https://www.linkedin.com/in/alancpark/) in this position.

HOW WE WORK

We’re a 100% distributed/remote team. Here’s a little bit more about how we work:

  • Almost all of our communication is text-based (mostly via Asana) and we value clear communication (https://app.tettra.co/teams/fluentu/pages/communication-guidelines), among other things (https://app.tettra.co/teams/fluentu/pages/mission-and-operating-principles).
  • Most things are not urgent. We take pride in having a calm work environment.
  • We also have a flat collaborative environment.
  • We make decisions based on logic/reason.
  • We believe in getting things done and continuous improvement.

QUALIFICATIONS

Our ideal candidate:

  • can work in a fast-paced environment and be responsive to new requirements
  • is terrific at written communication
  • can explain technical concepts to a non-technical audience
  • understands basic principles around content marketing
  • is comfortable managing their own time and workflow independently
  • can juggle multiple projects & work streams concurrently
  • has most or all of the following technical skills:
  • experience working collaboratively using Git
  • strong SQL skills (we use BigQuery Standard SQL)
  • an understanding of data modelling concepts, à la Kimball dimensional modelling
  • an exceptional understanding of data manipulation (i.e. joins, data granularity, referential integrity, uniqueness/primary/foreign keys, etc.; see the sketch after this list)
  • some knowledge of SQL database performance
  • Looker, dbt & Google Cloud knowledge (easily learned if you possess the above skills)
  • ideally Linux/cloud architecture & Python skills, but these are not mandatory
  • has a deep interest in language learning or online education.
  • is able to work a minimum of 20 hours per week (pay is hourly - https://app.tettra.co/teams/fluentu/pages/why-we-have-hourly-pay-and-how-it-works-in-practice) and is looking for something long-term.
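
(Illustrative aside, not part of the listing: the data-manipulation concepts named above (joins, grain, key uniqueness, referential integrity) can be made concrete in a few lines of pandas. The tables and columns below are invented.)

```python
# Hypothetical sketch of key uniqueness, referential integrity, and join grain.
import pandas as pd

# A fact table (one row per order) and a dimension table (one row per user).
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "user_id": [10, 10, 20],
    "amount": [9.99, 4.99, 19.99],
})
users = pd.DataFrame({
    "user_id": [10, 20],
    "plan": ["annual", "monthly"],
})

# Uniqueness: user_id must be a primary key in the dimension.
assert users["user_id"].is_unique

# Joining fact to dimension preserves the fact table's grain (one row per
# order); validate="many_to_one" makes pandas enforce referential shape.
enriched = orders.merge(users, on="user_id", how="left", validate="many_to_one")
print(enriched)
```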

HOW TO APPLY

To apply, please fill out this form (if you don't fill out this form your application will be rejected):

https://form.asana.com/?hash=7898da2ae31a0f9160deab764d1418e1ed952dbe2da4c48c6841df5852190658&id=1178425944784297

Job Types: Full-time, Contract

Pay: $20.00 - $75.00 per hour

Schedule:

  • Monday to Friday

Contract Length:

  • More than 1 year

Full Time Opportunity:

  • Yes

Work Location:

  • Fully Remote

Company's website:

  • fluentu.com

Work Remotely:

  • Yes

BI Engineer - Remote Position

3 days ago
Remote · Infotree Service Inc

Company Description

Infotree’s approach to every employee and customer is based on making a positive impact. We focus on over-servicing, continuous improvement, and a high-quality culture. We’re passionate about making successful matches for our employees and customers across the globe. Infotree prides itself on its proven track record and innovative culture, with 100% focus on employees and customers.


Job Description

Reach out to me at 734-400-0958 or abbas@infotreeglobal.com
W2 or 1099 only

Job title: BI Engineer
Location: Remote
Duration: 6+ Months Contract

MUST HAVES:
1. 3+ years of experience building ETL pipelines using Spark to move data in and out of a data lake.
2. Very strong knowledge of SQL.
3. 3+ years of experience with Power BI development (Power BI certifications preferred).

ETL Skills:
  • 3+ years of experience building ETL pipelines using Spark to move data in and out of a data lake (see the sketch after this list).
  • Very strong knowledge of SQL is a must.
  • Good experience with big data technologies like Spark, Python, and Scala.
  • In-depth understanding of data warehouse and data modelling design concepts.
  • Experience with any cloud technology: Azure (preferred), AWS, or GCP.
  • Experience with Airflow or any other scheduling tool.
  • Experience working in an Agile methodology.
  • Any of the following is an advantage:
  • Experience with real-time data ingestion using Kafka.
  • Experience with CI/CD pipelines using Jenkins, Bitbucket, GitHub, etc.
  • Experience with data analysis.
  • Experience with reporting tools like Power BI, MicroStrategy, Tableau, etc.
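
(Illustrative aside, not part of the listing: a minimal PySpark sketch of the data-lake ETL pattern described above: read raw files, deduplicate, aggregate to a daily grain, and write a curated partition. The paths, columns, and lake layout are placeholders, not the client's environment.)

```python
# Hypothetical data-lake ETL job: raw zone -> curated zone.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-etl-example").getOrCreate()

# Extract: read raw events from the lake (placeholder path).
raw = spark.read.parquet("s3://example-lake/raw/events/")

# Transform: deduplicate on the event key and aggregate to a daily grain.
daily = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write back to a curated zone, partitioned by date.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-lake/curated/daily_events/"
)
```
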
Power BI Skills:
  • 3+ years of experience with Power BI development (Power BI certifications preferred).
  • Experience with other reporting tools like MicroStrategy, Tableau, or equivalent software.
  • Very strong at creating complex DAX functions and modelling in Power BI.
  • In-depth understanding of data warehouse design.
  • Strong SQL scripting skills.
  • Experience with any database, such as Snowflake, SQL Server, Oracle, etc.
  • Experience with data analysis.
  • Adept at developing, publishing, and scheduling Power BI reports per business requirements.
  • Handling live production support.
  • Flexibility with on-call rotation.
  • Experience working in an Agile methodology.
  • Any of the following is an advantage:
  • Experience with REST APIs.
  • Experience creating real-time datasets.
  • Knowledge of secondary tools such as Microsoft Azure, SQL Data Warehouse, Visual Studio, etc.

Additional Information

All your information will be kept confidential according to EEO guidelines.

Senior Data Engineer / Database Development - W2

1 month ago
$60 - $75/hour (Estimated) · Remote · Infotree Service Inc
Company Description

Infotree’s approach to every employee and customer is based on making a positive impact. We focus on over-servicing, continuous improvement, and a high-quality culture. We’re passionate about making successful matches for our employees and customers across the globe. Infotree prides itself on its proven track record and innovative culture, with 100% focus on employees and customers.


Job Description

Position Title: Senior Data Engineer (W2 only)

Location: Louisville, KY

Duration: 6+ months (With a possibility of extension)

Initially it’s a remote position; once the pandemic is over, it will be onsite in Louisville, KY.

Job Description:

Senior Data Engineers at the Edge help develop cloud-native solutions leveraging Google Cloud Platform APIs and services, and are able to code in languages and frameworks that fit into that ecosystem: Python, Golang, Node, .NET Core (C#), and Java.

You will work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructures on Google Cloud Platform (GCP). You will design data models, build APIs, devise data pipelines, and figure out the scaling strategy for our data stores. Working closely with our product and interoperability team, you will focus on building, from the ground up, the data stores, APIs, and microservices that power a rich and intuitive user experience, in an agile work environment governed through SAFe.

Core Competencies:

  • 6+ years of database development experience
  • Strong experience specifically with NoSQL
  • Good experience with any of these: MongoDB, Firestore, DynamoDB, Cassandra

Required Qualifications:
  • At least one year of experience working on any major cloud provider
  • Proven work experience as a software engineer and/or database developer
  • Hands-on experience building production-grade data solutions (relational and NoSQL)
  • Good knowledge of data modelling, covering not just tables, constraints, and relationships but also optimal storage and retrieval of data (see the sketch after this list)
  • 5+ years of database development experience in at least one of the following databases: PostgreSQL, MySQL, SQL Server, Oracle, etc.
  • 1+ years of experience with at least one popular NoSQL data store: MongoDB, Firestore, DynamoDB, Cassandra, etc.
  • Prior experience with, or ownership of, the end-to-end data-engineering component of a solution
  • Proven experience optimizing existing pipelines and maintaining all domain-related data pipelines
  • An analytical mind with problem-solving aptitude
  • Strong communication skills, with the ability to interact with business and customer representatives
  • Passion for growing your skills and tackling interesting work and challenging problems
  • BA/BS in Computer Science or a related field, or equivalent experience
  • Experience working in an environment with a “startup” culture using agile, lean, DevOps, and DataOps delivery practices and methodologies.
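
(Illustrative aside, not part of the listing: the "optimal storage and retrieval" point above comes down to matching the data model to the access pattern. The hypothetical Python sketch below contrasts a normalized, relational-style shape with an embedded, document-style shape of the same data; all names are invented.)

```python
# Hypothetical sketch: the same user/order data modelled two ways.

# Relational shape (PostgreSQL-style): orders reference users by foreign key,
# so writes stay cheap and consistent, but reads need a join.
user_row = {"user_id": 10, "name": "Ada"}
order_rows = [
    {"order_id": 1, "user_id": 10, "amount": 9.99},
    {"order_id": 2, "user_id": 10, "amount": 4.99},
]

# Document shape (MongoDB-style): orders are embedded in the user document,
# so data that is read together is fetched in a single lookup.
user_doc = {
    "_id": 10,
    "name": "Ada",
    "orders": [
        {"order_id": 1, "amount": 9.99},
        {"order_id": 2, "amount": 4.99},
    ],
}

# Fetch "all of Ada's orders": one document access here, versus a join above.
print([o["amount"] for o in user_doc["orders"]])
```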

Role Essentials:
  • Experience with TDD and writing solid test cases
  • Experience working in large, high-quality codebases
  • Experience running and maintaining containers in production, using tools such as Docker, Kubernetes, or Mesos
  • Shipping pragmatic, sustainable codebases with speed
  • A good understanding of public cloud computing architectures and services; experienced in the use of cloud-native technologies, cloud cybersecurity, and implementation patterns to lower costs, improve speed to market, increase efficiency, and enable innovation
  • Experience leveraging modern technologies to increase velocity and decrease the cost of solution delivery, including cloud technologies, microservices architecture, and streaming analytics
  • Building collaborative relationships with team members, fostering a productive team environment, and coaching staff with timely, meaningful feedback

If interested, please share your most recent resume with rohanjeet@infotreeglobal.com or call 817-415-8935 for a quicker response. (W2 only, no C2C allowed.)


Role Desirables:
  • Cloud certification from any major cloud provider
  • Experience designing, building, and testing complex scalable systems
  • Having built or maintained a large-scale microservice application or system with petabyte-scale data stores
  • Experience supporting live production infrastructure; able to put out fires under pressure when things go wrong
  • An appetite for data: analyzing metrics and designing A/B tests to help drive the company’s decisions

Additional Information

All your information will be kept confidential according to EEO guidelines.

Data Scientist Marketing Mix Modelling

3 days ago
£500 - £700/day · Remote · Inside IR35 · McGregor Boyall

Data Scientist (Marketing Mix Modelling)

Marketing, Mix Modelling, Econometrics, Attribution Modelling, Data Science, Machine Learning, Python, R, Maths, Statistics

An excellent opportunity has arisen for a Data Scientist specialising in Marketing Mix Modelling to work for a Tier 1 banking client based in London. This is a contract role (inside IR35) and will be remote working for the foreseeable future.

The client is looking for two candidates at varying levels of seniority.

Key skills/experience required:

  • Bachelor's/Master's degree or PhD in Data Science, Mathematics and Statistics, Computer Science, or an Engineering-related field
  • Strong experience in a Data Scientist role in a high-performing, fast-paced industry (technology leaders, media agencies, FMCG, banking, and global market leaders are desirable)
  • Strong experience with Marketing Mix Modelling, Attribution Modelling, or Econometrics is essential (see the sketch after this list)
  • Experience designing and building machine learning models
  • Hands-on experience developing marketing mix models and multi-touch attribution
  • Experience with statistical software (e.g. R, Python) and database languages (e.g. SQL)
  • Ability to communicate effectively with senior stakeholders; strong written and verbal communication skills
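
(Illustrative aside, not part of the advert: in its simplest form, a marketing mix model regresses sales on adstocked media spend. The Python sketch below is a toy on simulated data; real MMM adds saturation curves, seasonality, controls, and many channels.)

```python
# Hypothetical marketing-mix-modelling sketch: geometric adstock plus OLS.
import numpy as np

def adstock(spend: np.ndarray, decay: float) -> np.ndarray:
    """Carry a fraction of yesterday's media effect into today."""
    out = np.zeros_like(spend, dtype=float)
    for t, x in enumerate(spend):
        out[t] = x + (decay * out[t - 1] if t > 0 else 0.0)
    return out

rng = np.random.default_rng(0)
spend = rng.uniform(0, 100, size=104)  # two years of weekly media spend
sales = 50 + 0.8 * adstock(spend, 0.5) + rng.normal(0, 5, 104)  # simulated

# Ordinary least squares: sales on an intercept and the adstocked spend.
X = np.column_stack([np.ones(104), adstock(spend, 0.5)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"baseline ~ {coef[0]:.1f}, media effect ~ {coef[1]:.2f}")
```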

Please respond with an up-to-date CV for further information.

McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.