You will leverage organizational data to generate meaningful insights that improve the experiences of our users, while also improving the effectiveness and efficiency of core business functions. One key requirement will be the specification, testing, validation, and maintenance of predictive and prescriptive models in both the product and organizational spaces.

The ideal candidate will be adept at diving deeply into business problems, understanding where opportunities exist for advanced analytics, and injecting fully scoped and tested models into our product space and business. Candidates should have bottomless curiosity, a keen desire to solve real problems, the ability to translate complex concepts for non-technical stakeholders, and a willingness to master unfamiliar domains and methods. You think like a business owner. You are excited to jump in and help your teammates, and you’re always thinking of ways to constructively challenge the status quo and improve the business. You are an A+ player who is inspired by the mission we are pursuing and by the opportunity to define a new category in an incredibly fast-moving market. You are a creative problem solver who strives for excellence but knows how to move from explore to exploit. You are excited to bring your best every day, to learn from those around you, and to push hard while contributing to our powerful vision of positively impacting kids’ lives every day.

At HopSkipDrive, we know that to tackle our toughest challenges, we need different approaches, unique perspectives, and new ways of thinking. We are building a team of creative problem-solvers from many different backgrounds.


What you’ll do: 

  • Combine data science strategies, programming expertise, and theoretical understanding of techniques to deliver advanced ML solutions for problem-solving across the enterprise.
  • Scope, explore, specify, test, deploy, and monitor accurate and stable models in production environments.
  • Utilize modern, open-source data science workflows (SQL, R/Python, GitHub) to create highly accurate or explainable predictive/prescriptive models.
  • Own models in production and retrain them when necessary.
  • Demonstrate a product mindset to deliver portable components that can be deployed into existing products OR converted into stand-alone data products.
  • Partner closely with BI, Engineering, Product Management, DevOps, and business stakeholders to create business-aligned, stable workflows.
  • Communicate results and model specifications to appropriate stakeholders.
  • Mentor, train, and assist aspiring data science professionals.


Skills and Experience:

  • 6+ years of hands-on experience building and deploying production-quality predictive and/or prescriptive models.
  • Strong data acquisition skills using SQL in modern technical stacks (Postgres, AWS, Snowflake).
  • 6+ years of post-graduate experience working with open-source data science toolkits, including Python (NumPy, pandas, scikit-learn, PySpark) or R (Tidyverse/Tidymodels). Strong preference for candidates who can capably translate between Python and R.
  • 2+ years of experience deploying models to production environments through APIs or batch processes.
  • 3+ years of experience in at least TWO of the following data science methods:
    • Supervised and unsupervised ML (regression, classification, etc.)
    • Econometric modeling (e.g., time-to-event/survival analysis)
    • Reinforcement learning (e.g., multi-armed bandits for price or intervention optimization)
    • Operations research (e.g., combinatorial optimization/traveling salesman, linear programming)
  • Demonstrated experience with reproducible data science environments, including creation of virtual environments, library management, and version control using Git/GitHub.
  • Demonstrated experience writing up results using literate programming workflows (Quarto preferred).


Nice-to-have experience:

  • Master’s or PhD in Data Science, Operations Research, Applied Math, Machine Learning, Computer Science, AI, Statistics, Engineering, or a similar field.
  • Experience with back-end model deployment and monitoring (MLOps) using cloud ecosystems (Google, AWS, Snowflake).
  • Experience with “big data” solutions requiring cloud infrastructure or distributed compute.
  • Experience building in, and potentially standing up, cloud-based data science environments (Posit Workbench, Databricks, SageMaker).
  • Experience doing data science work in the transportation, logistics, or delivery spaces.
  • Experience with Transformers for analyzing large sequences of data.