Haystack Search Relevance Training - April 22nd and 23rd

Training to be held prior to the Haystack conference


Available Pre-Conference Training:


New to relevance? Want to solidify your foundation? Or take things to an advanced level? OpenSource Connections will be offering their "Think Like a Relevance Engineer" trainings for Solr or Elasticsearch prior to the Haystack Conference. These courses give you a foundation for solving applied Information Retrieval problems with Solr or Elasticsearch. OSC's Learning to Rank training is also available. The trainings will take place April 22 and 23, 2019 at OpenSource Connections' offices and the Old Metropolitan Hall in Charlottesville, Virginia.

Training Details

From the team behind Relevant Search - Learn to improve search quality from the best experts in the field. Bust through the mystique around topics like 'cognitive search' and 'Learning to Rank'. In two days, we teach you to measure, tune, and improve search quality. You'll come to appreciate relevance tuning, from TF*IDF, to boosts, to Learning to Rank. You'll begin your journey toward search & discovery applications that seem to 'get users' and deliver business results.

'Think Like a Relevance Engineer' has helped me think differently about how I solve Solr & Elasticsearch relevance problems.
Matt Corkum, Disruptive Technology Director, Elsevier
What a positive experience! We have so many new ideas to implement after attending 'Think Like a Relevance Engineer' training.
Andrew Lee, Director of Engineering Search, DHI

What you'll get out of it

  • How to measure search quality
  • Removing the fear from relevance experimentation
  • Becoming 'hypothesis driven' in relevance work
  • Elasticsearch relevance tuning techniques
  • Building semantic search
  • Implementing machine learning to improve relevance (i.e., Learning to Rank)
  • Access to the best experts in the world on Elasticsearch relevance for brainstorming your problems

Day One - Managing, Measuring, and Testing Search Relevance

This day helps the class understand how working on relevance requires different thinking than other engineering problems. We teach you to measure search quality, take a hypothesis-driven approach to search projects, and safely 'fail fast' toward ever-improving business KPIs.

  • What is search?
  • Holding search accountable to the business
  • Search quality feedback
  • Hypothesis-driven relevance tuning
  • User studies for search quality
  • Using analytics & clicks to understand search quality

Day Two - Engineering Relevance with Elasticsearch|Solr

This day demonstrates relevance tuning techniques that actually work. Relevance can't be achieved just by tweaking field weights: boosting strategies, synonyms, and semantic search are discussed. The day closes with an introduction to machine learning for search (aka "Learning to Rank").

  • Getting a feel for Elasticsearch|Solr
  • Signal Modeling (data modeling for relevance)
  • Dealing with multiple, competing objectives in search relevance
  • Synonym strategies that actually work
  • Taxonomy-based Semantic Search
  • Introduction to Learning to Rank

You'll also receive a copy of Relevant Search, written by OpenSource Connections CTO Doug Turnbull and OpenSource Connections alum John Berryman.


Learning to Rank Training

In this two-day training course offered April 22nd and 23rd, we go hands-on with machine learning tools to improve relevance. Using an open source search engine, we see our models come to life, encounter the pitfalls of letting machines control relevance, and work to mitigate those pitfalls. From Hello World to real-world Learning to Rank, you'll pick up lessons you'd otherwise learn the hard way. See how you can apply open source tooling to optimize your search results with machine learning. Ideal for production-focused data scientists, relevance engineers, machine learning engineers, and search developers. Some familiarity with a search engine is expected.

Day One: Hands on Basics

We get hands on with movie data, and a simple judgment list. We see where things can go wrong!

  1. Search Relevance as a Machine Learning Problem
  2. Cutting your Teeth With Your First Model
  3. What's Wrong with My Judgments?
  4. Iterating on Features
  5. Choosing the Best Learning to Rank Model

Day Two: Real-World Learning to Rank

At scale, in real search applications, these are the concerns that will rear their heads.

  1. Dealing with Presentation Bias
  2. Including Non-User Relevance Concerns (Business Rules and Marketplace concerns)
  3. Model Verification, Checks, and Balances
  4. Personalization and Recommendations with Learning to Rank
  5. Including Embeddings and Other Exotic Features
  6. The Next Frontiers of ML and Search

Community Training & Hacking Welcome!

Have training or a hackathon you'd like to offer prior to the conference? Contact Us to discuss how we can help you find a venue and advertise your event!