Haystack Search Relevance Training - October 29th & 30th

Training to be held after the Haystack EU conference

New to relevance? Want to solidify your foundation? Or take things to an advanced level? OpenSource Connections will be offering their "Think Like a Relevance Engineer" trainings for Solr or Elasticsearch directly following the Haystack EU Conference. These courses give you a foundation for solving applied Information Retrieval problems with Solr or Elasticsearch. The trainings will take place October 29th & 30th.

To purchase tickets, visit the Haystack EU 2019 page.

Training Details

From the team behind Relevant Search - Learn to improve search quality from the best experts in the field. Bust through the mystique around topics like 'cognitive search' and 'Learning to Rank'. In two days, we teach you to measure, tune, and improve search quality. You'll come to appreciate relevance tuning from TF*IDF, to boosts, to Learning to Rank. You'll begin your journey to search & discovery applications that seem to 'get users' and deliver business results.

"'Think Like a Relevance Engineer' has helped me think differently about how I solve Solr & Elasticsearch relevance problems."
Matt Corkum, Disruptive Technology Director, Elsevier

"What a positive experience! We have so many new ideas to implement after attending 'Think Like a Relevance Engineer' training."
Andrew Lee, Director of Engineering Search, DHI

What you'll get out of it

  • Measuring search quality
  • Removing the fear from relevance experimentation
  • Becoming 'hypothesis driven' in relevance work
  • Elasticsearch relevance tuning techniques
  • Building semantic search
  • Implementing machine learning to improve relevance (i.e., Learning to Rank)
  • Access to the best experts in the world on Elasticsearch relevance for brainstorming your problems

Day One - Managing, Measuring, and Testing Search Relevance

This day helps the class understand how working on relevance requires different thinking than other engineering problems. We teach you to measure search quality, take a hypothesis-driven approach to search projects, and safely 'fail fast' towards ever-improving business KPIs.

  • What is search?
  • Holding search accountable to the business
  • Search quality feedback
  • Hypothesis-driven relevance tuning
  • User studies for search quality
  • Using analytics & clicks to understand search quality
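To give a small taste of the day-one material on measuring search quality, one common offline metric is precision@k: the fraction of the top-k results that human judges marked relevant. A minimal sketch (the query, document IDs, and judgments below are invented for illustration):

```python
def precision_at_k(result_ids, relevant_ids, k=10):
    """Fraction of the top-k returned results that were judged relevant."""
    top_k = result_ids[:k]
    if not top_k:
        return 0.0
    hits = sum(1 for doc_id in top_k if doc_id in relevant_ids)
    return hits / len(top_k)

# Hypothetical relevance judgments for one query:
relevant = {"doc1", "doc4", "doc7"}
# Hypothetical ranked results returned by the search engine:
results = ["doc1", "doc2", "doc4", "doc9", "doc7"]

print(precision_at_k(results, relevant, k=5))  # 3 of the top 5 are relevant -> 0.6
```

Tracking a metric like this before and after each tuning change is what makes hypothesis-driven relevance work possible.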

Day Two - Engineering Relevance with Elasticsearch|Solr

This day demonstrates relevance tuning techniques that actually work. Relevance can't be achieved by just tweaking field weights: boosting strategies, synonyms, and semantic search are discussed. The day closes by introducing machine learning for search (aka "Learning to Rank").

  • Getting a feel for Elasticsearch|Solr
  • Signal Modeling (data modeling for relevance)
  • Dealing with multiple, competing objectives in search relevance
  • Synonym strategies that actually work
  • Taxonomy-based Semantic Search
  • Introduction to Learning to Rank
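As a taste of the "tweaking field weights" starting point that day two moves beyond, Elasticsearch lets you weight fields with the `^` boost syntax on a `multi_match` query. A minimal sketch (the index field names here are hypothetical):

```python
import json

# Hypothetical fields; "title^3" makes title matches count
# three times as much as body matches when scoring.
query = {
    "query": {
        "multi_match": {
            "query": "relevance engineering",
            "fields": ["title^3", "body"],
        }
    }
}

# This JSON body would be sent to an Elasticsearch _search endpoint.
print(json.dumps(query, indent=2))
```

The course covers why weights like these are only a first step, and how signal modeling, synonym strategies, and Learning to Rank build on them.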

You'll also receive a copy of Relevant Search written by OpenSource Connections CTO Doug Turnbull and OpenSource Connections Alum John Berryman.