...

  • Evaluate with Indri, implement with ElasticSearch: One approach would be to continue to use Indri as the evaluation framework and translate results into implementation decisions in ElasticSearch. This isn't ideal, but it may be the best option available.
  • Evaluate with Lucene, implement with ElasticSearch: Since ElasticSearch is Lucene-based, we could develop an evaluation framework around Lucene. ir-tools already has some of the ingredients. This would let us work within some of ElasticSearch's constraints (e.g., its specific model implementations).

Evaluation with Lucene

A very relevant workshop report from SIGIR: Lucene4IR: Developing Information Retrieval Evaluation Resources using Lucene.


Other notes

While re-reading Zhai's SLMIR (Statistical Language Models for Information Retrieval), I noticed different ranges given for the Okapi BM25 parameters:

  • k1 in 1.0-2.0
  • b = 0.75
  • k3 in 0-1000
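For reference, a minimal sketch of how those three parameters enter the Okapi BM25 formula. This is an illustrative standalone implementation, not the scoring code Lucene or ElasticSearch actually uses (Lucene's BM25Similarity, for instance, adds 1 inside the idf log to keep it non-negative, a variant adopted here as well); the function name and defaults are my own choices.

```python
import math

def bm25_term_score(tf, qtf, df, N, dl, avgdl, k1=1.2, b=0.75, k3=0.0):
    """Okapi BM25 contribution of a single query term.

    tf: term frequency in the document
    qtf: term frequency in the query
    df: number of documents containing the term
    N: total number of documents in the collection
    dl, avgdl: document length and average document length
    k1: term-frequency saturation (the 1.0-2.0 range above)
    b: document-length normalization (0.75 above)
    k3: query-term-frequency saturation (the 0-1000 range above)
    """
    # Lucene-style idf: the +1 inside the log avoids negative values
    # for terms appearing in more than half the collection.
    idf = math.log(1.0 + (N - df + 0.5) / (df + 0.5))
    # k1 saturates tf; b scales the length penalty dl/avgdl.
    tf_part = ((k1 + 1.0) * tf) / (k1 * ((1.0 - b) + b * dl / avgdl) + tf)
    # k3 saturates the query-side term frequency (matters only for
    # long queries where qtf > 1).
    qtf_part = ((k3 + 1.0) * qtf) / (k3 + qtf)
    return idf * tf_part * qtf_part

# With b = 0.75, a longer document with the same tf scores lower;
# with b = 0, length normalization is switched off entirely.
short_doc = bm25_term_score(tf=3, qtf=1, df=10, N=1000, dl=50, avgdl=100)
long_doc = bm25_term_score(tf=3, qtf=1, df=10, N=1000, dl=200, avgdl=100)
print(short_doc > long_doc)
```

Seeing the formula written out makes the practical upshot of the ranges clearer: tuning is mostly about k1 and b (which is why 0.75 is quoted as a single default for b), while k3 only matters for verbose queries with repeated terms.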