Module Contents


PositionalEncoding(d_model, dropout=0.1, max_len=5000)
TK_class(extractor, config)  Adapted from the original TK implementation.
class capreolus.reranker.TK.PositionalEncoding(d_model, dropout=0.1, max_len=5000)[source]

Bases: torch.nn.Module

forward(self, x)[source]
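
PositionalEncoding presumably follows the standard sinusoidal scheme from "Attention Is All You Need"; the formulation (not this module's actual code) can be sketched in NumPy as:

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encodings (standard Transformer formulation):
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    pos = np.arange(max_len)[:, None]        # (max_len, 1)
    i = np.arange(0, d_model, 2)[None, :]    # even dims: (1, d_model/2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)             # even indices get sin
    pe[:, 1::2] = np.cos(angles)             # odd indices get cos
    return pe

pe = positional_encoding(50, 16)             # one row per position
```

In the module itself this table would be added to the token embeddings (with dropout applied) inside forward(self, x).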
class capreolus.reranker.TK.TK_class(extractor, config)[source]

Bases: torch.nn.Module

Adapted from the original TK implementation. TK is a neural IR model that fuses transformer contextualization with kernel-based scoring: it uses one transformer block to contextualize the embeddings, then soft-histogram kernels to score the query-document interactions.

get_mu_matrix(self, extractor)[source]

Returns a matrix of mu values that can be directly subtracted from the cosine matrix. This is the matrix mu in Equation 5 of the paper.
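
A minimal NumPy sketch of such a mu matrix, assuming the common KNRM-style choice of one exact-match kernel at mu = 1.0 plus evenly spaced bin centers (the kernel count and spacing here are illustrative, not taken from this module):

```python
import numpy as np

def mu_matrix(n_kernels, qlen, dlen):
    # Hypothetical KNRM-style kernel means: one exact-match kernel at 1.0,
    # then evenly spaced bin centers over (-1, 1).
    bin_size = 2.0 / (n_kernels - 1)
    mus = [1.0] + [1.0 - bin_size / 2 - i * bin_size for i in range(n_kernels - 1)]
    mus = np.array(mus)                                   # (K,)
    # Broadcast so it can be subtracted directly from a cosine similarity
    # matrix expanded to shape (qlen, dlen, K).
    return np.broadcast_to(mus, (qlen, dlen, n_kernels))

M = mu_matrix(11, 3, 4)
```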

get_mask(self, embedding)[source]

Gets a mask of shape (seq_len, seq_len). This is an additive mask, so masked elements are set to -inf.
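
The additive-mask idea can be sketched as follows (the padding-detection input here is an assumption; the module derives it from the embedding):

```python
import numpy as np

def additive_mask(tok_is_pad):
    # tok_is_pad: boolean vector (seq_len,) marking padding tokens.
    seq_len = len(tok_is_pad)
    mask = np.zeros((seq_len, seq_len))
    # Added to attention logits before softmax, -inf zeroes out
    # attention to padded positions.
    mask[:, tok_is_pad] = -np.inf
    return mask

m = additive_mask(np.array([False, False, True, True]))
```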

get_embedding(self, toks)[source]

Overrides KNRM_Class’s get_embedding to return contextualized word embeddings

forward(self, doctoks, querytoks, query_idf)[source]
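
The soft-histogram kernel scoring that the forward pass builds on (KNRM-style RBF kernels over a query-document cosine similarity matrix) can be sketched in NumPy; the pooling shown here is the standard KNRM log-sum, and the sigma value is illustrative:

```python
import numpy as np

def kernel_scores(cosine, mus, sigma=0.1):
    # cosine: (qlen, dlen) query-document similarity matrix.
    # mus: (K,) kernel means; sigma: shared kernel width (assumed value).
    # RBF "soft-histogram" kernels: each kernel softly counts similarities
    # near its mean.
    k = np.exp(-(cosine[..., None] - mus) ** 2 / (2 * sigma ** 2))  # (qlen, dlen, K)
    per_query = k.sum(axis=1)                                       # (qlen, K)
    # Log-sum pooling over query terms gives one feature per kernel.
    return np.log(np.clip(per_query, 1e-10, None)).sum(axis=0)      # (K,)

cos = np.array([[1.0, 0.5, -0.2],
                [0.3, 0.9, 0.0]])
s = kernel_scores(cos, np.linspace(-1.0, 1.0, 5))
```

In the full model these per-kernel features are fed to a linear layer to produce the relevance score.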
class capreolus.reranker.TK.TK[source]

Bases: capreolus.reranker.Reranker

module_name = TK[source]
description = Sebastian Hofstätter, Markus Zlabinger, and Allan Hanbury. 2019. TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking. In TREC '19.[source]
score(self, d)[source]
test(self, d)[source]