NTCIR 2026 Tip-of-the-Tongue (ToT) Shared Task

Welcome to the guidelines for the upcoming 2026 edition of the NTCIR ToT shared task!

Guidelines

Important dates

Registration

Organizations wishing to participate in NTCIR 2026 must register.

Any questions about task registration must be sent to ntc-secretariat (at) nii.ac.jp. Task specific enquiries must be directed to diazf (at) acm.org.

Task definition

In terms of input and output, the ToT known-item identification task is relatively straightforward: given an input ToT request, output a ranked list of items. Each item can be any entity and must be identified by its Wikipedia page id, and the correct item should be ranked as high as possible. For each query, runs should return a ranked list of 1000 Wikipedia page ids. Runs will be evaluated using IR metrics appropriate for tasks with a single relevant document, such as discounted cumulative gain, reciprocal rank, and success@k.

The retrieval task is multilingual. Separate datasets will be provided to participants in English, Chinese, Japanese, and Korean. Participants can submit runs for one or more languages.

Datasets

(Coming soon!)

Submission and evaluation

Submission form: Coming soon! (You must register as a participant to submit a run.)

All submissions should be in the following runfile format. White space is used to separate columns. The width of the columns in the format is not important, but it is important to have exactly six columns per line with at least one space between the columns.

1 Q0 pid1    1 2.73 runid1
1 Q0 pid2    2 2.71 runid1
1 Q0 pid3    3 2.61 runid1
1 Q0 pid4    4 2.05 runid1
1 Q0 pid5    5 1.89 runid1

where:

- column 1 is the query id;
- column 2 is the literal string Q0;
- column 3 is the Wikipedia page id of the retrieved item;
- column 4 is the rank of the item (1 is best);
- column 5 is the retrieval score (descending with rank);
- column 6 is the run id, a label identifying your run.
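As an illustration, a run in this six-column format can be written with a few lines of Python. This is only a sketch: the function name, the shape of the `results` argument, and the score formatting are assumptions, not part of the official guidelines.

```python
def write_run(path, results, run_id):
    """Write a run in the six-column runfile format.

    results: dict mapping query id -> list of (page_id, score) pairs,
             assumed to be pre-sorted by descending score.
    run_id:  label identifying this run (sixth column).
    """
    with open(path, "w", encoding="utf-8") as f:
        for qid, ranking in results.items():
            for rank, (pid, score) in enumerate(ranking, start=1):
                # One line per retrieved item: qid, Q0, page id, rank, score, run id.
                f.write(f"{qid} Q0 {pid} {rank} {score:.4f} {run_id}\n")
```

Any whitespace-separated layout with exactly six columns per line is acceptable; the fixed-width padding in the example above the column list is purely cosmetic.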

Runs will be evaluated using metrics appropriate for retrieval scenarios with one relevant document. In particular, our primary evaluation metric for this year’s task will be discounted cumulative gain (DCG), but we may also compute other metrics such as reciprocal rank (RR) and success@k.