This repository contains information for teams participating in CLEF LongEval Task 1: LongEval-Retrieval.
The repository is organised as follows:
Submissions: This folder contains information about the submissions.
Scores: This folder will contain the performance scores of the submitted runs.
Resources: This folder contains the resources needed to prepare submissions.
LongEval-Retrieval is organised into two sub-tasks. Each team can participate in one or both sub-tasks.
Sub-task A, short-term persistence. In this task, participants are asked to examine retrieval effectiveness on test documents dated within two months of the documents available in the train collection.
Sub-task B, long-term persistence. In this task, participants are asked to examine retrieval effectiveness on documents published at least three months after the documents in the train collection.