CLEF 2020

From Openresearch
CLEF 2020
11th CLEF Conference and Labs of the Evaluation Forum
Event in series CLEF
Dates 2020/09/22 (iCal) - 2020/09/25
Homepage: https://clef2020.clef-initiative.eu/
Twitter account: @clef_initiative
Location
Location: Thessaloniki, Online

Important dates
Papers: 2020/05/15
Submissions: 2020/05/15
Notification: 2020/06/14
Camera ready due: 2020/06/28
Committees
General chairs: Evangelos Kanoulas, Theodora Tsikrika, Stefanos Vrochidis, Avi Arampatzis
PC chairs: Hideo Joho, Christina Lioma


Online Conference - Originally planned: Thessaloniki, Greece

CLEF 2020 will be an online-only event

CLEF 2020 is the 11th CLEF conference, continuing the popular CLEF campaigns that have run since 2000 and contributed to the systematic evaluation of information access systems, primarily through experimentation on shared tasks. Building on the format first introduced in 2010, CLEF 2020 consists of an independent peer-reviewed conference on a broad range of issues in the fields of multilingual and multimodal information access evaluation, and a set of labs and workshops designed to test different aspects of mono- and cross-language information retrieval systems. Together, the conference and the lab series will maintain and expand the CLEF tradition of community-based evaluation and discussion of evaluation issues.

Call for Papers

Important Dates

(Time zone: Anywhere on Earth)

  • Submission of All Papers: 15 May 2020
  • Notification of Acceptance: 14 June 2020
  • Camera Ready Copy due: 28 June 2020
  • Conference: 22-25 September 2020

Aim and Scope

The CLEF Conference addresses all aspects of information access in any modality and language. The conference includes the presentation of research papers and a series of workshops presenting the results of lab-based comparative evaluation benchmarks. CLEF 2020 Thessaloniki is the 11th year of the CLEF Conference series and the 21st year of the CLEF initiative as a forum for information retrieval (IR) evaluation. The conference has a clear focus on experimental IR as carried out within evaluation forums (e.g., CLEF Labs, TREC, NTCIR, FIRE, MediaEval, ROMIP, SemEval, and TAC), with special attention to the challenges of multimodality, multilinguality, and interactive search, also considering specific classes of users, such as children, students, and impaired users, in different tasks (e.g., academic, professional, or everyday-life). We invite paper submissions on significant new insights demonstrated on IR test collections, on analyses of IR test collections and evaluation measures, and on concrete proposals to push the boundaries of the Cranfield-style evaluation paradigm.

All submissions to the CLEF main conference will be reviewed on the basis of relevance, originality, importance, and clarity. CLEF welcomes papers that describe rigorous hypothesis testing, regardless of whether the results are positive or negative. CLEF also welcomes analyses of past runs, results, and data, as well as new data collections. Methods are expected to be described so that they are reproducible by others, and the logic of the research design should be clearly explained in the paper. The conference proceedings will be published in the Springer Lecture Notes in Computer Science (LNCS) series.

Topics

Relevant topics for the CLEF 2020 Conference include but are not limited to:


  • Information Access in any language or modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
  • Analytics for Information Retrieval: theoretical and practical results in analytics specifically targeted at information access, data analysis, data enrichment, etc.
  • User studies, based either on lab experiments or on crowdsourcing.
  • In-depth analysis of past results and runs, both statistical and fine-grained.
  • Evaluation initiatives: conclusions, lessons learned, impact, and projection of any evaluation initiative after it completes its cycle.
  • Evaluation: methodologies, metrics, statistical and analytical tools, component based, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
  • Technology transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
  • Interactive Information Retrieval evaluation: the interactive evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.
  • Specific application domains: Information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.
  • New data collections: presentation of new data collections with potentially high impact on future research, specific collections from companies or labs, multilingual collections.
  • Work on data from rare languages, and on collaborative and social data.



Committee

General Chairs

  • Evangelos Kanoulas, Univ. of Amsterdam, the Netherlands
  • Theodora Tsikrika, Information Technologies Institute, CERTH, Greece
  • Stefanos Vrochidis, Information Technologies Institute, CERTH, Greece
  • Avi Arampatzis, Democritus University of Thrace, Greece

Program Chairs

  • Hideo Joho, University of Tsukuba, Japan
  • Christina Lioma, University of Copenhagen, Denmark

Lab Chairs

  • Aurélie Névéol, LIMSI, CNRS, France
  • Carsten Eickhoff, Brown University, USA

Lab Mentorship Chair

  • Lorraine Goeuriot, Université Grenoble Alpes, France

Proceedings Chair

  • Linda Cappellato, University of Padua, Italy
  • Nicola Ferro, University of Padua, Italy


The conference now takes place online. Question: Is it correct to remove the location and write "Virtual event" there instead? Or "Online Conference" or "Video conference"? Should the original venue details (city + country) be recorded in the free-text field, e.g. "Originally planned: City, Country"? Which form is preferred? Britta Seeberg