Revision as of 15:14, 1 April 2021

CLEF 2021
11th CLEF Conference and Labs of Evaluation Forum
Event in series CLEF
Dates 2021/09/21 - 2021/09/24
Homepage: http://clef2021.clef-initiative.eu/
Twitter account: @clef_initiative
Submitting link: https://www.easychair.org/conferences/?conf=clef2021
Location: Bucharest, Romania

Important dates
Papers: 2021/05/03
Submissions: 2021/05/03
Notification: 2021/06/04
Camera ready due: 2021/06/25
Committees
General chairs: Bogdan Ionescu, K. Selcuk Candan
PC chairs: Henning Müller, Lorraine Goeuriot, Birger Larsen


Topics

Relevant topics for the CLEF 2021 Conference include but are not limited to:

  • Information Access in any language or modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
  • Analytics for Information Retrieval: theoretical and practical results in the analytics field specifically targeted at information access, data analysis, data enrichment, etc.
  • User studies, based either on lab experiments or crowdsourcing.
  • In-depth analysis of past results/runs, both statistical and fine-grained.
  • Evaluation initiatives: conclusions, lessons learned, impact, and projection of any evaluation initiative after completing its cycle.
  • Evaluation: methodologies, metrics, statistical and analytical tools, component-based evaluation, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
  • Technology transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
  • Interactive Information Retrieval evaluation: interactive evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.
  • Specific application domains: information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.
  • New data collections: presentation of new data collections with potentially high impact on future research, specific collections from companies or labs, multilingual collections.
  • Work on data from rare languages, collaborative data, and social data.