Research in personalized information retrieval (PIR) has generally been evaluated using small-scale user studies. This approach greatly limits the scope for comparative evaluation of alternative methods for exploiting information about users and their behaviour when adapting search to their needs.
The PIR-CLEF 2017 workshop at the CLEF 2017 Conference will bring together researchers working in PIR and related topics to explore the development of new methods for evaluation in PIR.
Since this is a pilot activity, we encourage participants to attempt the task using existing algorithms and to explore new ideas. We also welcome papers examining details of the task and the dataset.
PIR-CLEF 2017 invites participation in two distinct ways:
- Paper submission http://pirclef-2017.disco.unimib.it/papers-submission/
- Participation in a Pilot Task on Personalized Search – registered participants will receive the data collection via email. Registration at http://clef2017-labs-registration.dei.unipd.it/
– Abstract submission: June 16, 2017
– Paper submission: June 28, 2017
– Notification of acceptance: June 30, 2017
– Camera-ready version: July 3, 2017 (strict deadline!)
– Workshop date: September 12, 2017
– Labs registration opens: November 4, 2016
– Registration closes: April 21, 2017
The CLEF Conference addresses all aspects of information access in any modality and language. The conference includes the presentation of research papers and a series of workshops presenting the results of lab-based comparative evaluation benchmarks. CLEF 2017 is the 8th year of the CLEF Conference series, and the 18th year of the CLEF initiative as a forum for information retrieval (IR) evaluation. The CLEF conference has a clear focus on experimental IR as carried out within evaluation forums (CLEF Labs, TREC, NTCIR, FIRE, MediaEval, RomIP, SemEval, TAC, …), with special attention to the challenges of multimodality, multilinguality, and interactive search.