Call for Reproducibility Papers


ECIR strongly encourages the submission of papers that repeat, reproduce, generalise, and analyse prior work. Please refer to the ACM “Artifact Review and Badging” guidelines for consistent use of the terminology, which is heterogeneous across disciplines.

In particular, we solicit replicability (different team, same experimental setup) and reproducibility (different team, different experimental setup) papers. Repeatability papers (same team, same experimental setup), i.e., submissions by the authors of the original work, will not be accepted.

Reproducibility is key to establishing research as reliable, referenceable, and extensible for the future. Emphasise your motivation for selecting the paper(s), how you attempted to reproduce the results (whether successful or not), the communication required to gather all necessary information, the difficulties encountered, and the outcome of the process. A successful reproduction of the work is not a requirement, but it is crucial to provide a precise and rigorous evaluation of the process so that lessons can be learned for the future.

Submission Guidelines

  • Reproducibility papers can be up to 12 pages in length plus additional pages for references. Appendices count toward the page limit.
  • Reproducibility papers will be refereed through double-blind peer review, with an initial review stage followed by a second stage of discussion led by a meta-reviewer.
  • Authors are asked to upload their software artifacts either (i) to an anonymous repository linked to in the submission or (ii) to EasyChair as part of the submission.

Authors should consult Springer’s author guidelines and use their proceedings templates, either LaTeX or Word, for the preparation of their papers. Springer encourages authors to include ORCIDs in their papers.

All submissions must be written in English and submitted electronically via EasyChair.


In addition, the corresponding author of each accepted paper, acting on behalf of all of the authors of that paper, must complete and sign a Consent-to-Publish form. The corresponding author signing the copyright form should match the corresponding author marked on the paper. Once the paper has been submitted, changes relating to its authorship cannot be made.

Accepted papers will be published in the conference proceedings in the Springer Lecture Notes in Computer Science series. The proceedings will be distributed to all delegates at the conference. Accepted papers must be presented in person at the conference, and at least one author will be required to register.

Ethics and professional conduct

ECIR 2024 expects authors (as well as the PC and the organising committee) to adhere to accepted standards of ethics and professionalism in our community.

Dual submission policy

Papers submitted to ECIR 2024 should be substantially different from papers that have been previously published, or accepted for publication, or that are under review at other venues. Exceptions to this rule are:

  • Submission is permitted for papers presented, or to be presented, at conferences or workshops without proceedings.
  • Submission is permitted, though discouraged as it places anonymity at risk, for papers that have been made available before or during the reviewing process of ECIR 2024 as a technical report (e.g., in institutional archives or preprint servers such as arXiv). If a technical report is available, we advise you (i) not to use the same title and abstract for your ECIR submission (in case of acceptance, this can be changed back); (ii) not to cite your technical report; and (iii) to make an effort to avoid any issues that may compromise the double-blindness of your submission.

Review Criteria

All reproducibility-track papers will be evaluated along the following criteria (when applicable):


  • Was key practical information (algorithms, parameter settings, software libraries, data collections) missing from the original paper?
  • Was the original work insufficiently supported from a theoretical point of view?
  • Were the original experiments unclear about important points, or did they lack confirmation for some of the original claims?
  • Are there new baselines and experiments presented in the reproduced paper?
  • Is the reproduced paper proposing new evaluation criteria (new measures, statistical tests, etc.)?


  • How important is the reproduction of the experiments to the community?
  • How obvious are the conclusions achieved?
  • Do the reproduced prior works, if validated, advance a central topic to information retrieval (a topic with broad applicability or focused on a hot research area)?
  • What is the impact of the original paper? Is it central or marginal to the community?


  • Is the evaluation methodology in line with the research challenges addressed by the reproduced experiment?
  • Are the selected baselines representative of the various algorithm types and techniques available?
  • Is the parameter/hyperparameter setting properly described?
  • Are algorithms and baselines adequately tuned?


  • Are the code and datasets used to reproduce the experiments available to the reviewers at the time of review?
  • Is the shared material released in a permanent repository for easy access by researchers?
  • Are the reproduced experiments well documented, with all the details required for other researchers to reproduce the experiments?
  • Are there discrepancies between what is described in the paper and what is available in the shared material?
  • Is the shared material complete, including everything needed to replicate the experiments exactly?


Reproducibility paper track dates

  • Reproducibility paper abstract submission: October 4, 2023, 11:59pm (AoE)
  • Reproducibility paper submission: October 11, 2023, 11:59pm (AoE)
  • Reproducibility paper notification: December 14, 2023
  • Main conference: March 25-27, 2024

Reproducibility paper track chairs

  • Claudia Hauff (Spotify and TU Delft, Netherlands)
  • Hamed Zamani (University of Massachusetts Amherst, USA)
  • Contact: ecir24-reproducibility at