While the focus of most research papers in the Semantic Web community is on proposing novel ideas and approaches, deeper and wider evaluations of these ideas and approaches are often lacking. The aim of the Replication Studies Track at ESWC 2021 is to provide a platform for presenting such evaluations of previously published approaches in order to shed light on important, possibly overlooked aspects of these approaches.
More concretely, a submission to this track may focus on a paper published recently in a Semantic Web conference or journal and ask whether the results are generalizable beyond the scope of the evaluation presented in the original paper. That is, the original paper may have made restrictive assumptions about the evaluation datasets and protocol, and the replication study may relax these assumptions. For instance, the original evaluation may have been carried out on the DBpedia dataset, and it remains an open question how the proposed approach performs on other datasets. Another example is approaches that can, in principle, be applied independently of the language of the datasets, but whose original evaluation was carried out only on English data.
Another type of submission may look at earlier papers and ask whether the results are still valid given that the world has changed in the meantime. For instance, given that computers have become more powerful, what used to be a big dataset at the time of the original paper may be small by today's standards; similarly, the character limit of Twitter has been increased to 280, while it was 140 at the time some original paper proposed a semantic tagging approach for these short messages.
Yet another type of submission may compare approaches that have not been (properly) compared before. For instance, some approaches may have been evaluated only against weak baselines or weak competitors in the original papers, two original approaches may not have been compared with one another because they appeared around the same time, or relevant metrics may not have been considered in the original comparisons.
Relevant criteria for submissions to this track are the following:
- The presented replication study is relevant and provides new insights about the studied approaches.
- The replication study has been conducted with scientific rigor and is presented accordingly; that is, the submission must include a detailed description of the experimental setup and of the observations, as well as an in-depth discussion of these observations. If the experimental setup deviates from the one in the original work, the submission has to thoroughly argue for the newly chosen setup. Hence, it is not sufficient to simply show that some approach or system does not work on your data or does not run on your machine; instead, you have to argue for the relevance of your findings and present insights that go beyond a simple "does not work."
- The replication study has to be reproducible in itself (e.g., open code and data).
As a new theme in 2021, ESWC encourages the submission of negative results papers. Replication studies that challenge a previously accepted truism, expose limitations in the assumptions of existing work, or question the internal validity of previously published results fit very well into this theme. Specific instructions for negative results papers can be found here.
Delineation from Other Tracks
Submissions that focus on introducing new datasets or new benchmarks for evaluating existing (and future) approaches and systems should be reported as part of the Resources Track.
Information on deadlines and submission formats can be found here.
Track Chairs
- Olaf Hartig, Linköping University
- Mauro Dragoni, Fondazione Bruno Kessler