On our blog, we occasionally publish controversial ideas that, we believe, may spark discussions that advance the recommender-system community.
These blog posts are as follows.
SIGIR 2020 is the premier venue for research in the field of Information Retrieval. I am typically on the program committee at SIGIR. Recently, i.e., over the past two years or so, I have had the feeling that the number of submissions relating to recommender systems has strongly increased at SIGIR, especially this year. My impression seems to be confirmed today: According […]
The data mining community has it, the information retrieval community has it, and the machine learning & AI community has it: (top) journals. It seems that only the recommender-system community has no journal dedicated exclusively to its community. Top recommender-system articles that exceed the ~8-page limit of the ACM RecSys Conference are hence scattered across many different […]
The recommender-system community is facing a reproducibility crisis. This has recently been demonstrated by the authors of the paper Are we really making much progress? A worrying analysis of recent neural recommendation approaches (Maurizio Ferrari Dacrema, Paolo Cremonesi, Dietmar Jannach). However, the crisis is not new, and was recognized (at least) a decade ago […]
The number of submissions to and participants at the ACM Conference on Recommender Systems is increasing year by year. Sometimes, the conference is even sold out or moves on short notice to a new, larger venue. Acceptance rates decreased from 46% in its first year (2007) to below 20% in recent years. Among the rejected papers, there are […]