Dear members of the Psychological Institute
The fourth newsletter of the Open Science Initiative at the Psychological Institute once again promises exciting reading on the topic of open science. One focus of this issue is an interview with the Editor in Chief of Swiss Psychology Open, Prof. Dr. Nicolas Rothen, along with a collection of tips and tricks on code review. We also highlight the winners of the Open Science Prize 2023 and upcoming events.
We welcome questions, suggestions, and contributions to this newsletter at openscience@psychologie.uzh.ch – the next newsletter will appear at the beginning of HS23 (fall semester 2023).
Kind regards
Your Open Science Initiative
Newsletter content
When we submit our research reports to a journal, reviewers will typically not (only) point out the many strengths of our studies and the advances they make, but also dwell on the problematic aspects of the report. So far so good! Isn't it great to have these unpaid laborers point out spelling mistakes, inconsistencies in theorizing, and papers we forgot to cite? Perhaps they are even aware of a more rigorous type of analysis that we can run to increase the weight of our conclusions. But what if they find flaws in the design of our study or the measurement of constructs? At this point, we might regret not having conceived our research as a registered report (see our first Newsletter).
The Open Science Initiative has recently been asked for advice on where to submit a registered report to get valuable peer review before the research is undertaken. There are over 300 journals offering this submission format (see https://www.cos.io/initiatives/registered-reports). One of the entries in this long list sparked our interest because it is not really a journal: the Peer Community In Registered Reports (PCI-RR, https://rr.peercommunityin.org). They describe themselves as "a non-profit, non-commercial platform that publishes the peer-reviews of preprints". This turns out to be a very interesting service because, if they recommend your paper, selected journals will publish it without further review.
When you submit your registered report there, you will get peer review and the opportunity to revise the manuscript. When it is accepted, "the revised manuscript is posted at the preprint server where the preprint is hosted, and the peer reviews and recommendation of the preprint are posted at the PCI website. Authors then have the option to also publish the preprint in a traditional journal."
Well, that is odd – at the time of writing, the list of PCI RR-friendly journals includes only 28 journals, so many "traditional journals" are not (yet) on the list. We thought it would be good to talk to a representative of one of these progressive journals, and we're glad that Prof. Dr. Nicolas Rothen, Editor in Chief of Swiss Psychology Open (SPO, https://swisspsychologyopen.com), agreed to talk to us.
Open Science Initiative (OSI): Nicolas, is Swiss Psychology Open a traditional journal (*chuckles*)?
Nicolas Rothen: I would say “yes”, very much so. We publish scientific research articles like other journals in the field. We are probably a bit less traditional than other journals in the sense that we are a community-based journal. We are the official journal of the Swiss Psychological Society. As such, we are not interested in profit. Our primary aim is to support the researchers who build the scientific society more generally on an international level (i.e., not limited to the Swiss Psychological Society). With that we – an editorial team of idealists – are open to innovations driven by society to support best practices in research and publishing. This starts with the fact that we are an open access journal which publishes all articles under a CC-BY-4.0 license where all rights remain with the authors. We are aware of the replication crises and demand that all authors share at least their anonymized non-aggregated raw data. We are also aware that this is not enough and are thus supporting initiatives like the PCI-RR to sustainably change how we, as a society, conduct, report, and publish our research.
OSI: So, let's assume my registered report is recommended by PCI-RR – will you publish it as is?
Nicolas Rothen: Yes, absolutely. As a PCI-RR-friendly journal we are committed to accepting the judgment of PCI-RR without performing additional peer review. Moreover, although not directly related to PCI-RR, we are currently in the great position that we can generously waive all fees for articles published in SPO (i.e., article processing charges, APCs), if none of the authors of an accepted manuscript has the funding to cover the charges.
OSI: From your point of view, what is the advantage of submitting a registered report to PCI-RR rather than directly to a journal?
Nicolas Rothen: In my opinion, the big advantage is that registered reports in PCI-RR are completely free of charge, in contrast to the exorbitant APCs of some very prestigious open access journals. That is, the manuscript is published on a pre-print server and the entire editorial correspondence including the final recommendation is published by PCI-RR. After this recommendation, the authors can select from a list of PCI-friendly journals, some of which are very generous in terms of waiving APCs, as for instance Swiss Psychology Open. With this model, there is less room for potential conflicts of interest. The editor (recommender in PCI-RR terms) is independent and does not act in the name of a journal, but in the name of a scientific society. That is, their decisions will be entirely based on good research practices. Moreover, PCI-RR is dedicated to inclusiveness and equity. To avoid potential biases articles can be submitted anonymously and reviews can be done anonymously. Another advantage is that the authors can select the journal after their article has been recommended for publication. This can reduce the time pressure to collect the data, because some journals have deadlines for the collection of the data after a manuscript has been granted “in principle acceptance” when registered reports are directly submitted to the journal.
OSI: Is there anything else you would like to add about SPO?
Nicolas Rothen: Indeed, as a society-based journal, we are always happy to receive suggestions to improve the publication experience of researchers in the field. For instance, we recently introduced scheduled reviews, which should speed up the initial review of a manuscript considerably. If you are planning to submit a research article in 4-5 weeks, you inform us of the submission date in advance, including a one-page summary of your research, and we will secure the reviewers before your submission.
Finally, I am convinced that society-based open access journals with fair APCs are the way to make a sustainable change when it comes to documenting our research.
OSI: Thank you very much!
Peer review of scientific work is a key part of quality control in science. Manuscripts submitted to scientific journals are routinely sent to peer reviewers by journal editors. In contrast, checks of the computational code or scripts used for data analyses and for running computer software are still relatively rare in the social sciences. Code review refers to the process of systematically checking code, either one's own or someone else's, after it has been written for some goal.
Computationally reproducible code is an important aspect of an open and transparent science, and there is an increasing emphasis on learning the skills to code. There is also agreement that open data should come with understandable code to provide computational reproducibility. However, there has not been as much emphasis on the skills to evaluate code. Perhaps one reason is that, across most scientific disciplines, analysis code is rarely made available in the first place (for an overview, see Artner et al., 2021; Götz & O'Boyle, 2023; Serghiou et al., 2021); it is still not very common to submit computational code together with manuscripts.
Key questions that could be addressed during code review are the following: Is the code legible and clear? Does the code follow common good practices? Does the code do what was intended? Can an analysis be reproduced with this code? It is also important to understand what code review does not mean, namely debugging, providing statistical support, or training in coding; these issues should be addressed separately.
If you prepare a package for code review as an author, it will likely include a "README" file that describes the project (mentioning credits to specific persons and licensing); any outputs that the reviewers should try to reproduce; all data used to create those outputs; all code necessary to recreate the outputs; and a main script that runs any subscripts in the relevant order.
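To make the last point concrete, here is a minimal sketch (not taken from any specific review package) of what such a main script could look like in Python; the file names run_all.py, 01_clean_data.py, 02_fit_models.py, and 03_make_figures.py are hypothetical placeholders for your own subscripts.

# run_all.py - hypothetical entry point for a code-review package.
# Runs each analysis subscript in order and stops at the first failure,
# so a reviewer can try to reproduce all outputs with a single command.
import subprocess
import sys
from pathlib import Path

SUBSCRIPTS = [
    "01_clean_data.py",    # read raw data, write the cleaned data set
    "02_fit_models.py",    # run the statistical analyses
    "03_make_figures.py",  # recreate the figures reported in the manuscript
]

def main() -> None:
    root = Path(__file__).resolve().parent
    for script in SUBSCRIPTS:
        print(f"Running {script} ...")
        result = subprocess.run([sys.executable, str(root / script)])
        if result.returncode != 0:
            sys.exit(f"{script} failed with exit code {result.returncode}")
    print("All outputs reproduced.")

if __name__ == "__main__":
    main()

A reviewer can then work through the questions listed above (legibility, good practices, intended behaviour, reproducibility) starting from one self-contained entry point rather than hunting through scattered scripts.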
Lack of time, expertise, or incentives might be reasons why researchers don't ask others to review their research code. Indeed, a frequent issue appears to be the "embarrassment for others to see one's own code", according to Dr. Lisa DeBruine, a professor in the Institute of Neuroscience and Psychology at the University of Glasgow. DeBruine has recently given several presentations on the topic of code review and has initiated a "code-check club" that is open to everyone who is interested in the topic and wants to join (see https://github.com/code-check-club). Hopefully, this will help researchers develop their skills and criteria for evaluating code, and encourage people to share their code with fewer feelings of embarrassment!
References
Artner, R., Verliefde, T., Steegen, S., Gomes, S., Traets, F., Tuerlinckx, F., & Vanpaemel, W. (2021). The reproducibility of statistical results in psychological research: An investigation using unpublished raw data. Psychological Methods, 26(5), 527–546. https://doi.org/10.1037/met0000365
DeBruine, L. (2023, February 25). Code-check club. GitHub. https://github.com/code-check-club
Götz, M., & O’Boyle, E. H. (2023). Cobblers, let’s stick to our lasts! A song of sorrow (and of hope) about the state of personnel and human resource management science. Research in Personnel and Human Resources Management, 41.
Serghiou, S., Contopoulos-Ioannidis, D. G., Boyack, K. W., Riedel, N., Wallach, J. D., & Ioannidis, J. P. A. (2021). Assessment of transparency indicators across the biomedical literature: How open is open? PLOS Biology, 19(3), e3001107. https://doi.org/10.1371/journal.pbio.3001107
The Open Science working group has now awarded prizes for the exemplary implementation of open science practices for the fourth time. We would especially like to thank the sponsors UFSP Dynamik gesunden Alterns and psych alumni for once again providing financial support!
In the ExPra category, the Open Science Prize 2023 (sponsored by psych alumni) went to the ExPra group of Livia Heer, Denise Küttel, Ulysse Marendaz, and Anna Sophie Rosch (Professur Sozialpsychologie) for their ExPra report entitled "Cross-Valence Inhibition in Evaluating Attributes of Everyday Situations".
In the master's thesis category, the Open Science Prize 2023 (sponsored by psych alumni) went to Nathalie Appenzeller (Professur Methoden der Plastizitätsforschung) for her master's thesis entitled "Forking a Way Through the Replication Crisis in Psychology: A Conceptual Replication of a Multiverse Analysis on Frontal Alpha Asymmetry in Depression on a Sample of Children and Adolescents".
In the category of papers by doctoral students, the Open Science Prize 2023 (sponsored by the UFSP Dynamik gesunden Alterns) went to Hannah Dames (Professur Allgemeine Psychologie (Kognition)) for her publication entitled "Directed forgetting in working memory".
In the category of papers by postdoctoral researchers, the Open Science Prize 2023 (sponsored by the UFSP Dynamik gesunden Alterns) went to Sebastian Horn (Professur Entwicklungspsychologie: Erwachsenenalter) for his publication entitled "Adult age differences in remembering gain- and loss-related intentions".
By the way: applications for the Open Science Prize can now be submitted at any time. The deadline for consideration for the Open Science Prize 2024 is January 31, 2024. Further information is available at https://www.psychologie.uzh.ch/de/bereiche/open-science/preis.html
This newsletter is published once per semester. Contact for questions about the newsletter: openscience@psychologie.uzh.ch
Because this newsletter is published only once per semester, we cannot announce events that are scheduled at short notice. To stay up to date on training opportunities and scholarly exchange about open science at UZH, such as the ReproducibiliTea Journal Club, we recommend subscribing to the mailing list of the UZH Center for Reproducible Science.
Current members of the Open Science working group
Prof. Dr. Johannes Ullrich (chair); Dr. Walter Bierbauer; Prof. Dr. Renato Frey; Dr. Martin Götz; Dr. Sebastian Horn; Dr. André Kretzschmar; Prof. Dr. Nicolas Langer; M.Sc. Zita Mayer; Dr. Susan Mérillat; Lic.-Phil. Jörg Schlatter; Prof. Dr. Carolin Strobl; Dr. Lisa Wagner; Dr. Katharina Weitkamp