

Authors
Tony Ross-Hellauer
Language
English
Keywords
reproducibility, research integrity, TIER2 project, open science
License
CC BY-SA 4.0 International
Target audience
Researchers, research support staff, and other stakeholders interested in reproducible research practices
Prerequisites
None
Abstract
This short module introduces the concept of reproducibility, its importance, and the challenges in achieving it. You will also learn about the TIER2 project, its goals, and how it supports reproducible research practices.
Learning outcomes
By the end of this module, learners will be able to:
- Describe the basic concept of reproducibility in research.
- Explain why reproducibility is important for scientific credibility and trust.
- Identify key challenges that make reproducibility difficult to achieve.
- Summarize the goals of the TIER2 project and how it supports reproducible research practices.

Authors
Sven Ulpts • Jesper W. Schneider
Language
English
Keywords
Epistemic diversity, relevance, feasibility, epistemology, reproducibility
License
CC BY-SA 4.0 International
Target audience
Funders, publishers, researchers, and anyone confronted with issues of reproducibility
Prerequisites
None
Abstract
This module examines the impact of epistemic diversity on reproducibility. It introduces the conceptual complexity surrounding reproducibility and related terms, explores how epistemic differences shape the feasibility and relevance of reproducibility across disciplines, and presents the Knowledge Production Modes (KPM) framework as a tool for assessing reproducibility in diverse research contexts. The module consists of three main parts and concludes with an assessment quiz.
Learning outcomes
By the end of this module, learners will be able to:
- Identify and understand the conceptual confusion surrounding reproducibility and related terms across and within disciplines.
- Analyze how different epistemic contexts affect the interpretation of reproducibility and its implications.
- Apply the Knowledge Production Modes (KPM) framework to assess the relevance and feasibility of reproducibility in diverse epistemic settings.

Authors
Eva Kormann
Tony Ross-Hellauer
Language
English
Keywords
Tools, practices, interventions, state of the evidence, open science, reproducibility
License
CC BY-SA 4.0 International
Target audience
Researchers, funders, publishers, research support staff, open science trainers
Prerequisites
None
Abstract
This module introduces key tools and best practices that enhance transparency and reproducibility across the research lifecycle. It provides an overview of the research process and highlights common sources of bias or error before presenting a suite of practical interventions—including preregistration, data management plans, open lab notebooks, open-source analysis tools (e.g., R, Python, Jupyter/Quarto, Docker), and data and code sharing practices.
The module also covers templates for documenting deviations from preregistration, reporting guidelines and checklists such as those offered by the EQUATOR Network, as well as preprints and good peer-review practices. A 20-minute video presentation and supporting materials illustrate how these tools apply across different epistemic contexts and summarise the current state of evidence on which interventions are most effective in strengthening research integrity and accountability.
Learning outcomes
By the end of this module, learners will be able to:
- Understand key concepts: Explain the importance of transparency and reproducibility in research and their role in supporting scientific integrity.
- Identify tools and practices: Recognise a range of practices that promote transparency, such as preregistration, data sharing, and reporting guidelines.
- Evaluate effectiveness: Assess current evidence on which reproducibility-enhancing interventions most effectively improve research quality.
- Apply best practices: Describe how to implement practical tools, including preregistration templates and transparent reporting checklists.
- Make informed decisions: Select and apply appropriate tools and workflows to improve rigor, accountability, and openness in their own research.
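As a brief illustrative aside, two of the practices this module covers — fixing random seeds and recording the software environment alongside results — can be sketched in a few lines of Python. The example below is a hypothetical sketch, not part of the module materials; the provenance fields shown are illustrative choices.

```python
# Hypothetical sketch of two reproducibility practices: fixing a random
# seed so an analysis is repeatable, and recording provenance (software
# environment, seed, result) that another researcher would need to rerun it.
import json
import platform
import random

SEED = 42
random.seed(SEED)  # fixed seed: rerunning yields the same "data" every time
sample = [random.randint(1, 100) for _ in range(5)]

# Record provenance alongside the result, e.g. to ship with shared code/data.
provenance = {
    "python_version": platform.python_version(),
    "seed": SEED,
    "n": len(sample),
    "mean": sum(sample) / len(sample),
}
print(json.dumps(provenance, indent=2))
```

Rerunning the script reproduces the identical sample and mean, because the seed is pinned; in practice the same idea extends to tools such as Docker images or Quarto documents, which pin the wider environment rather than a single generator.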

Abstract
This module provides participants with practical skills to implement reproducibility tools and strategies in research or institutional workflows. It is structured around TIER2’s seven pilots: Reproducibility Management Plans (RMPs); Reproducible Workflows (life & computer sciences); Checklists for Computational Social Science; Reproducibility Promotion Plans for Funders (policy templates); a Reproducibility Monitoring Dashboard (tracking reusability of outputs); Editorial Workflows to improve data sharing; and an Editorial Reference Handbook for Reproducibility and FAIRness (publisher checks). It emphasizes hands-on exercises and discussions to help learners apply tools and foster transparency and reliability in research.
Authors
Eleni Adamidi; Panagiotis Deligiannis; Nikos Foutris; Thanasis Vergoulis; Fakhri Momeni; Sarah Sajid; Joeri Tijdink; Barbara Leitner; Alexandra Bannach-Brown; Friederike Elisabeth Kohrs; Petros Stavropoulos; Stefania Amodeo; Haris Papageorgiou; Thomas Klebel; Eva Kormann; Matthew Cannon; Allyson Lister; Susanna-Assunta Sansone; Rebecca Taylor-Grant
Language
English
Keywords
reproducibility; computational reproducibility; editorial guidelines; FAIR principles; monitoring dashboard; open science; policy development; reproducibility checklists; reproducibility management; reproducible workflows; research evaluation; research integrity; research transparency; scientific publishing
License
CC BY-SA 4.0 International
Target audience
Researchers, research organizations, funders, publishers
Prerequisites
None
Learning outcomes
By the end of this module, participants will be able to implement tools and practices developed or extended through the seven pilots of the project to enhance research reproducibility.

Authors
Joeri Tijdink
Barbara Leitner
Language
English
Keywords
open science, reproducibility, replication, qualitative research
License
CC BY-SA 4.0 International
Target audience
Funders, funding institutions
Prerequisites
None
Abstract
This module provides an overview of a funder-focused policy document offering practical recommendations to promote reproducibility. It explains how funders can adopt clear definitions, create incentive structures, and implement evaluation and monitoring processes that strengthen reproducibility practices within their funding programmes. It also summarises best practices gathered through stakeholder workshops and introduces a reproducibility promotion plan designed to support funders in improving research quality and accountability.
Learning outcomes
By the end of this module, learners will be able to:
- Recognise the importance of reproducibility for funders, including its role in improving trust, research quality, and return on investment.
- Implement key recommendations that help funders embed reproducibility into policies, incentives, and monitoring workflows.
- Develop a provisional plan for enhancing reproducibility within their organisation, drawing on lessons learned from pilot funder collaborations.

Abstract
This module introduces publishers to the Editorial Reference Handbook, a practical resource collaboratively developed by academics and publishers. The handbook helps in-house editorial staff operationalise a set of checks that foster good practices for sharing datasets, software, materials and other digital objects.
Learning outcomes
By the end of this module, publishers will:
- Know what checks to perform and how to implement them in practice
- Learn how to improve clarity of data policies and guidance to authors (especially in terms of which standards and repositories to use)
- Gain practical guidance on making Availability Statements clearer and more rigorous

Abstract
This module examines reproducibility challenges in qualitative research, covering methodological, epistemological, and practical considerations.
The module consists of four parts and ends with an assessment quiz:
- Part 1: Reproducibility and Qualitative Research
- Part 2: Open Science and Qualitative Research
- Part 3: How to do Open Qualitative Research
- Part 4: How to Support Open Qualitative Research
This module is based on the review paper Cole, N. L., Ulpts, S., Bochynska, A., Kormann, E., Good, M., Leitner, B., & Ross-Hellauer, T. (2024, December 23). Reproducibility and replicability of qualitative research: an integrative review of concepts, barriers and enablers. https://doi.org/10.31222/osf.io/n5zkw_v1
Learning outcomes
By the end of this module, learners will be able to:
- Understand the relationship between reproducibility and qualitative research
- Understand the relationship between qualitative research and open science practices
- Know which open science practices are possible for qualitative research
- Know how established qualitative research practices support transparency

Abstract
This learning module explains the critical relationship between reproducibility and the trustworthiness of artificial intelligence (AI). It shows how reproducibility affects credibility and explains the different levels of AI reproducibility that researchers may strive for. The module also examines barriers, such as inconsistent data collection and lack of transparency, as well as drivers, such as standardized practices and tools that can support reproducibility. Finally, it illustrates how these barriers and drivers interact, fostering an understanding of this interaction in order to enhance the reproducibility of AI systems and thereby support more reliable and valuable research practice in the field of AI.
Learning outcomes
By the end of this module, learners will be able to:
- Understand the relationship between reproducibility and the trustworthiness of AI
- Know the different levels of AI reproducibility
- Be aware of barriers to and drivers of reproducibility
- Understand how barriers, drivers, and the degree of AI reproducibility relate to one another