To apply for this internship, you must create an account with your HvA email address.

Company information

Gerben ter Riet
Tafelbergweg 51, Room C1.18
1105 BD
Amsterdam
Netherlands
Research and Education

Automating the monitoring of compliance with Open Science principles


The replicability crisis in science shocked the scientific community. Why can so many study results not be replicated? The crisis sparked interest in Open Science, in which transparency and sharing of data and code are central. We have formulated 15 open science-related criteria, and we would like to automate the monitoring of how well our research projects and researchers comply with them.

Play it FAIR: automating the monitoring of compliance with Open Science principles

Background: The replicability crisis in science shocked the scientific community [1]. Why is it that so many study results, even those published in the highest-impact journals, cannot be repeated? The crisis has triggered increasing interest in open science approaches, in which transparency and sharing of data and code are key factors. We at the Amsterdam University of Applied Sciences (AUAS) run an eight-year research program with open science ambitions. We formulated 15 open science-related criteria (see the checklist at the bottom) and we would like to automate the (annual) monitoring of how well our research projects and researchers comply with these criteria. We hope to be able to show progress in the remaining period up to 2025 and to get indications along the way of where progress is slow. As an example, a recent paper describes the tool SciScore, which automatically checks the methods sections of research papers for adherence to certain methodological rigor criteria [2].

Objective: design, develop, and test a high-performance, efficient, (semi-)automated monitoring system that allows assessment of adherence to a carefully selected set of open science criteria in a research program spanning three faculties.
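
As a first impression of what such a system might look like internally, here is a minimal Python sketch of a possible data model for the criteria and per-project results. All names and structures here are illustrative assumptions, not a prescribed design.

    # Illustrative sketch only: a possible internal representation for the
    # monitoring tool. Names and structures are assumptions, not requirements.
    from dataclasses import dataclass, field

    @dataclass
    class Criterion:
        number: int      # 1-15, as in the checklist below
        question: str    # the checklist question
        automated: bool  # can adherence be checked without human input?

    @dataclass
    class ProjectAssessment:
        project: str
        results: dict[int, bool] = field(default_factory=dict)  # criterion number -> adhered?

        def adherence_rate(self) -> float:
            """Share of assessed criteria that the project adheres to."""
            if not self.results:
                return 0.0
            return sum(self.results.values()) / len(self.results)

    # Example: one criterion, and one project assessed on three criteria
    crit4 = Criterion(4, "Do your team members all have an ORCID?", automated=True)
    a = ProjectAssessment("Mensen in Beweging: pilot project")
    a.results = {4: True, 9: False, 14: True}
    print(f"{a.adherence_rate():.0%}")  # prints 67%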

Methods: critically review the 15 open science indicators and, if needed, narrow the focus to the most relevant subset. Design, develop, and test a smart, maximally automated tool to assess the extent of adherence (by researchers and/or projects) to the criteria. Design an attractive way to visualize adherence and its development over time.
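
As an illustration of the visualization step, the following sketch plots a program-wide adherence rate per annual monitoring round, assuming matplotlib is available. The numbers are invented purely for illustration.

    # Minimal visualization sketch: adherence over annual monitoring rounds.
    # The figures below are fictitious and only illustrate the idea.
    import matplotlib.pyplot as plt

    years = [2021, 2022, 2023, 2024, 2025]
    adherence = [0.40, 0.55, 0.60, 0.72, 0.80]  # invented program-wide rates

    plt.plot(years, adherence, marker="o")
    plt.ylim(0, 1)
    plt.xticks(years)
    plt.ylabel("Share of criteria met")
    plt.title("Open science adherence over time (illustrative)")
    plt.savefig("adherence.png", dpi=150)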

Results/products: (i) a definitive set of open science criteria (see the list below as a starting point); (ii) a software tool for the automatic assessment of adherence; (iii) a tested tool, refined and retested where necessary. Present the product to the user group and write your thesis or a scientific report for an international journal or platform.
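
For the testing step, unit tests could pin down the tool's behavior. Below is a minimal pytest sketch against the hypothetical ProjectAssessment class from the data-model sketch above; the module name "monitoring" is an assumption.

    # Hypothetical pytest sketch for the scoring logic; "monitoring" is an
    # assumed module name for the data-model sketch shown earlier.
    from monitoring import ProjectAssessment

    def test_adherence_rate_counts_only_assessed_criteria():
        a = ProjectAssessment("demo")
        a.results = {4: True, 9: False}
        assert a.adherence_rate() == 0.5

    def test_empty_assessment_scores_zero():
        assert ProjectAssessment("demo").adherence_rate() == 0.0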

Approximate time frame: 5-7 months

Month 1: Read up on open science background; mini-review of existing tools [3]; decide on definitive indicators. Comment: why are we doing this? Replicability crisis; selective outcome reporting (bias).

Months 2-5: Design, develop, and test tools; refine. Comment: software development and testing.

Months 5-7: Report writing.

References

1. https://www.nature.com/articles/s41467-019-14203-0

2. https://www.sciscore.com/

3. https://www.zonmw.nl/fileadmin/user_upload/1_Automation_Report_FINAL.pdf

4. https://osf.io/

5. https://www.bihealth.org/en/research/quest-center/mission-approaches/open-science/quest-opening/


 

Open Science Checklist for ‘Mensen in Beweging’ Projects

This list of 15 questions was inspired by open science, the REWARD movement, and the replicability crisis.

 

Developing the research objective and proposal

1.       What (lack of) evidence justifies your research? (systematic literature search; RP1)

2.       (How) did you involve end-users/professional practice? (RP1)

3.       Which reporting guidelines will you be using? (see e.g. the EQUATOR Network; RP3)

 

Overall

4.       Do you(r team members all) have an ORCID? (see the verification sketch after this block)

5.       How do you ensure participants’ privacy (rights)?
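
Criterion 4 is a natural candidate for automation. As a sketch (not part of the brief): ORCID offers a public read API at https://pub.orcid.org/v3.0/, so a checker could verify that a given iD resolves to a public record.

    # Sketch: automated check for criterion 4 via ORCID's public API.
    # Error handling is deliberately minimal.
    import requests

    def orcid_exists(orcid_id: str) -> bool:
        """True if the iD resolves to a public ORCID record."""
        r = requests.get(
            f"https://pub.orcid.org/v3.0/{orcid_id}",
            headers={"Accept": "application/json"},
            timeout=10,
        )
        return r.status_code == 200

    print(orcid_exists("0000-0002-1825-0097"))  # ORCID's documented example iD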

 

Start

6.       Did you turn your proposal into a (series of) research protocol(s)?

7.       Did you write a (statistical) analysis plan (SAP)?

8.       Did you write a data management plan (DMP)?

9.       Did you preregister your protocol(s) and (S)AP? (RP3-4; see the sketch after this block)

10.   (Where) did you obtain ethics approval?
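
Criterion 9 might also be checked automatically. Below is a sketch using the public OSF API v2 (https://api.osf.io/v2/); the exact attribute names should be verified against the OSF API documentation.

    # Sketch: does a given OSF GUID point to a public, non-withdrawn
    # registration? Attribute names should be double-checked in the OSF docs.
    import requests

    def osf_preregistered(guid: str) -> bool:
        r = requests.get(f"https://api.osf.io/v2/registrations/{guid}/", timeout=10)
        if r.status_code != 200:
            return False
        attrs = r.json()["data"]["attributes"]
        return bool(attrs.get("public")) and not attrs.get("withdrawn", False)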

 

During

11.   How do you ensure that your documents are (and remain) understandable and transparent (e.g. annotation of analytical code, use of (English) language, replicability, logging of decisions)?

 

Output / End

12.   How do you ensure reporting on all outcomes specified in your protocol? (RP3-4)

13.   Which open access route for publishing your results do you anticipate?

14.   How do you prepare for FAIR-ly archiving and sharing your data (in Figshare)? (see the sketch after this block)

15.   (How) do you prepare for sharing syntax/code/software/artefacts in the public domain?
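
Criterion 14 could be partially automated as well. Here is a sketch against Figshare's public API v2 (https://api.figshare.com/v2/); the field names used should be re-verified against the Figshare API documentation.

    # Sketch: does a Figshare item exist, carry a DOI, and contain files?
    # Field names should be re-checked against the Figshare API documentation.
    import requests

    def figshare_item_findable(article_id: int) -> bool:
        r = requests.get(f"https://api.figshare.com/v2/articles/{article_id}", timeout=10)
        if r.status_code != 200:
            return False
        item = r.json()
        return bool(item.get("doi")) and bool(item.get("files"))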

 

Glossary

RP = REWARD Pillar (1 = question; 2 = methods and statistics; 3 = reporting); SAP = statistical analysis plan; DMP = data management plan; ORCID = persistent digital identifier that distinguishes you from every other researcher; FAIR = findable, accessible, interoperable, reusable.

Proof of adherence may be requested.


Suitable for students
  • Software Engineering
  • Technische Informatica