Cross-Sectional Evaluation of Open Science Practices at Imaging Journals: A Meta-Research Study

Objective: To evaluate the open science policies of imaging journals and compliance with these policies in published articles. Methods: From a list of imaging journals, we extracted details of open science policies: protocol registration, reporting guidelines, funding, ethics and conflicts of interest (COI), data sharing, and open access publishing. The 10 most recently published studies from each journal were assessed to determine adherence to these policies. We combined the proportion of open science policies followed into an Open Science Score (OSS) for all journals and articles, and evaluated relationships between OSS and journal- and article-level variables. Results: 82 journals and 820 articles were included. The average OSS of journals and articles was 58.3% and 31.8%, respectively. Of the journals, 65.9% had registration policies and 78.1% had reporting guideline policies; 79.3% were members of COPE, 81.7% had plagiarism policies, 100% required disclosure of funding, and 97.6% required disclosure of COI and ethics approval; 81.7% had data sharing policies and 15.9% were fully open access. Of the articles, 7.8% had a registered protocol, 8.4% followed a reporting guideline, 77.4% disclosed funding, 88.7% disclosed COI, and 85.6% reported ethics approval; 12.3% shared their data, and 51% were available through open access or as a preprint. OSS was higher for journals with DOAJ membership (80% vs 54.2%; P < .0001). Impact factor was not correlated with journal OSS. Knowledge synthesis articles had higher OSS (44.5%) than prospective (32.6%) and retrospective (30.0%) studies (P < .0001). Conclusion: Imaging journals endorsed just over half of the open science practices considered; however, the application of these practices at the article level was lower.


Introduction
Open science refers to a movement to make research and its processes freely available to access, read, and build upon. 1-4 There are many advantages to implementing open science practices. For example, when research is published open access, the financial barrier to access is eliminated, allowing readers who cannot afford journal subscriptions, or who are not part of an affiliated institution, to read it. 1,4 Open access publication also increases the visibility and spread of research, since more people can download and read the articles. 1,4 Furthermore, open science practices such as study registration and data sharing increase the transparency of research methods, which enables critical appraisal and replication. 1,4 In August 2022, the White House's Office of Science and Technology Policy (OSTP) stated that US federal agencies are required to make all taxpayer-funded publications freely available to the public 5 ; this mandate demonstrates the push toward open science practices. In addition, the government of Canada has begun to implement recommendations and action plans to ensure that open science practices are adopted in the research community. 6 Their objective is to provide recommendations and guidance regarding open access practices. The rationale for this push, according to the program, is that open science accelerates knowledge transfer, increases reproducibility, and leverages diversity and inclusion. 6 Their first recommendation is that "Canada should adopt an open science approach to federally funded scientific and research outputs," highlighting the increased attention surrounding the open science concept.
The Transparency and Openness Promotion (TOP) guidelines, 7 an initiative started by the Center for Open Science in 2015, evaluate journals based on their degree of compliance with transparency and openness standards. This organization provides a score, called the "TOP Factor," from 0 to 3 (with 3 being the best) for many open science categories based on the journal's level of adherence. Journals are evaluated on citation standards, data transparency, analytic methods transparency, research materials transparency, design and analysis transparency, study registration, analysis plan registration, and replication. 8 By evaluating journals using open science metrics, this initiative facilitates the push for transparency and improves research communication.
Despite the increasing attention toward open science practices, they are not yet translating into universal practice. A study by Ebrahimzadeh et al 9 audited the open science and data sharing practices of the Montreal Neurological Institute (Canada's first self-proclaimed open science institute) and concluded that the authors of only about half of all publications shared data. Rates at institutions less formally committed to open science are likely even lower. A study by Nutu et al 10 evaluated open science practices of high-impact psychology journals and found very low adherence to prospective registration (3%) and data sharing (2%). Moreover, in imaging research, a paper by Hong et al 11 evaluated the adherence of diagnostic test accuracy (DTA) studies to reporting guidelines and revealed that only 55% of papers adhered to the STAndards for Reporting of Diagnostic accuracy studies. In addition, a study conducted by Salameh et al 12

Transparency Statement, Ethics, and Protocol Registration
The study protocol, data, analytical code, and materials associated with this project are available on our Open Science Framework project page (https://osf.io/gzv46). There were no major protocol deviations. Research ethics board approval from our institution was not required for this evaluation of published research.

Reporting Guideline
This study was reported using the STROBE 13 reporting guideline for cross-sectional study designs as guidance. A completed checklist can be found in Appendix 1.

Journal and Article Selection
We identified a list of radiology and nuclear medicine journals using Web of Science. 14 The full list of search terms used to identify the journals can be found in Appendix 2a. From the identified journals, we included only those that were published in English and published empirical/primary research. We excluded journals related purely to radiation oncology, radiation therapy, or experimental physics, as our focus was on clinical imaging, as well as journals that were not in English or that strictly published review studies (Appendix 2b).

Journal Data Extraction
Data extraction was conducted using Microsoft Excel and completed independently and in duplicate. Pilot data extraction was conducted on a sample of journals and articles. The authors involved in data acquisition were MK, PR, AK, NI, HA, HD, RA, and MZ. All conflicts were resolved by consensus or, when necessary, third-party arbitration. MM is the only radiologist among the authors.
From the included journals, we extracted (independently and in duplicate) information from the "instructions for authors" section of each journal's website, in addition to any other relevant information available on the website (eg, journal policies).
The open science policies extracted included: requirements for study registration, use of reporting guidelines, ethical practice, data sharing, open access policies, and TOP Factor. The definitions of the variables evaluated in this paper can be found in Table 3. For authors who specified that their primary data were available upon request, we contacted them by email and requested their data, providing a 14-day period from the time of the email to share it. Open access policies include the option to publish an article so that it is made available free of charge to the reader. In addition, we extracted the article processing charge (APC) amount in US dollars and whether there were any methods to reduce the APC (eg, an APC discount for low-income countries). The TOP Factor is a metric that evaluates a journal's level of compliance with transparency and openness standards; we extracted TOP Factor scores directly from the TOP website. 8 These open science categories were derived from similar auditing studies, 9,10 the TOP guidelines, 7 and previous literature guiding open science practices. 1,3,8 A complete list of extraction parameters can be found in Appendix 3a. Data sources included journal and publisher web pages, instructions for authors, and any other information identified from these sources. These data were extracted during February and March of 2023.

Article Data Extraction
Following journal data extraction, for each included journal we conducted a cross-sectional audit of the 10 most recently published empirical research articles. We chose December 31, 2022 as the latest date and worked backwards from it to screen for articles. Article PDFs were retrieved from the journal's website, directly from PubMed if the article was open access, through the University of Ottawa Library, or via an inter-library loan with the help of a librarian. Using the journal extraction form, we extracted related information and evaluated compliance with each journal's policies. We extracted information on the same open science practices obtained at the journal level. The extraction form can be found in Appendix 3b. We also categorized the articles by study design, as different types of studies may have different requirements or adhere to open science practices at different rates. The 3 study design categories were prospective studies (eg, randomized controlled trials), retrospective observational studies (eg, case-control studies), and systematic reviews. A pilot test of the data extraction step was conducted on 20 articles prior to study commencement.

Open Science Score (OSS)
To quantify the open science practices and performance of each journal and article, we created a journal and article Open Science Score (OSS) (Appendix 4). The OSS combined numerous extracted data points for each journal or article. The included variables are all dichotomous: a score of 1 was given for each "Yes," and a score of 0 for "No" or "Not Mentioned." We calculated the OSS for each journal and article; a sample calculation can be found in Appendix 4b. Each open science component had equal weight in the equation. However, ethical practice was evaluated using a number of sub-variables, so we combined all of its sub-variables into a single component with a weight equal to that of the other variables. We weighted it in this manner because ethical practice, as an open science practice, requires evaluation of many subcomponents, which must be combined into one score of equal weight to the other practices for the OSS to represent a balanced grade.
The journal OSS included 6 variables: registration, use of reporting guideline, ethical practices, data sharing policy, the presence of an APC waiver, and whether the journal uses an open peer review system.
The article OSS included 5 variables: registration, use of a reporting guideline, ethical practices, data sharing, and whether the article was available through open access. The ethical practices component included 3 sub-variables, each accounting for 1/3 of the ethical practices score: disclosing project funding, disclosing conflicts of interest, and whether the authors reported receiving research ethics board approval to conduct their study.
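As an illustration, the article-level scoring described above can be sketched in code. This is our sketch, not the authors' actual spreadsheet or formula; the function and argument names are ours.

```python
def article_oss(registration, reporting_guideline, funding, coi,
                ethics_approval, data_sharing, open_access):
    """Compute the article Open Science Score as a percentage.

    Each argument is 1 ("Yes") or 0 ("No"/"Not Mentioned"). The three
    ethics sub-variables each contribute 1/3 of a single component, so
    all five components carry equal weight.
    """
    ethics = (funding + coi + ethics_approval) / 3
    components = [registration, reporting_guideline, ethics,
                  data_sharing, open_access]
    return 100 * sum(components) / len(components)

# An article that discloses funding, COI, and ethics approval and is
# open access, but is unregistered, follows no reporting guideline,
# and shares no data, scores (0 + 0 + 1 + 0 + 1) / 5 = 40%.
print(article_oss(0, 0, 1, 1, 1, 0, 1))  # 40.0
```

The journal-level score works the same way over its 6 variables.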

Data Analysis
For our secondary objective analysis, we evaluated relationships between specific journal characteristics (journal impact factor; category of journal [open access, hybrid, or traditional]; and % of open access articles) and the OSS. For the category of journal versus the level of open science recommendation, we used the Pearson chi-square test of independence. For journal impact factor and % of open access documents versus the level of open science recommendation, we used the Pearson correlation test. In addition, we compared the study design of articles against the article OSS using a Tukey honest significant difference test. Data analysis was performed using R Studio and SAS Analytics software.
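As an illustrative sketch (the authors used R Studio and SAS, not this code), the Pearson correlation used to compare impact factor with journal OSS can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linear toy data gives r = 1; values near 0 (such as the
# -.036 reported below) indicate essentially no linear relationship.
print(round(pearson_r([1, 2, 3], [2, 4, 6]), 6))  # 1.0
```

The chi-square test of independence and the Tukey honest significant difference test are likewise available in standard statistical packages.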

Results
A total of 82 journals were included. Fifty-three journals were excluded because they related purely to radiation oncology, radiation therapy, or experimental physics; we also excluded journals that were not in English or that strictly published review studies. In addition, a total of 820 articles were included, representing 10 articles per included journal. A complete list of included journals and extracted data is provided in the Supplemental Appendix.

Journal Results
The average OSS percentage for all journals was 58.3%. A list of individual OSS values for each journal can be found in the Supplemental Material. The two journals that received a perfect OSS were BMC Medical Imaging and the Iranian Journal of Radiology. At the journal level, 15.9% of journals were fully open access, and the percentage of open access documents across all journals was 48.0%. In terms of preprints, 51.2% of journals allowed submissions of articles that had been posted as preprints, 3.7% did not allow preprints, and 45.1% did not mention a preprint policy. The median APC was 3319 USD. The results of all other variables at the journal level can be found in Table 1 and Figure 1.

Article Results
A total of 820 articles (10 per journal) were included in this study. The average Open Science Score (OSS) percentage for all articles was 31.8%. A list of individual OSS values for each article can be found in the Supplemental Material. Sixty-four (7.8%) articles registered their study protocol. Of those, 12 (1.5%) articles registered a systematic review protocol on registries such as the Open Science Framework and PROSPERO. The remaining 52 (6.3%) were clinical trial registrations, most of which were registered on ClinicalTrials.gov. In terms of data sharing, 37.4% of articles included a data availability statement; 3.3% of articles included all raw study data in the paper itself; 6.5% of articles used publicly available data as their raw data; and 27.3% of articles stated that their data were available upon request. After emailing the corresponding authors and asking them to provide their data within a 14-day period, only 9.4% of the 224 authors responded and provided their data. Moreover, 46.6% of all articles were available through open access.
A preprint had been posted for 4.6% of all articles (Appendix 5).
The results of all other variables, at the article level, can be found in Table 2 and Figure 2.

Secondary Analysis
There was no correlation between impact factor and journal OSS (Pearson correlation coefficient = −.036; P = .75).
The average journal OSS for journals with a DOAJ membership was 80%, compared to 54.2% for journals without a DOAJ membership (P < .0001). OSS differed across the 3 study design groups (retrospective, prospective, synthesis). The synthesis group had an average OSS of 44.5%, the prospective group 32.6%, and the retrospective group 30.0%. There was a difference in mean OSS between synthesis and prospective studies (P < .0001) and between synthesis and retrospective studies (P < .0001), but no difference between retrospective and prospective studies (P = .076). Note: All variables had an n-value of 820 unless otherwise specified. "Registration" refers to which articles prospectively registered their study protocol. "RG" and "RC" indicate which articles followed a reporting guideline, or submitted a reporting guideline checklist, respectively. "Funding," "COI," and "Ethics Board Approval" reveal whether articles disclosed sources of funding, conflicts of interest, and approval from a research ethics board, respectively. "Upon Request*" reveals which authors provided their primary data after being contacted. "Open Access" indicates which articles are accessible for free. "Preprint" indicates which articles were posted as a preprint prior to publication.
*This variable had an N-value of 224.

Registration
Study registration is when a study protocol, or information about the execution of the study, is publicly submitted, published, or made available prior to conducting the project. 15

Reporting guideline
A reporting guideline, according to the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, 16 is a "simple, structured tool for health researchers to use while writing manuscripts. A reporting guideline provides a minimum list of information needed to ensure a manuscript can be, for example, understood by a reader, replicated by a researcher, used by a doctor to make a clinical decision, and included in a systematic review."

Ethical Practice
Good ethical practices include, but are not limited to, Committee on Publication Ethics (COPE) membership, plagiarism checks, and policies requiring disclosure of conflicts of interest, funding sources, and research ethics board approval.

Committee on Publication Ethics (COPE)
The organization's mission is to educate authors and journals and spread awareness of sound ethical practices.

Data Sharing
Data sharing practices were assessed by checking whether the data used for the analyses in the study were posted in an online repository or sent to us upon request.

Article processing charges (APC)
An article processing charge is a fee paid by authors to a journal in order to publish an article through open access. 17

Open Peer Review
This is a review process in which journals transparently report authorship, the referees involved, and editorial decisions. 18

Discussion
Imaging journals endorsed just over half of the evaluated open science practices, reflected in an average OSS of 58.3% across the 82 imaging journals we evaluated. The 820 recently published research articles from these journals showed lower compliance with open science practices, with an average OSS of 31.8%.
Previous research on open science practices provides context for these results and allows us to compare and contrast imaging research with other fields. Nutu et al 10 evaluated open science practices in psychology journals and found that 25% of journals had a registration policy, 67% had a data sharing policy, 47% endorsed reporting guidelines, and 87% required authors to disclose conflicts of interest. In addition, at the article level, only 3% of studies were prospectively registered, 2% shared their data, and 50% disclosed their conflicts of interest. On every metric, both at the journal and article level, our study revealed a higher level of open science practices. This could be interpreted as a sign that imaging journals are more adherent to open science practices than psychology journals. However, their study was published 4 years ago, so the differences may reflect the progress of open science practices over time.
Siebert et al 19 evaluated data sharing recommendations in biomedical journals and randomized clinical trials and found that 30% of ICMJE-affiliated journals had explicit data sharing policies and 22% of articles expressed intentions to share data. In comparison, our study found that 81.7% of imaging journals had a data sharing policy, and we retrieved primary data from 12.3% of articles. The lower rate of data sharing in our cohort may be related to the more stringent definition we applied: we required authors to actually share data upon request, rather than simply state in the paper that they would.
Hong et al 11 showed that only 4.9% of imaging diagnostic accuracy studies had pre-registered protocols. Our cohort had a similarly low proportion of registered studies, indicating that further progress in protocol registration in imaging research remains necessary. Studies in other fields, such as behavioral medicine, 20 have found similarly low rates of protocol registration (2%-12%).
A limitation of this study was that open science practices were difficult to quantify. At the time this study was conducted, there was no agreed-upon standard in the literature on how best to evaluate open science practices. In addition, extracting data on journal policies was a subjective process. To mitigate these related issues, we limited our data extraction questions to categorical answer options and created a formula that outputs an overall numeric score for open science practices; we also conducted all data extraction in duplicate to limit subjectivity. Another limitation was that, for feasibility reasons, we were only able to include 10 articles per journal. To address this, all of our analyses compared articles versus journals cumulatively, and we did not conduct any analysis stratified per journal, for which the sample size would have been too small (n = 10). Also, we inadvertently excluded the journal "Diagnostic & Interventional Radiology" due to an error in the screening process. In retrospect, this journal should have been included, as it met our eligibility criteria. This limitation should be considered while interpreting the results of this study; however, we anticipate its impact to be minimal given that we included 82 journals.
Educational programs at academic institutions or through virtual events that inform authors about the benefits of open science practices, such as protocol registration and the use of reporting guidelines, may increase the implementation of these practices. In addition, higher publication standards at the journal level may increase the use of open science practices at the article level. The main initiative spreading awareness about this topic is the TOP guidelines, which describe what good open science practices look like at the journal level and audit a large number of journals on open science practices.
Social media plays a major role in promoting published studies, especially in the field of radiology. 21 A study conducted by Pozdnyakov et al 22 explored the visibility benefits of posting research articles on social media. Their discussion further supports the need for open science promotion, as open science makes it easier to discuss research content on social media, provides free reading for the general public, and increases the visibility of published studies.
Future research should continue to audit the implementation of open science practices at both the journal and article level and evaluate whether there is improvement or regression. In addition, since open science is a relatively new concept, more methods of quantifying and evaluating open science should be explored, in addition to our Open Science Score, in order to establish a standard and validated method of evaluation. Lastly, more studies that evaluate open science as a whole, as opposed to one or 2 components, would provide more context on the current state of open science across different areas of research.

Journal Open Science Score Equation
The journal open science score was calculated using Equation 1. A score of 1 was given for each variable if the criterion was met (yes = 1, no = 0). The final score is expressed as a percentage.
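Equation 1 itself did not survive extraction here. Based on the six journal-level variables listed in the Methods (registration, reporting guideline, ethical practices, data sharing policy, APC waiver, and open peer review), a plausible reconstruction, assuming each variable scores 1 for "Yes" and 0 otherwise, is:

```latex
\mathrm{OSS}_{\mathrm{journal}} =
  \frac{\mathit{Reg} + \mathit{RG} + \mathit{Ethics} + \mathit{Data}
        + \mathit{APC}_{\mathrm{waiver}} + \mathit{OpenPR}}{6} \times 100\%
```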

Article Open Science Score Equation
The article open science score was calculated using Equation 2. A score of 1 was given for each variable if the criterion was met (yes = 1, no = 0). The final score is expressed as a percentage.
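Equation 2 also did not survive extraction. Based on the five article-level variables in the Methods, with the three ethics sub-variables (funding, COI, and ethics board approval) each weighted 1/3 of one component, a plausible reconstruction is:

```latex
\mathrm{OSS}_{\mathrm{article}} =
  \frac{\mathit{Reg} + \mathit{RG}
        + \tfrac{1}{3}\left(\mathit{Funding} + \mathit{COI} + \mathit{Ethics}\right)
        + \mathit{Data} + \mathit{OA}}{5} \times 100\%
```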
showed that recent diagnostic test accuracy systematic reviews included only two-thirds of the relevant reporting guideline items, which made the studies "not fully informative." These examples indicate a low level of implementation of, and adherence to, open science practices. The primary objective of this study was to evaluate open science practices in imaging research. This included evaluating the open science policies of individual journals, as well as the rate of adherence to these policies in a sample of recent studies published in these journals. The secondary objective was to evaluate associations between specific journal characteristics, such as journal impact factor (from 2021), category of journal (open access, hybrid, or traditional), and % of open access documents, and adherence to open science practices in the articles.

Figure 1 .
Figure 1. Proportions of open science practices at the journal level. All variables had an N-value of 82 unless otherwise specified. "Full Open Access" indicates which journals only publish open access articles. "Registration" refers to which journals had a prospective registration policy. "RG" and "RC" indicate which journals had a policy that required authors to follow a reporting guideline, or submit a reporting guideline checklist, respectively. "COPE" refers to which journals had a membership with the Committee on Publication Ethics. "Funding," "COI," and "Ethics Board" reveal whether journals required disclosure of funding, conflicts of interest, and approval from a research ethics board, respectively. "APC Waiver" indicates which journals provided a waiver for article processing charges associated with open access publishing.

Figure 2 .
Figure 2. Proportions of open science practices at the article level. All variables had an N-value of 820 unless otherwise specified. "Registration" refers to which articles prospectively registered their study protocol. "RG" and "RC" indicate which articles followed a reporting guideline, or submitted a reporting guideline checklist, respectively. "Funding," "COI," and "Ethics Board Approval" reveal whether articles disclosed sources of funding, conflicts of interest, and approval from a research ethics board, respectively. "Upon Request*" reveals which authors provided their primary data after being contacted. "Open Access" indicates which articles are accessible for free. "Preprint" indicates which articles were posted as a preprint prior to publication.

Table 1 .
Proportions and Percentages of Open Science and Data Sharing Practices at the Journal Level.
Note. All variables had an n-value of 82 unless otherwise specified. "Full Open Access" indicates which journals only publish open access articles. "Registration" refers to which journals had a prospective registration policy. "RG" and "RC" indicate which journals had a policy that required authors to follow a reporting guideline, or submit a reporting guideline checklist, respectively. "COPE" refers to which journals had a membership with the Committee on Publication Ethics. "Funding," "COI," and "Ethics Board" reveal whether journals required disclosure of funding, conflicts of interest, and approval from a research ethics board, respectively. "APC Waiver" indicates which journals provided a waiver for article processing charges associated with open access publishing.

Table 2 .
Proportions and Percentages of Open Science Practices at the Article Level.
*This variable had an N-value of 224.

Table 3 .
Relevant Open Science Terms and Definitions.