TOP Guidelines

Public Administration Review endorses the principles of openness and transparency in scientific research and is a signatory of the Center for Open Science’s Transparency and Openness Promotion Guidelines.  Consequently, PAR has adopted four transparency guidelines for authors.  Further detail and how-to guidance are provided in the “Frequently Asked Questions” section on the PAR website at https://publicadministrationreview.org/

1) Public Administration Review values original data collection.  To help those who collect data gain appropriate recognition for their work, authors must cite (in text) and reference in the bibliography all research materials, including data (quantitative or qualitative) collected by others and survey instruments or program code written by others.  Please see the “Frequently Asked Questions” section on the PAR website for guidance.

2) Public Administration Review encourages authors to share their data (quantitative or qualitative) and/or research materials whenever appropriate.  That is, authors should first consider whether sharing data and/or research materials would carry the potential of harm to subjects, violations of their privacy, or breaches of contract (e.g., where authors do not have property rights in the data they use).  We recommend including an intention to share data in IRB applications, where applicable.  In the absence of the constraints noted here, PAR recommends:

2.1) Authors share quantitative data (and code, if available) by posting on Dataverse [https://www.dataverse.org/] or another public, open data repository;

2.2) Authors share qualitative data by posting on the Qualitative Data Repository [https://qdr.syr.edu/];

2.3) Authors share research materials such as survey instruments or interview protocols by sending them to the designated PAR Transparency and Openness editor after acceptance of their manuscript; PAR will then have them uploaded to the Wiley article page as supplemental materials once the typesetting of the article is complete.

Once PAR has verified the public availability of the digitally-shareable data necessary to reproduce the reported results, the Open Data badge will be added to the article*.

To recognize authors who share research materials sufficient for reproducing the reported procedure and analysis, the Open Materials badge will be added to the article*.

* Please see the “Frequently Asked Questions” section on the PAR website for the full conditions.

3) Public Administration Review strongly recommends that authors document their research design, data preparation, and data analysis decisions and include details in a supplementary document. This document will include information that does not fit in the original manuscript due to space limitations.  Authors can send details to PAR after their manuscript is accepted.  PAR will then upload the document to Wiley.  In preparing the document, authors should follow relevant standards (see the “Frequently Asked Questions” section on the PAR website for guidance):

3.1) For reporting randomized controlled trials, follow the CONSORT standard;

3.2) For reporting quantitative observational studies, follow the STROBE standard;

3.3) For reporting meta-analyses and systematic reviews, follow the PRISMA standard;

3.4) For reporting qualitative studies, follow the COREQ standard.

4) Public Administration Review is a pluralistic and open journal that does not take a strong position on pre-registration.  To recognize authors who pre-registered their study design in a publicly verifiable location with a fixed time stamp and provided evidence of this to PAR, the Preregistered badge will be added to the article*.

* Please see the “Frequently Asked Questions” below for the full conditions.

Public Administration Review Transparency and Openness (TOP) Guidelines: Frequently Asked Questions (FAQ)

Below, we provide answers to questions authors may have about Public Administration Review’s Transparency and Openness (TOP) guidelines.  They are organized by topic; the first four sections cover practical guidance for authors, in the same order as in the guidelines for submission.  The fifth section covers general questions, including the rationale for adopting the TOP guidelines, the process for developing and revising them, and a discussion of implications.

FAQ section 1) Mandatory citation of data and research materials

Q: Could you provide an example of a data set citation? 

A: Here are three examples of data set citations, covering both quantitative and qualitative data:

Boyne, George A., Oliver James, Peter John, and Nicolai Petrovsky. 2008. Leadership Change and Public Service Performance in English Local Governments, 1998-2006. Colchester, Essex: UK Data Service ReShare [distributor]. https://reshare.ukdataservice.ac.uk/850341/

Campbell, Angus, and Robert L. Kahn. American National Election Study, 1948. ICPSR07218-v3. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 1999. https://doi.org/10.3886/ICPSR07218.v3

Flom, Hernán, and Alison E. Post. 2016. Data for: “Blame Avoidance and Policy Stability in Developing Democracies: The Politics of Public Security in Buenos Aires.” Data Collection, QDR:10068. Syracuse, NY: Qualitative Data Repository [distributor]. https://doi.org/10.5064/F6RF5RZV

Q: What should an in-text citation for data look like? 

A: Here is an in-text reference to a data source:

“We used the top management turnover rate from Boyne et al. (2008).”

FAQ section 2.1) Sharing quantitative data (and code)

Q: Why do you recommend that authors deposit data (and code, if available) at an academic data repository?  Why not share data through one’s academic website?

A: PAR recommends using an academic repository for several reasons.  First, academic repositories have committed resources to ensure your data will remain accessible into the indefinite future; at least for Dataverse, this includes making your data accessible in an open format, even if you used proprietary software such as Stata to create it.  Second, academic repositories provide time stamps that document when files are uploaded and changed.  Third, academic repositories provide a unique identifier and a persistent web link to your materials, so that others will be able to find them easily well into the future.  Researchers can use these links to encourage access and promote their work.

Q: I prefer to deposit data at ICPSR or at another national data archive participating in the https://www.cessda.eu/ network.  Some of these archives include access control.  Does PAR allow authors to use other repositories?  

A: Yes; however, PAR recommends Dataverse because it allows free access to any interested party, whether or not they are affiliated with a member academic institution.

Q: Will PAR still provide a data badge if I use a data repository other than Dataverse?

A: Yes; upon verification of the existence of the data at another data archive such as ICPSR or a national data archive, PAR will award the data badge.

Q: By uploading data to an academic data repository, do I give away ownership of the data? 

A: No; by uploading and sharing data, you do not relinquish any ownership rights that you may already have.  Data are shared under a Creative Commons license, meaning you grant others access to the data, but anyone who uses the data must give proper credit.  If major concerns about sharing the data arise at a later point, there is an option to remove public access to the data.  Note that PAR requires citation of data, code, and materials by others.

Q: For how long should I make my data available? 

A: We recommend that you share materials indefinitely.  That is, after posting, you do not remove them.  Nothing in this world is permanent, but the data archives we recommend have committed resources to preserve your materials and keep them usable for a long time to come.  Who knows, there may be someone in 30 or even 100 years who could gain a new insight from your data!

Q:  Do all of the authors on a manuscript need to consent to share data and materials?  

A: PAR assumes that coauthors are in close contact and have discussed and agreed upon the specifics of any data sharing arrangement before the manuscript has been submitted to PAR.

Q: If the author(s) would rather share data and related materials at a later date, and not upon publication, is this possible?  

A: Yes, authors can submit a manuscript with an agreement to share data at a future date.  Several academics have expressed concerns about sharing data they have worked very hard to collect and code.  To be clear, PAR realizes that researchers invest considerable time and effort in collecting original data and building datasets.  Researchers may choose to use the data themselves for a period of time before sharing their work with others.  PAR has an option for this in its transparency questionnaire (via the Editorial Manager).  Submitters agree to release data by a specific date, choosing the option “[a] no later than [insert date]”.

Q: What if the data used in my research are proprietary or were obtained from an institution (firm, government, non-profit organization, etc.) that has requested to review the results of the study prior to their dissemination to ensure that the confidentiality of the data is not unintentionally compromised?  Can researchers agree to release data at a later date?

A: Yes. This is a good example of why data are sometimes released at a future date.  When you submit your manuscript, please choose the relevant option in the transparency questionnaire in Editorial Manager:

“[b] If this manuscript is accepted, the authors agree to make data associated with this research available within _____ months from the date of online publication.”

Q: If authors agree to share data at a later date, does PAR still issue a data badge? 

A: Yes, but it is up to the authors to notify PAR when the data become available.  Specifically, contact one of PAR’s Transparency and Openness Promotion Editors when the data are available.  PAR then verifies that the data are available and updates the publisher, who in turn updates the article webpage with the relevant badge(s).  Note that the PDF of the article will not be changed.

Q: If I decide to share my analysis code, how can I be sure others can follow it? 

A: The best way to share code is to annotate it as much as possible.  This is good practice for all researchers, especially those who may want to use the code years later.
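
For instance, here is a minimal sketch of an annotated Stata do-file; the file name, variable names, and recode shown here are hypothetical, for illustration only:

* replication.do -- sketch of an annotated analysis script
* Purpose: reproduce the descriptive statistics and the main regression table
use "survey_data.dta", clear           // load the dataset (hypothetical file name)
replace income = . if income == -9     // recode the survey's missing-data code to Stata missing
summarize satisfaction income tenure   // descriptive statistics for the analysis variables
regress satisfaction income tenure     // main model: OLS regression

Comments like these let a reader, including your future self, map each command to the reported results.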

Q: How does the data sharing process relate to PAR’s review process?  Do authors submit data as part of the review process? 

A: No; PAR’s review process remains unchanged.  Only the manuscript and any optional associated appendices authors upload to Editorial Manager during submission will be reviewed.

Q: What if the authors wish to add to or edit the data or code originally uploaded to a repository?  Is there a process for this?

A: Yes; at least Dataverse provides this function and shows time stamps to alert researchers to change dates.

Q: Does Dataverse provide information on downloads?

A: Yes, Dataverse provides users with download statistics.

FAQ section 2.2) Sharing qualitative data

Q: Where can I deposit qualitative data? 

A: PAR recommends archiving qualitative data, so long as research subjects are protected.  The best place to archive your qualitative data is an academic repository designed for it.  We recommend using the Qualitative Data Repository at Syracuse University: https://qdr.syr.edu/  Another option is the UK Data Service, hosted by a consortium of British universities and funding agencies: https://www.ukdataservice.ac.uk/

FAQ section 2.3) Sharing research materials

Q: I want to share research materials, but it can be hard to reconstruct what my team did.  How can I make sure to capture the relevant details? 

A: Gerber and Green (2012, 444) suggest building documentation into your research workflow:

This requires time and effort, and scholars are often too busy with the next experiment to properly archive the one they just completed.  One way to lower the costs of archiving is to make documentation and preservation part of your research procedures at each phase of the project.  For example, before sending your manuscript off to a journal for review, prepare a dataset, program, and accompanying materials so that anyone can reproduce and interpret your results.

For more information, see:

Gerber, Alan S. and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York and London: W. W. Norton.

FAQ section 3) Providing standardized additional information about research design, data collection, preparation, and analysis

Q: I want to share my analytical methods, but I generally use the pull-down menu when conducting statistical analysis.  How should I describe what I am doing?

A: A straightforward analysis can be sufficiently described in the text.  For instance, if authors provide a table of descriptive statistics of all available observations in their data set and then conduct a regression analysis using all available observations and report it in a table, others can reconstruct these steps.  Many analyses, tests, and robustness checks are more involved, however.  Describing them in a note in the text may not be sufficient to allow someone else to fully reconstruct them.

Therefore, we encourage authors to keep a log of what they are doing when they analyze data.  Most popular statistical software packages can keep such logs.

Here is some guidance on keeping a log in popular software packages (as of summer 2017):

SPSS:

“Go to Edit–>Options, then click on the Viewer tab. Somewhere in there–it looks a little different in every version–is a little box labeled “Display Commands in Log.” Click it, and you’re done!” [Then copy and paste the log into a text file/save the log as a text file.]

Source: http://www.theanalysisfactor.com/effortlessly-create-spss-syntax/

Stata:

After you open Stata, type

log using myanalysis

[ENTER]

Stata will then keep all commands it executes (whether you choose them from the pull-down menu or type them in) as well as output in a log file called myanalysis.smcl in the current working directory.
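
When you are finished, type log close to close and save the log.  Placing the commands in a do-file makes the whole session reproducible in one step; here is a minimal sketch (the data file and variable names are hypothetical):

* session.do -- minimal sketch of a logged Stata session
log using myanalysis, replace   // start logging; "replace" overwrites an existing myanalysis.smcl
use "mydata.dta", clear         // load the data
summarize                       // descriptive statistics are written to the log
regress y x1 x2                 // regression output is logged as well
log close                       // close and save myanalysis.smcl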

R:

Either R Markdown (http://rmarkdown.rstudio.com/ ) or R Notebooks (http://rmarkdown.rstudio.com/r_notebooks.html ) will allow other R users to reconstruct the steps from input to output.

SAS:

“A text file can be created from the SAS log using PROC PRINTTO.”  https://support.sas.com/kb/51/626.html

Q: How do I report relevant details of my research given PAR’s strict word limit for articles? 

A: There is enough space within a Public Administration Review article to provide the most important details regarding the study design and analysis.  In addition, PAR strongly recommends that authors document other relevant information in a supplementary document to be posted online at PAR’s website.  Supplementary information is highly specific to the research and will vary by manuscript, but it may include, for example, questionnaires or observation protocols.

To facilitate reporting of relevant aspects of the research design, we encourage authors to follow one of the established checklists for reporting research design and analysis.  Doing so will facilitate replication efforts.

In preparing the supplementary document, PAR recommends that authors follow one of the following guidelines:

3.1) for randomized controlled trials, the CONSORT standard:

→ Moher, D., K. F. Schulz, and D. Altman. 2001. “The CONSORT Statement: Revised Recommendations for Improving the Quality of Reports of Parallel-Group Randomized Trials.” JAMA 285: 1987-91.   https://jamanetwork.com/journals/jama/fullarticle/193759

3.2) for quantitative observational studies, the STROBE standard:

→ STROBE Statement: Strengthening the Reporting of Observational Studies in Epidemiology.   https://www.strobe-statement.org/

3.3) for meta-analyses and systematic reviews, the PRISMA standard:

→ Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., . . . Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. Annals of Internal Medicine, 151(4), W-65-W-94.

Article:   http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1000100

Checklist:  http://www.prisma-statement.org/PRISMAStatement/Checklist.aspx

3.4) for qualitative studies, the COREQ standard:

→ Tong, Allison, Peter Sainsbury, and Jonathan Craig. 2007. “Consolidated Criteria for Reporting Qualitative Research (COREQ): A 32-Item Checklist for Interviews and Focus Groups.” International Journal for Quality in Health Care 19(6): 349-357.   https://doi.org/10.1093/intqhc/mzm042

Q: CONSORT is about more than reporting; it also includes guidance for the design of experiments.  Am I bound by these? 

A: For PAR, CONSORT provides a helpful template for reporting experiments.  There is debate about the relevance of some of its standards to the design of experiments in public administration.  PAR is neutral in this debate.  Authors should justify their design decisions as part of their submission to PAR, whether or not they follow CONSORT standards.

Q: I find the reporting guidelines for political science experiments proposed by the American Political Science Association’s Experimental section more helpful than CONSORT for reporting the details of my experiment.  Would it be acceptable to use these guidelines (Gerber et al. 2014) instead of CONSORT for reporting the details of my experimental design and analysis?  

A:  Yes, it would be acceptable.  The PAR readership will benefit from detailed reporting using a thoughtful standard, be it CONSORT or an alternative, well-reasoned approach such as that proposed by the American Political Science Association’s Experimental section.  The purpose of reporting details of experimental design and analysis is to help others understand in detail how an experiment was conducted, so that they can better understand and perhaps replicate it.  Any standard that effectively furthers these aims is welcome.

Reference: Gerber, Alan, Kevin Arceneaux, Cheryl Boudreau, Conor Dowling, Sunshine Hillygus, Thomas Palfrey, Daniel R. Biggers, and David J. Hendry. 2014. “Reporting Guidelines for Experimental Research: A Report from the Experimental Research Section Standards Committee.” Journal of Experimental Political Science 1(1): 81-98. doi:10.1017/xps.2014.11 http://isps.yale.edu/node/21473

FAQ section 4) Pre-registration

Q: Do you have an example of a pre-registered study in our field? 

A: Yes, please see the forthcoming PAR article “Enlisting the Public in Facilitating Election Administration: A Field Experiment.”  You can also find examples from related areas at:

http://ridie.3ieimpact.org/

https://www.socialscienceregistry.org/

https://osf.io/registries/

Q: What is pre-registration, and what is PAR’s position on it?

A: Pre-registration of a study means that authors specify in advance, in a persistent, read-only format (e.g., at a pre-registration site), how they will conduct their study.  Pre-registration is commonly used in medical research, but it can also be used in social science research.  The purpose of pre-registration is to provide the broader research community with an idea of the universe of studies being conducted, not only those studies that lead to interesting findings.

Pre-registration of an analysis plan means that authors submit to a neutral third party the analysis steps and tests they plan to conduct, prior to having access to the data (e.g., in advance of an election or before they conduct their experiment).  The purpose of this is to make a clear distinction between work that generates hypotheses and work that tests hypotheses.  Both are necessary and valuable.  But conflating the two leads to fishing expeditions that capitalize on chance, leading to unsubstantiated inferences.

There is ongoing debate about the benefits and costs of pre-registration, especially in political science (see, for example, Monogan 2015).  Public Administration Review is a pluralistic and open journal that does not take a strong position on pre-registration.  We do give credit to authors who pre-registered their study and analysis plan and provided evidence of this to the PAR editorial team by adding the Center for Open Science’s Preregistered badge to their article.

Reference: Monogan, James E. III. 2015. “Research Preregistration in Political Science: The Case, Counterarguments, and a Response to Critiques.” PS: Political Science & Politics 48 (3): 425-429.

FAQ section 5) General questions

Q: What happens to the Transparency and Openness Guidelines once PAR’s editorial transition has been completed? 

A: PAR will continue to use the Transparency and Openness guidelines after January 1, 2018.  Deanna and Nick will continue to serve as Transparency and Openness Editors.

Q: Why transparency? 

A: Public Administration Review seeks to publish research that is relevant to both scholars and policymakers.  In increasingly skeptical societies, policymakers have to be more transparent about how they do their work and how they arrive at their decisions.  Scholarly work is not immune from this broader societal skepticism.  If we want to retain credibility, and be able to continue participating in a debate with policymakers, we too have to be more transparent about our work, whenever doing so is practical and ethical.  Transparency of data and methods is also important for the replicability of research, for building research programs, and for creating opportunities for better comparative research.  In addition, graduate students can see some of the inner workings of an article, which will help them become contributors themselves.

Q: Where can I learn more about the Transparency and Openness guidelines and their use in different fields? 

A: The Center for Open Science’s TOP materials are here: https://cos.io/top

Q: Why doesn’t PAR adopt a stronger set of guidelines, with mandatory data sharing? 

A: Some of the most interesting work in the social sciences involves data collection on vulnerable populations or other sources that require confidentiality.  A policy of forced data transparency would make it difficult or impossible to conduct such research.  PAR therefore asks authors to first consider whether sharing data, research materials, and analytic methods would carry the potential of harm to subjects or violate their privacy, except when researching public figures such as elected officials.  We discourage transparency for those materials whose disclosure would potentially cause harm or violate subjects’ privacy.  We are also mindful of limited ownership of or usage rights for data.  Where authors are contractually barred from sharing a set of variables, it would not be meaningful for the journal to try to force them to do so.

Q: Do the Transparency and Openness guidelines mean that PAR favors quantitative manuscripts? 

A: No, Public Administration Review encourages the submission of quality manuscripts from many research traditions.

Q: Will PAR be less likely to accept my manuscript if I decline to use a data repository, to share data or code, or if I do not pre-register my study?

A: No.  PAR does not base its acceptance decision on the author’s decision to share data and relevant materials, or to preregister a study.

Q: Is there a site where I can learn more about research transparency and experiences from related fields? 

A: Yes, as part of PAR’s efforts to draw on the experience of related fields and add to the debate among the broader PAR community, PAR’s Transparency and Openness editors have started a collection of relevant materials, including scholarly articles and links to websites, at: https://nickpetrovsky.com/par-top/

Q: Authors may have legitimate reasons not to share data and materials. Might PAR’s adoption of a tailored version of Level 1 of the Transparency and Openness guidelines lead to a perception that research not accompanied by shared data is of lesser quality than research that is accompanied by shared data? 

A: First and foremost, PAR’s guidelines are voluntary.  PAR has taken great efforts to ensure that the guidelines recognize different research traditions.  Authors who are concerned that others may wonder why data or materials are not shared in an open forum are encouraged to add a brief note of explanation.  In addition, PAR Editors acknowledge open practices are not always possible or advisable.  The guidelines explicitly note that data must not be shared when subject confidentiality would be violated or, more generally, harm to subjects would be caused.

Q: If I share data for an article published in PAR and later someone finds a mistake in my analysis, will my article be retracted? 

A: As a leading journal, PAR’s aim is to report accurate analysis.  Mistakes happen occasionally.  If a mistake were to be found, the appropriate response would typically be the publication of a correction, which would be linked to the original article.  A retraction would be a very unusual measure for the journal and not appropriate to the correction of a mistake.

Q: Does PAR have to use the term “transparency”? Some researchers suggest that “transparency” is a subjective term, implying that current work is not transparent.  

A: PAR adopted the word “transparency” directly from the Center for Open Science’s Transparency and Openness Promotion Guidelines, so this is not something we can change.  PAR is not implying any problem with existing research.  However, we do believe that the TOP guidelines will encourage good practices and, hopefully, also the replication of research, which is key to advancing knowledge in the field.

Q: This is a serious initiative and we cannot be sure of all of the implications for research.  What if there are unintended consequences as PAR goes forward?  

A: PAR has been very thorough in developing its tailored version of the TOP guidelines.  We have also worked closely and collaboratively with the Editors, Editorial Board, and the broader community, eliciting input from anyone interested.  Still, problems may arise as we start using the guidelines. This is one reason we are implementing the guidelines at the lowest level – Level 1.  Level 1 allows for a conservative approach and room for assessment and adjustment along the way.  We will also continue to reach out to contributors to get feedback and make adjustments where necessary.

Q: I am afraid that the extra steps may impose additional burdens on junior scholars seeking to publish their work. 

A: During our consultations with stakeholders, junior faculty have been very supportive of the guidelines, probably more supportive than any other subgroup of stakeholders.  We believe that the TOP guidelines reinforce good research practices, which is consistent with the large majority of feedback we received during our consultations.  We will continue to make the process as supportive as possible, especially for junior scholars.

Q: Do you think following the TOP guidelines will discourage researchers from collecting original data?  A great deal of work goes into collecting and coding original data, and researchers may not be willing to make data available to others who have not contributed to the effort.

A: We recognize the effort put into collecting original data.  For this reason, PAR mandates a full citation of any data collected by others.  In addition, we will continue to gather information on the collection of original data as PAR and other journals adopt TOP guidelines.

Q: What are the “badges” about and how are they earned? 

A: In signing on to the Transparency and Openness Promotion initiative, PAR’s Editors have the option of providing badges to authors.  To be clear, all of the badges are about disclosing actions, not about requiring actions.  PAR’s Editors offer three different badges to authors:

1. The Open Data badge is earned for making publicly available the digitally-shareable data necessary to reproduce the reported results. Full criteria (cf. https://cos.io/top ):

1.1 Digitally-shareable data are publicly available on an open-access repository. The data must have a persistent identifier and be provided in a format that is time-stamped, immutable, and permanent (e.g., university repository, a registration on the Open Science Framework, or an independent repository at www.re3data.org )

1.2 A data dictionary (for example, a codebook or metadata describing the data) is included with sufficient description for an independent researcher to reproduce the reported analyses and results. Data from the same project that are not needed to reproduce the reported results can be kept private without losing eligibility for the Open Data Badge.

1.3 An open license allowing others to copy, distribute, and make use of the data while allowing the licensor to retain credit and copyright as applicable. Creative Commons has defined several licenses for this purpose, which are described at www.creativecommons.org/licenses . CC0 or CC-BY is strongly recommended.

2. The Open Materials badge is earned by making publicly available the components of the research methodology needed to reproduce the reported procedure and analysis. Full criteria (cf. https://cos.io/top ):

2.1 Digitally-shareable materials are publicly available on an open-access repository. The materials must have a persistent identifier and be provided in a format that is time-stamped, immutable, and permanent (e.g., university repository, a registration on the Open Science Framework, or an independent repository at www.re3data.org ; this includes the Public Administration Review article webpage).

2.2 Infrastructure, equipment, biological materials, or other components that cannot be shared digitally are described in sufficient detail for an independent researcher to understand how to reproduce the procedure.

2.3 Sufficient explanation for an independent researcher to understand how the materials relate to the reported methodology.

3. The Preregistered badge is earned when authors pre-registered their study design in a publicly verifiable location with a fixed time stamp. Full criteria (cf. https://cos.io/top ):

The Preregistered badge is earned for having a preregistered design. A preregistered design includes:

(1) Description of the research design and study materials including planned sample size, (2) Description of motivating research question or hypothesis, (3) Description of the outcome variable(s), and (4) Description of the predictor variables including controls, covariates, independent variables (conditions). When possible, the study materials themselves are included in the preregistration.

Criteria for earning the preregistered badge on a report of research are:

  1. A public date-time stamped registration is in an institutional registration system (e.g., ClinicalTrials.gov, Open Science Framework, AEA Registry, EGAP).
  2. Registration pre-dates the intervention.
  3. Registered design corresponds directly to reported design.
  4. Full disclosure of results in accordance with registered plan.

Badge eligibility does not restrict authors from reporting results of additional analyses.