2023 ASEE Annual Conference & Exposition

Overlooked, Underlying: Understanding tacit criteria of proposal reviewing during a mock panel review

Presented at Exploration of Written and Team Communication

Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field. Yet the criteria upon which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. To our knowledge, no study has evaluated how reviewers develop the skills that are key to the peer review process. This paper reports findings from a peer reviewer training program exploring how scholars evaluate NSF grant proposals. Using the lens of transformative learning theory, this study seeks to answer the following research questions: 1) What are the tacit criteria that inform recommendations in grant proposal reviews among scholars new to the review process? 2) To what extent do these tacit criteria and the resulting recommendations change after participation in a mock panel review?

Prior literature shows that reviewers hold implicit biases and personal epistemologies that influence their reviews. Peer review recommendations may be driven by previously constructed schemas born of a reviewer's professional training, disciplinary culture, or prior experiences. An added complexity of the NSF grant proposal review process is the panel discussion among reviewers. Unlike in manuscript review, reviewers engage in active dialogue that reflects multiple perspectives, levels of experience, and disciplinary backgrounds. These discussions explicitly reveal the bases upon which reviewers make their recommendations, which opens the potential for transformative learning to take place.

Our study is situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program officers with experience running discipline-based education research (DBER) panels. Participants were mentored in developing inclusive and constructive reviewing practices in quads of one mentor and three mentees. Mentees reviewed three proposals previously submitted to the NSF. Mentees in each quad individually drafted pre-panel reviews addressing each proposal's intellectual merit, broader impacts, and strengths and weaknesses relative to solicitation-specific criteria. Mentees then participated in mock panel reviews, each facilitated by two mentors. Groups discussed and ranked each proposal and drafted funding recommendations. Following the mock panels, mentees could revise their pre-panel reviews based on the panel discussion.

Using a holistic case study approach focused on one mock review panel, we conducted document analyses of mentees' reviews, analyzing reviews from six participants before and after their participation in the mock panel. Findings from this study will provide a better understanding of the bases upon which professionals evaluate grant proposals, and the extent to which their perspectives change after participating in panel discussions. These results have the potential to inform review panel practices as well as training methods that support new reviewers in DBER fields.

Authors
  1. Ms. Evan Ko, University of Illinois at Urbana-Champaign
  2. Prof. Rebecca A. Bates, Minnesota State University, Mankato
  3. Dr. Gary Lichtenstein, Arizona State University