The peer review process for awarding funds to international science research consortia: a qualitative developmental evaluation [version 1; referees: 1 approved, 1 approved with reservations]

Gregorius, Stefanie, Dean, Laura ORCID: https://orcid.org/0000-0002-4910-9707, Cole, Donald C and Bates, Imelda ORCID: https://orcid.org/0000-0002-0862-8199 (2017) 'The peer review process for awarding funds to international science research consortia: a qualitative developmental evaluation [version 1; referees: 1 approved, 1 approved with reservations]'. F1000Research, Vol 6, Issue 1808.

Text (Version 3): 12496_-_Imelda_Bates_V3.pdf — Available under License Creative Commons Attribution.
Text (Published Version): 34b23c5a-3c26-4706-907e-67854506b728_12496_-_imelda_bates.pdf — Available under License Creative Commons Attribution.
Text (supporting email): FW Your article is now published.msg — Restricted to Repository staff only.

Abstract

Background: Evaluating applications for multi-national, multi-disciplinary, dual-purpose research consortia is highly complex. There has been little research on the peer review process for evaluating grant applications and almost none on how applications for multi-national consortia are reviewed. Overseas development investments are increasingly being channelled into international science consortia to generate high-quality research while simultaneously strengthening multi-disciplinary research capacity. We need a better understanding of how such decisions are made and their effectiveness.

Methods: An award-making institution planned to fund 10 UK-Africa research consortia. Over two annual rounds, 34 out of 78 eligible applications were shortlisted and reviewed by at least five external reviewers before final selections were made by a face-to-face panel. We used an innovative approach involving structured, overt observations of award-making panel meetings and semi-structured interviews with panel members to explore how assessment criteria concerning research quality and capacity strengthening were applied during the peer review process. Data were coded and analysed using pre-designed matrices which incorporated categories relating to the assessment criteria.

Results: In general, the process was rigorous and well-managed. However, a lack of clarity about the differential weighting of criteria, and variations in the panel’s understanding of research capacity strengthening, resulted in some inconsistencies in the use of the assessment criteria. Using the same panel for both rounds had advantages: during the second round, consensus was achieved more quickly and the panel focused more on development aspects.

Conclusion: Grant assessment panels for such complex research applications need to have topic- and context-specific expertise. They must also understand research capacity issues and have a flexible but equitable and transparent approach. This study has developed and tested an approach for evaluating the operation of such panels and has generated lessons that can promote coherence and transparency among grant-makers and ultimately make the award-making process more effective.

Item Type: Article
Additional Information: Version 1 published 06/10/2017 (doi: 10.12688/f1000research.12496.1); latest version published 16/01/2018 (doi: 10.12688/f1000research.12496.3)
Subjects: W General Medicine. Health Professions > W 20.5 Biomedical research
W General Medicine. Health Professions > Professional practice > W 88 Administrative work. Teaching. Research
WA Public Health > Health Administration and Organization > WA 530 International health administration
Faculty: Department: Clinical Sciences & International Health > International Public Health Department
Digital Object Identifier (DOI): https://doi.org/10.12688/f1000research.12496.3
Depositing User: Rachel Dominguez
Date Deposited: 05 Jan 2018 11:45
Last Modified: 25 Jan 2018 14:24
URI: https://archive.lstmed.ac.uk/id/eprint/7672
