I speak from experience; managing a university workflow to determine who gets to apply for external research funding, and who doesn’t, is a difficult and thankless task. No one likes to be told ‘no’. And I think we can all agree that, when you don’t know who made the decision nor why, it’s even harder to accept and to move on.
As limited submission calls increase in frequency, institutions may start to find that internal selection workflows are not only unsustainable but border on being unethical. And as funders start to examine our decision-making processes as part of wider equality, diversity and inclusion (EDI) action plans, perhaps the question we must ask ourselves is: how can we do this better?
Perspective of a Research Funder: limited submission calls manage demand
Competition for grant funding is fierce and funding budgets are stretched more than ever. The more applications received, the smaller the percentage of those successfully awarded. A significant concern, however, is the challenge of guiding large numbers of applications through the various stages of critiquing and assessment. The delicately balanced ecosystem of peer review is creaking under the strain of demand.
It’s worthwhile mentioning that there are multiple approaches that funders might take to implement demand management on a funding call, and limiting submissions on an institutional scale is just one option.
Other methods include restricting the number of applications that individuals can lead or co-lead, limiting the number of awards that will be made, or implementing a ‘no return’ policy. For example, individuals may only be able to apply to a scheme once in their career, or at a certain point in their career, or they may be subject to a ‘three strikes and you’re out’-type strategy that prohibits applicants from applying again after repeatedly submitting low-quality proposals.
For the purposes of this blog, we will be focusing on limited submission at the institutional scale, although many of the themes that we will touch on are relevant irrespective of the demand management approach.
Earlier this year the Engineering and Physical Sciences Research Council (EPSRC, part of the wider UK Research and Innovation, or UKRI) highlighted a concerning trend to its 36 Strategic UK-based Partner Universities. Peer review response data showed a drop in the number of researchers accepting invitations to review, and an increase in non-responders. This experience is not unique to EPSRC, however, nor is it solely a challenge faced by research funders.
We presented this data to our institution’s researcher community – their feedback on the hidden burdens of committing to peer review requests was entirely understandable, and sadly not surprising given the current extensive discussions around negative research cultures and inadequacies in academic performance review.
I won’t delve into this here, as I would be doing a great injustice to the community if I tried to sum up such a complex and emotive topic in a few sentences. Yet the impact of low peer reviewer response rates on the funding application journey is clear – larger volumes of applications coming through funder portals, and longer lead times on securing adequate peer review, put immense pressure on the funder to ensure applications receive a fair critique that enables informed and timely decision-making. It benefits applicants and funders alike if we strive for a more sustainable future for the peer review system.
Confessions of a Funder Grants Manager: peer review system pressures
I worked for a research funder for ten years, and one of my core responsibilities was identifying potential peer reviewers, contacting them and negotiating for the funding applications within my portfolio. I use the word ‘negotiating’ purposefully, as this is exactly what it is. Bargaining on timelines and expectations.
I have felt the rising panic of having to contact 20 or more academics for one application, in the hope of securing a minimum of three peer review reports in sufficient time before the looming Panel date. I found it to be the most challenging part of the role, as the time I needed to invest in this process varied wildly between applications and was largely out of my control. It did, however, evoke a real sense of camaraderie amongst the Grants team. When faced with the unspoken horror of not having secured enough peer review, we would often share suggestions of potential reviewers or new disciplinary avenues to explore.
Peer review, within the context of research funding, also refers to the role of the Panels or Committees that support the Funder and who reflect on the independent comments of the experts to rank the applications received. I’ve been very aware of the excessive pressures that a large funding round would have on our Committee Members.
Logistically, there really is a limit to how many applications one Panel can discuss in one day. But even more importantly, there is a limit to how many applications you can reasonably expect a Panel Member to read, given the large volume of paperwork involved and the time needed to read it in sufficient depth to make a recommendation on prioritisation.
Part of my role was to make the experience as comfortable for the Panel as possible – I can only begin to imagine how many cups of coffee and biscuits were consumed, and how many times I heard the phrase ‘we haven’t time to dwell on this one’.
Just a sidenote to any researchers reading this: funders look for peer reviewers and potential Panel members predominantly through keyword browser searches and your institution’s webpage. It’s a good idea, therefore, to keep your university profile up to date. Unless you would rather not be found, in which case you can ignore that last part!
Perspective of a Research Institution: limited submission calls add multiple layers of complexity
Once a limited submission call is published, those responsible for institutional pre-award support will all be asking themselves the same question – do we need to do ‘something’ to prepare for this? And what might that ‘something’ be?
I have had this discussion with my Research Manager and Administrator (RMA) colleagues on multiple occasions over the past four years of my career – and I am almost certain that many of you reading this have too. Each time a limited funding call is published, we must make a judgement on how to proceed.
This thought process will be influenced by the specific detail of the demand management measures in place, the focus of the call and breadth of the funder’s remit, and the degree to which your institution’s research strengths align with what the funder is looking for.
Commonly we use ‘hard’ or ‘soft’ to categorise the measures put in place by the funder to manage demand. A measure is ‘hard’ if it involves an actual cap on the number of submissions from an institution, or ‘soft’ if the funder simply recommends (or strongly encourages, in a shouty, bold-text kind of way) that institutions undertake internal prioritisation but does not impose a limit.
It could be a bit of both, as I’ve encountered previously, where the funder starts with a ‘soft’ approach but states that it may decide to introduce a ‘hard’ cap later in the application process, if it deems that the number of submissions in progress is exceeding a certain threshold (usually gauged following an ‘Intention to Submit’ or ‘Expression of Interest’ stage). So, we are faced with a conundrum; as Shakespeare might have said, had he been a pre-award RMA; ‘to sift, or not to sift, that is the question’. Because, let’s be honest, no one wants more processes, nor more forms to fill out, if we can possibly avoid it!
Confessions of a University Research Development Manager: pre-award barriers and inequalities
I think there is the perception that limited submission funding calls, or any type of demand management measure, inherently lead to higher quality funding applications. Whilst quality assurance may not necessarily be the primary reason to implement demand management, it could be considered a happy byproduct.
In theory, limiting the number of submissions from institutions has the potential to drive a more strategic and purposeful approach to grant proposal development and support. Sometimes this is true, as evidenced by the Natural Environment Research Council (part of UKRI) Demand Management Review 2015-2017, which showed a reduction in low-scoring and uncompetitive applications after a limited submission approach was introduced for Standard Grants. Yet more broadly, I fear the reality is far from ideal.
A recent study led by the Elizabeth Blackwell Institute, University of Bristol and MoreBrains Cooperative on the issues of equality, diversity, inclusion, and transparency in the process of applying for research funding delved into the depths of the pre-award environment.
Through six months of discussions with stakeholders from universities across the UK, the authors uncovered systematic biases, obstacles and inequalities experienced by researchers and RMA staff in accessing, and supporting access to, research funding calls.
The findings have been presented as a workflow diagram capturing the entire pre-award ecosystem, from initial conception of project ideas and identifying suitable funding mechanisms, through to finance approval and institutional sign-off. This report resonates deeply with me as a Research Development Manager (so much so, I volunteered my time to participate in the initial workshops).
The collective voice in the report gives visibility and credibility to the concerns that RMAs in the pre-award environment have been observing and experiencing for many years – previously shared solely in hushed conversations between colleagues during coffee breaks, or the exchange of uncomfortable looks when once again having to communicate the outcome from a questionable and opaque decision-making process.
Demand management of funding calls is mentioned only briefly, as an element of the much broader pre-award landscape, but the message is clear – many universities are simply not equipped with the right knowledge, tools and resources to ensure that limited submission workflows are implemented consistently and in a fair, transparent and equitable way.
Working Together: limited submission calls are just one part of a much bigger story
So how can we move forward? The University of Bristol and MoreBrains report provides an invaluable starting point – by pinpointing the gaps and inadequacies in current workflows, we can start to challenge what we do and how we do it. And by crystallising the study outputs into 11 recommendations for institutions and funders, the authors have taken an important step forward in supporting the practical implementation of possible solutions. But as the authors conclude,
‘… (the EDI) benefits will not in and of themselves be enough of an incentive to make substantive and challenging changes to the research system. They need to be reinforced with consequences for poor practices.’
Change takes resources (new or re-directed) and effective management. I can see many instances where the strategic action planning, and operational delivery of these changes will naturally fall on my RMA peers, who are experiencing their own daily struggles with excessive workloads and a never-ending to-do list.
We need a driver for change – to foster an ecosystem where poor limited submission workflows are bad for business, whilst equitable, transparent and inclusive workflows are championed as shining beacons of good practice. That’s where funders could play a pivotal role, and several funding bodies are already moving in this direction.
I am most optimistic about the plans emerging from UKRI for a deep dive into university internal sifting and selection processes within the EPSRC portfolio, as part of its EDI Action Plans. Strategic Partner Universities have received a short survey as the first step in a longer-term programme aimed at understanding current approaches to, and issues with, limited submission workflows.
It seems that funders, and UKRI as a particular example, are starting to appreciate that they have a vested interest in the pre-award environment, before an application is even submitted, should they truly want to realise goals in promoting inclusive practices across the research funding landscape. And there is so much that institutions can learn from each other, as well as from our funders, in designing, implementing and delivering on peer review assessment and decision-making.
Join me in a free webinar ‘Putting a Cap on it: the challenges and successes in creating workflows for limited submission research funding’ (10th October 2023), where I will give an insight into such workflows, drawn from my own experiences working both for a funder and in academia, as well as those shared with me by my peers. And we can start the conversation about how we do things better, which is the true call to action from the University of Bristol and MoreBrains report:
‘… funders, researchers, and research professionals within Higher Education Institutions (HEIs) need better connections and communications to increase transparency in the pre-award environment. If they are to present a united front to create systemic change, and work together to lobby for shifts in government policy where appropriate, they also need to be proactive in cultivating more mutual understanding and alliances.’
Find the University of Bristol and MoreBrains report at:
Bell, A et al. “If we use the strength of diversity among researchers we can only improve the quality and impact of our research” Issues of equality, diversity, inclusion, and transparency in the process of applying for research funding (2023) Elizabeth Blackwell Institute, University of Bristol and MoreBrains Cooperative. https://zenodo.org/record/8186347
About the Author:
Dr. Jaydene Witchell is a Research Development Specialist with a PhD in Molecular Biology and a passion for empowering researchers by making grant funding more accessible. Her expertise on the complexities of research funding is built upon ten years working for the global charitable foundation Wellcome Trust, managing the grant application process and a portfolio of Wellcome’s expert review panels. During that time she facilitated the assessment of over a thousand research grant proposals and read thousands more accompanying peer reviewer reports.
Jaydene was keen to use her funder panel knowledge in a more applied way and to support applicants during the grant writing and development phase. She is now nearing four years of working within university Research and Innovation Offices, firstly for Cranfield University and more recently the University of Southampton.
She inspires engagement and influences change by providing guidance to researchers navigating the funding landscape and by demystifying the grant application process. Jaydene is a keen advocate of the Research Manager and Administrator community and is starting a new venture in championing sharing of best practice between Research Development teams.
Jaydene will be a guest speaker for a free webinar we are hosting on October 10, 2023. You can learn more and register here.