How the Rapid Response Fund was allocated

In the span of six weeks, the Rapid Response Fund was launched and allocated, with the last few payments being made by the end of this week. It feels very strange to be on the other side of things and, even though we’re not quite finished yet, I wanted to take the opportunity to reflect on the decision-making process while it was fresh in my head.

When it came to the decision-makers, we had 14 people step forward, of whom 13 took part in the process (the last applied shortly before we closed the fund, so we never had an opportunity to involve them this time). Between them, they sat on five panels in different combinations, to mix different opinions together and to minimise any conflicts of interest they had made known to us.

The design process we ran earlier in the month had concluded that decision-makers should be compensated for their time – this was, after all, a huge responsibility which asked a lot of them.

Conflicts of interest


These aren’t always a bad thing in participatory grantmaking, because the whole point is that communities are given the power to make decisions about things which impact their lives. This necessarily means that decision-makers are more likely to know the people or organisations asking for funding.

Participatory grantmaking asks communities to deal with the more difficult questions around trust and transparency, rather than put applicants through a lengthy process or exclude people who are already active.

The process

Other than filtering out applications which clearly did not meet the basic criteria and then redacting personal information on the applications themselves, we did not interfere with the application process in any way. All applications were assessed in chronological order until the funding ran out.

For each application, there was a Google Form to fill in which asked the decision-makers to score the applications out of five against the four criteria which came out of the design process. There was also space after each question to add additional comments that they wanted us to bear in mind for due diligence.

The scoring system used to decide whether applications were funded was:

  • For projects asking for £100–£2,999.99, a simple majority (an average score above 50%) secured funding
  • Projects asking for £3,000–£5,000 needed a qualified majority (above 70%):
    • 50% or below = the proposition is rejected
    • Above 70% = the proposition is accepted
    • Between 50% and 70% = the proposition is passed to another panel
    • If the second panel does not give it above 70%, the proposition is rejected
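The threshold logic above can be sketched as a short function. This is purely illustrative – the fund used Google Forms and manual tallying, not code – and it treats a score of exactly 50% as a rejection, a boundary case the rules leave slightly ambiguous:

```python
def decide(avg_score_pct, amount, second_panel=False):
    """Apply the fund's thresholds to a panel's average score.

    avg_score_pct: the panel's average score expressed as a percentage
    of the maximum possible (e.g. an average of 3.5 out of 5 -> 70.0).
    amount: the sum requested, in pounds (£100-£5,000).
    second_panel: True if this is the second panel's review.
    """
    if amount < 3000:
        # Smaller grants (£100-2,999.99): a simple majority suffices.
        return "accepted" if avg_score_pct > 50 else "rejected"
    # Larger grants (£3,000-5,000): a qualified majority is needed.
    if avg_score_pct > 70:
        return "accepted"
    if avg_score_pct <= 50:
        return "rejected"
    # Mid-range scores go to a second panel; if that panel also fails
    # to score it above 70%, the proposition is rejected.
    return "rejected" if second_panel else "refer to second panel"
```

For example, a £4,000 request scoring 60% would be referred to a second panel, and rejected if that panel also scored it at 70% or below.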

We estimated that it would take between 4 and 5 hours to complete each batch of applications, assuming someone spent about 20–30 minutes on an application. As such, we set the payment at £50.00 per completed batch, roughly in line with the London Living Wage (£10.53/hour) and with what someone could expect from taking part in a research project with a similar time commitment. We also offered alternative compensation of equivalent value, as cash payments do not work for everybody, often due to employment status.

  • All of the decision-makers went for cash payments
  • Ten respondents were women, four were men
  • On average, they spent about 4 hours on a batch of applications.
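As a back-of-the-envelope check on those figures (the implied batch size is my inference, not something stated in the post):

```python
# Figures from the post: £50.00 per batch, 4-5 hours per batch,
# 20-30 minutes per application, London Living Wage £10.53/hour.
payment = 50.00
hours_low, hours_high = 4.0, 5.0

rate_low = payment / hours_high   # £10.00/hour if a batch takes 5 hours
rate_high = payment / hours_low   # £12.50/hour if a batch takes 4 hours

# Implied batch size from the per-application estimate:
apps_min = hours_low * 60 / 30    # 8 applications at 30 minutes each
apps_max = hours_high * 60 / 20   # 15 applications at 20 minutes each

print(f"£{rate_low:.2f}-£{rate_high:.2f} per hour, "
      f"~{apps_min:.0f}-{apps_max:.0f} applications per batch")
# → £10.00-£12.50 per hour, ~8-15 applications per batch
```

At the midpoint of 4.5 hours the implied rate is about £11.11/hour, so £50 per batch does sit at or slightly above the London Living Wage.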

Difficult choices

About half of the decision-makers had been involved in the design process, which led to an interesting split in people’s attitudes. The design process involved introducing people to what participatory grantmaking could look like – they were presented with three short case studies, which you can read here, and they also had the chance to ask questions of Geraud and me.

Decision-makers were given a brief document outlining the ‘rules’ of the grant process, but they were not given any introduction to participatory grantmaking – we weren’t testing anything out; it was simply down to a lack of time. My hunch from an initial pass at the data is that the average score given by designers is slightly higher than that given by those who weren’t involved in the design process.

In addition to this, decision-makers who weren’t designers expressed a lot more anxiety around funding than the others, often asking for more detail and itemised breakdowns, and putting a lot of emphasis on their personal culpability if something they funded didn’t work out as planned. They were also more likely to express a personal opinion that a project could be done on a tighter budget, whereas the others trusted that people needed the amount of money they said they needed. The designers had wanted the fund to be simple and accessible, and so were more accepting of applications which were simple and short when it came to making decisions.

In future, I would like to take the time to introduce all decision-makers to participatory grantmaking. It doesn’t need to be a lengthy process – just enough that they feel better equipped to make decisions. Given the number of follow-up questions I had to answer about why the process was the way it was, it seems like a useful time investment; it also serves to bring the decision-makers further into the work and, hopefully, to inspire them to take part in future design sessions.

Trust me, I’m a grantmaker?

Trust is at the core of participatory grantmaking, and the thing about trust is that you can’t conjure it out of thin air – it has to be based on something, and each of us has a different comfort threshold. The people involved in the design group had been given the opportunity to build up trust with their community, with us as a funder, but also with themselves – the design process meant they had to trust their own experiences and examine their perspectives.

What this means for our work is that there isn’t a one-size-fits-all approach – some people and communities will require more time to build strong foundations of trust. This process is therefore slower than a top-down approach, but it has the potential to encourage networking, community, and collaboration in a way that traditional funding models do not.

Onwards

Overwhelmingly, the decision-makers showed a desire to help the community. One of the more interesting observations I made was that even those who made negative comments about applications would still score them fairly if the applicants had demonstrated they met the criteria. Given the way applications were scored, someone who wanted to scupper an application would have scored it with 1s and 2s, but these were a fairly uncommon occurrence.

When they did occur, the decision was then backed up with an explanation – the panel members implicitly knew what a low score might mean for an applicant and wanted to put that score into context. What this meant, however, was that more applications got funding than I expected from just looking at the comments – we were then able to use any criticisms as part of our due diligence and to frame the agreements we made with applicants.

This is just one fund so I’m not going to make sweeping statements about the inherent good/evil nature of humanity because I don’t think things are that simple. I think what this fund shows us is that this group of people were willing to engage their community and to take their ideas in good faith – the task ahead of us is to find out what we as a funder need to do to encourage people to participate in our work.

We’ll be doing a more formal evaluation with the decision-makers, which will provide both hard and soft data to help inform our work going forward and allow us to make decisions led by people who are part of Barking & Dagenham.

This was by no means a perfect process but I’d like to think it was a pretty good one.
