Frequently Asked Questions

Some commonly asked questions about the work of Foundation Practice Rating.

Which foundations are covered?

The FPR assesses only charitable grant-making foundations. We have not included public grant-making agencies (such as local authorities or the research councils) because those have other accountability mechanisms.

How many foundations do you rate each year?

Around 100.

Will this include all UK grant-making foundations?

No, simply because there are more foundations than our resources allow us to assess. We will assess:

1. All the foundations funding this project. They are all trying to learn and improve their own practice; this project is not about anybody hectoring anybody else. And:

2. A subset of other foundations. We will include all types of charitable grant-making foundation, e.g., endowed foundations, fund-raising foundations, family foundations, corporate foundations, and community foundations. We will not include public grant-making agencies (e.g., local authorities or the research councils) because they have other accountability mechanisms. Our subset will be a mix of sizes (e.g., some of the largest, some mid-sized, some smaller ones), chosen at random from the list of the UK’s largest 300 foundations* published in the Foundation Giving Trends 2019 report by the Association of Charitable Foundations, plus the UK’s community foundations. We know that the money given is heavily skewed towards the largest foundations (the UK’s largest 10 foundations give over a third of the total given by the UK’s largest 300 or so, and giving by the Wellcome Trust alone is 12% of that), but we also know that the transactions that many charities experience cover the whole range of foundation sizes.

*In fact, that report details 338 foundations. Our set will comprise those, plus 45 community foundations (the 47 listed by UK Community Foundations minus the two for which no financial information is given), i.e., 383 foundations in total.

How did you decide the sample for this year's Foundation Practice Rating?
We assessed:
  1. all the foundations funding this project. That is because this project is not about anybody pointing the finger at anybody else: the funding foundations are all being assessed as part of their own work on improving. The foundations funding this project are listed on the partners page [1].
  2. the five largest foundations in the UK (by grant budget). This is because they are so large relative to the overall size of grant-making: the UK’s ten largest foundations give over a third of the total given by the UK’s largest 300 or so foundations. Giving by the Wellcome Trust alone is 12% of that.
  3. a stratified random subset of other foundations. We took the list of the UK’s largest foundations as published in the Foundation Giving Trends 2019 report by the Association of Charitable Foundations[2], plus the UK’s community foundations listed by UK Community Foundations[3] for which financial information is given. That gave 383 foundations. We then took a random sample, drawing a fifth from the top quintile (by annual giving budget), a fifth from the second quintile, and so on. A sketch of this procedure is shown below.
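For illustration, here is a minimal sketch of that stratified sampling in Python. It is not the FPR’s actual code: the function name and data structure are hypothetical, assuming only a list of (name, annual giving) pairs, from which an equal number is drawn at random from each of five size quintiles.

```python
import random

def stratified_sample(foundations, n_per_quintile, seed=None):
    """Draw n_per_quintile foundations at random from each size quintile.

    `foundations` is a hypothetical list of (name, annual_giving) tuples,
    used here purely for illustration.
    """
    rng = random.Random(seed)
    # Rank by annual giving budget, largest first, then cut into five quintiles.
    ranked = sorted(foundations, key=lambda f: f[1], reverse=True)
    size = len(ranked) // 5
    sample = []
    for i in range(5):
        start = i * size
        end = start + size if i < 4 else len(ranked)  # last quintile takes any remainder
        sample.extend(rng.sample(ranked[start:end], n_per_quintile))
    return sample
```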


[1] The foundations funding this project include the Joseph Rowntree Reform Trust and Power to Change, neither of which is a registered charity. They are the only two non-charities included.

[2] https://www.acf.org.uk/policy-practice/research-publications/

[3] https://www.ukcommunityfoundations.org/

Are there any exemptions to the criteria?

We have excluded some criteria where they are not relevant to an organisation, e.g. if you have a small staff and/or trustee team (under 49 people) then we would not expect you to report on the gender pay gap. You can find a full list of exemptions here.

I’m a UK foundation. Can I opt into this?

Yes, by becoming a funder of the project. If you are not a funder then you may be randomly chosen. Please contact Danielle Walker Palmour at danielle.walker@friendsprovidentfoundation.org.uk if you want to be part of the funding group and guarantee that you will be assessed.

What is the timetable for what happens next?

The research is complete.

The Foundation Practice Rating report will launch on 22 March at 2pm. More details to follow.

Is this rating system just analysing our communications?

No. The rating assesses how transparent an organisation is: it looks at what the organisation publishes, e.g. about its decision-making on grants.

Is this an index or a ranking?

No. It is very important to stress that the system provides a rating of foundations, not a ranking.

Why a rating?

We chose a rating rather than an index or ranking because prospective applicants experience absolute performance. A ranking is a zero-sum system: if somebody rises, somebody else must fall, so one organisation’s gain comes at another’s expense. That is not how foundation practice works.

Who funded this project?

Check out the partners page to see who has funded this project.

How do I sign up to get updates on this project?

Keep an eye on the Twitter feeds of Giving Evidence’s Director Caroline Fiennes here and of Friends Provident Foundation here.

Alternatively, you can sign up by contacting Giving Evidence here.

Who did the research?

The design of the rating system (including defining the criteria, the scoring system, and the research process) was led by Giving Evidence, an independent consultancy that works to encourage and enable giving based on sound evidence. Giving Evidence also produced the research and analysis for the ratings.

Will you publish the report in Welsh?

Yes, we will publish the report in Welsh.

What are the three pillars?

The FPR covers three ‘pillars’:

  • Diversity. The extent to which a foundation reports on the diversity of its staff and trustees, the extent to which it reports on its plans to improve its diversity, and how well it caters for people who prefer or need to communicate in different ways, i.e., how accessible it is. (We did not look at issues such as how well foundations capture views from a diverse set of stakeholders to inform their work, nor at the diversity of the work they fund.)
  • Accountability. How can anyone who wants to examine the work or decisions of a foundation after the event do so, and make their voice heard?
  • Transparency. Does a potential grantee have access to the information it needs to contact the foundation, to decide whether to apply for funding, and more generally before any grant is made?

How do you calculate a foundation’s score?

We decided to convert each foundation’s score into a grade, in order to make the results easy to digest. We chose a system of four grades, from A (the top) to D, partly because various UK public-sector rating and quality-assessment systems also use four (e.g., Ofsted’s ratings of schools, HM Inspectorate of Prisons’ system, and the Care Quality Commission’s system). We chose A–D because those grades are easy to understand.

We are publishing each foundation’s grade on each pillar but not the numerical scores. This is to prevent a ranking being constructed from the data, which we feel would be unhelpful for the reasons given earlier. 

We decided not to take a simple average of the three pillars for the overall score: we felt this would not be fair, because an excellent overall score (A) should require a certain level of achievement in all three areas, rather than an outstanding score in just one or two. This approach is not unusual. For example, if Ofsted rates a school as ‘inadequate’ on any of the four ‘buckets’ of criteria it assesses, the school is rated ‘inadequate’ overall: in other words, a school’s overall rating will not be higher than its lowest component rating[1].

We use the same principle: if a foundation scores badly on any pillar, it cannot be said to be excellent or to warrant a high overall rating.

Please see the examples below (D, A and T are the Diversity, Accountability and Transparency pillars):

| Foundation | D score | A score | T score | Rating based on the numerical average of its pillar scores | Actual overall rating | Reason |
|---|---|---|---|---|---|---|
| 1 | A | B | A | A | A | Lowest score (B) raised by one is the same as the simple average |
| 2 | A | C | A | A | B | Lowest score (C) raised by one is B, which is lower than the average score |
| 3 | B | B | B | B | B | Simple average is a B, and there is no reason to lower it |
| 4 | D | A | A | B | C | The lowest score (D) raised by one is a C, which is lower than the simple average (B). This foundation is dragged downwards by its poor performance on diversity. |
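To make the rule concrete, here is a minimal sketch in Python. It is an illustration rather than the FPR’s actual implementation: the numeric mapping (A=4 down to D=1) is an assumption, and the average is rounded up to the next whole grade, which is consistent with the ‘average’ column in the table above. The overall grade is then capped at one grade above the lowest pillar.

```python
import math

# Hypothetical numeric mapping -- the FPR publishes grades, not numbers.
GRADE_TO_NUM = {"A": 4, "B": 3, "C": 2, "D": 1}
NUM_TO_GRADE = {v: k for k, v in GRADE_TO_NUM.items()}

def overall_grade(diversity, accountability, transparency):
    scores = [GRADE_TO_NUM[g] for g in (diversity, accountability, transparency)]
    # Average of the three pillars, rounded up to the next whole grade
    # (an assumption, consistent with the table above).
    average = math.ceil(sum(scores) / len(scores))
    # The overall grade cannot exceed the lowest pillar grade raised by one.
    capped = min(average, min(scores) + 1)
    return NUM_TO_GRADE[capped]

# The four worked examples from the table:
assert overall_grade("A", "B", "A") == "A"  # foundation 1
assert overall_grade("A", "C", "A") == "B"  # foundation 2
assert overall_grade("B", "B", "B") == "B"  # foundation 3
assert overall_grade("D", "A", "A") == "C"  # foundation 4
```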

[1] Ofsted (2021), School inspection handbook: accessed at https://www.gov.uk/government/publications/school-inspection-handbook-eif/school-inspection-handbook#reaching-a-judgement-of-outstanding [10 January 2022]

How much does it cost?

Year 1 cost £141,667, which includes consultation, publication, communications, and research and development.

For subsequent years, we estimate the cost at £87,500.

The 10 funders, or “the partners”, have covered the costs of the project. We are always looking for more organisations to support this important work.