Privately-funded trusts and foundations are almost uniquely unaccountable. They are independent of government, and most do not need to fundraise, so they rely on no one else. This has its strengths: they are not swayed by political agendas, for example, and can support causes that may be unpopular. It allows them to speak truth to power, if they choose. But in the balance of power between them and those they fund, they hold all the cards. Apart from their duty to comply with charity law, they can operate largely with impunity. As Danielle Walker-Palmour, director of Friends Provident Foundation (FPF), says: “Only in philanthropy is it still common practice to use family membership as the sole qualification for inclusion in decisions on the deployment of large sums of capital.”
Foundations are also perceived as somewhat opaque: only 218 UK funders publish their grants data through 360Giving, and the sector has not devised any common standards for reporting on grants or investments beyond what regulation requires.
The sector also falls short on diversity. The 2018 research for the Association of Charitable Foundations (ACF) by Bayes Business School threw this into sharp relief, showing that 99% of UK foundation trustees are white, two-thirds are men, and three in five are aged 64 or over. Anecdotally, similar patterns are believed to exist across foundation staff teams, too.
For a sector with assets of more than £62bn, one that gives out more than £4.7bn even in a non-pandemic year, this is not good enough. Many philanthropic grantmakers exist to tackle inequality and disadvantage, to support those on the margins, and to promote social justice. But if they do not represent the communities they seek to serve, refuse to open themselves up to scrutiny, and are unwilling to learn and improve, they cannot make best use of their resources or maximise their impact.
In recent years, awareness of these shortcomings has grown within the sector, and the picture they paint has become increasingly unpalatable. Now a group of progressive foundations, led by FPF, has come together to address these issues. Realising that their own objectives to improve society cannot properly be met unless they lead by example, 10 forward-thinking grantmakers (see Figure 1, available in the PDF at the end of this article) have launched a project to measure their own performance, and that of their peers, on diversity, accountability and transparency. What’s more, they have put their money where their mouths are and agreed to fund the costs of the research. The aim of the work, which will continue annually, is to identify and promote good practice and, where failings are uncovered, to encourage those trusts and foundations to improve. The hope is that the work will inspire a new culture of openness and accountability across the sector.
In this, the first year of the project, the Foundation Practice Rating assessed 100 UK foundations – all but two of which are charities – on 90 questions grouped under the three pillars of diversity, accountability and transparency (see Figure 2). Using only information that is in the public domain, the researchers analysed the practices of the selected trusts and gave each one a score on each question, with the total score used to calculate a rating for each of the pillars (see Figure 3). Where possible, they assessed the foundations against standards and ratings that already exist, such as the Social Mobility Employer Index, Glasspockets’ Transparency Standard and the Racial Equality Index.
Within each pillar, each criterion had equal weight, and the three pillars themselves were equally weighted. The three pillar scores were then averaged to produce an overall rating for each participating trust.
How it worked
The researchers scoured publicly available information, on websites and in annual reports, to assess each grantmaker. They mimicked a grant-seeker examining a potential new funder. Each researcher spent up to 90 minutes looking at each foundation, as that was the maximum time they estimated that a prospective grant applicant would take. The research was done without foundations’ permission and the foundations assessed had no influence over the findings.
The questions were published before the research began, and a public consultation in May and June 2021 invited feedback. Once the final criteria were determined, they were published and promoted, along with guidance on how to do well against them. Every item used as a criterion is provided by at least one foundation in the sample, ensuring that none of the criteria is impossible to meet.
Data on each trust were gathered from September to December 2021. Once the research was complete, all the trusts were sent the data about themselves to check before the final report was compiled.
Each foundation was assigned a score of A, B, C or D for each pillar (with A being highest), plus an overall score (see Figure 4).
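For readers who want to see the scoring arithmetic end to end, the sketch below implements equal-weighted criteria within each pillar, an equal-weighted average across the three pillars, and a mapping from scores to letter grades. It assumes a simple pass/fail result per criterion; the grade thresholds and example data are hypothetical, as the report’s actual cut-off points are not reproduced here.

```python
# A minimal sketch of the scoring scheme described above. The structure
# (equal-weight criteria within each pillar, equal-weight pillars,
# overall = average of the three) follows the article; the grade
# thresholds below are purely illustrative.

PILLARS = ("diversity", "accountability", "transparency")

def pillar_score(criteria_results: list[bool]) -> float:
    """Each criterion carries equal weight: score = fraction of criteria met."""
    return sum(criteria_results) / len(criteria_results)

def overall_score(pillar_scores: dict[str, float]) -> float:
    """Each pillar carries equal weight: overall = simple average."""
    return sum(pillar_scores[p] for p in PILLARS) / len(PILLARS)

def grade(score: float) -> str:
    """Map a 0-1 score to a letter. These bands are hypothetical."""
    for threshold, letter in ((0.75, "A"), (0.5, "B"), (0.25, "C")):
        if score >= threshold:
            return letter
    return "D"

# Example: one foundation assessed against a handful of criteria per pillar.
results = {
    "diversity": [True, False, False, False],
    "accountability": [True, True, False],
    "transparency": [True, True, True, False],
}
scores = {p: pillar_score(r) for p, r in results.items()}
print({p: grade(s) for p, s in scores.items()}, grade(overall_score(scores)))
```

Because every criterion and every pillar counts equally, no single disclosure can rescue or sink a foundation’s overall grade on its own.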
The project leader, FPF, and the organisation that designed and conducted the research, Giving Evidence, underlined that the system was deliberately designed to provide a rating of foundations, as opposed to a ranking (which scores participants relative to each other) or an index (which is intended to track changes over time). A rating gives absolute performance, meaning that if all participants are doing well, they can all receive a high score; likewise, if several are falling short in an area, this will be evident too.
The researchers noted that the data published were gathered at a specific point in time, and some may no longer be correct by the time of publication. FPF’s Walker-Palmour said: “Foundation practice constantly evolves. Some foundations told us that they had recently recruited new trustees, or got a new policy for something, which takes a while to be reflected on their websites and even longer to get into annual reports. So our research is a bit like trying to capture a moving target, and one with a lag; material in annual reports takes particularly long to be updated.”
Disclosure, not activity
The team also emphasises that because the research relied wholly on publicly available information, it relates only to what a foundation discloses, which could be different from what it actually does. Walker-Palmour explained: “For example, if a foundation does an excellent job involving a diverse group of stakeholders but does not talk about that in its published material, it gets no credit for that in our scoring – even if we know about that excellent work from our wider work in the sector.”
The team stresses that results on diversity of staff and trustees are based on disclosure, not performance. Scores were assigned based on whether foundations publish their diversity breakdown, not on the diversity itself. This means that points were awarded for disclosing the gender, ethnicity and disability of staff and trustees, even if those staff and trustees are all non-disabled white men.
Similarly, if there was a genuine reason why a foundation did not have a policy on something within the criteria, and it explained why in publicly available material, it was given credit for that rather than being marked down.
Three As overall
Just three foundations achieved the top rating of A: Wellcome, the largest foundation in Europe; the Blagrave Trust, an endowed funder of youth organisations and young people with assets of around £42m – the fourth of five quintiles by giving budget; and the County Durham Community Foundation, a fundraising foundation. Walker-Palmour said: “These three are very varied. This suggests that good practice is not dependent on any one structure or size.”
Some 41 foundations scored B overall; 28 scored C, and 28 scored D (see Figure 3).
Overall, the sampled foundations did best on transparency, and worst on diversity. The average score for transparency across the whole sample was B, for accountability it was C and for diversity it was D.
Some 53 foundations scored at least one A on the pillars, with 17 scoring two. But low scores were just as evident: 47 trusts scored at least one D, and 22 of the foundations assessed scored Ds across the board.
Best practice
Community foundations scored better than average, as did the group of 10 foundations funding the project, all of which scored A or B overall. Of the five community foundations included, one scored A overall and the other four were rated B.
The best collective scores related to: publishing an investment policy; having a website; stating who the staff are; and publishing details of funding priorities and past or existing grantees.
The researchers highlighted several examples of excellent practice, including:
- An appeals process for rejected applicants (County Durham Community Foundation).
- Easily visible buttons across the top of a website which enlarge the text on all pages (Cumbria Community Foundation).
- Clear presentation of funding priorities in various formats – PDF, video and slideshow (Lloyd’s Register Foundation).
Biggest room for improvement
The worst collective scores related to: publishing a breakdown of the diversity of trustees and staff, and any plan to improve that; publishing in Welsh; and providing contact mechanisms for disabled people.
The 22 foundations that scored D on all three pillars span the size range by giving budget. None of them has a website, and around two in five provide no email address.
The researchers found several instances where foundations require something from their grantees which they do not do themselves. Examples of this include paying the living wage, consulting with beneficiary communities about priorities, and having complaints or whistleblowing procedures.
Another anomaly they highlighted was the insistence that successful grant applicants produce evidence of their impact, when the foundations did not provide any analysis of their own success. Caroline Fiennes, director of Giving Evidence, said: “A reason that foundations give is that analysing a grantmaker’s impact can be difficult because its effects are vicarious through its grantees. In our view, this is a poor excuse.”
Fiennes says that there is plenty a foundation can do to assess its own success, such as analysing the proportion of grants which meet its primary goals, against the proportion that don’t. “They can then compare that to the characteristics of the grants/grantees. It will show whether they succeed most often with grants in (say) Wales or Scotland, or small grants vs larger grants, or small grantees vs larger grantees. Almost all foundations’ work could be analysed in that way, and it would give great insight into how they can be most effective.”
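As an illustration of the kind of breakdown Fiennes describes, the sketch below computes the proportion of grants meeting their primary goal, grouped by a chosen characteristic. The data, field names and the small/large threshold are all invented for illustration; this is one plausible way to run such an analysis, not any foundation’s actual method.

```python
# A hypothetical sketch of the self-analysis Fiennes suggests: compute
# the proportion of grants meeting their primary goals, broken down by
# a grant characteristic. All data and field names are invented.

from collections import defaultdict

grants = [
    {"region": "Wales", "size": 5_000, "met_primary_goal": True},
    {"region": "Scotland", "size": 50_000, "met_primary_goal": False},
    {"region": "Wales", "size": 8_000, "met_primary_goal": True},
    {"region": "Scotland", "size": 4_000, "met_primary_goal": True},
    {"region": "Wales", "size": 60_000, "met_primary_goal": False},
]

def success_rate_by(grants, key_fn):
    """Proportion of grants meeting their primary goal, per group."""
    met, total = defaultdict(int), defaultdict(int)
    for g in grants:
        k = key_fn(g)
        total[k] += 1
        met[k] += g["met_primary_goal"]
    return {k: met[k] / total[k] for k in total}

# By region, and by small vs larger grants (the threshold is arbitrary).
print(success_rate_by(grants, lambda g: g["region"]))
print(success_rate_by(grants, lambda g: "small" if g["size"] < 10_000 else "large"))
```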
Does size matter?
Performance did not appear to depend on financial clout; no clear trends emerged according to either giving budget or net assets. Good and bad practice could be found in organisations of all sizes, showing that you don’t need to be large or wealthy to set high standards – or to sink to low ones.
Patterns did emerge, however, in relation to the size of teams and boards. Foundations with no staff tended to score lower than those with employees. The very largest (teams of 100+) scored best, but, interestingly, those with 11-50 employees had a higher proportion of Bs than those with 51-99 staff.
It was a similar picture with regard to the number of trustees: foundations with five or fewer tended to perform worse, with over half scoring a D overall. No foundation with 11 or more trustees scored a D overall. All three of the top scorers have six to 10 trustees (see Figure 4). There was a correlation between having more trustees and scoring better, particularly on accountability.
Diversity
This was easily the weakest pillar across the sample, with substantial room for improvement across the sector. All but three foundations scored C or D on diversity (which includes accessibility), and none scored an A. Sixteen foundations scored zero on diversity. (Foundations with 10 or fewer staff were exempted from the criteria on staff diversity, and those with five or fewer trustees were exempted from the trustee diversity questions.)
At the outset of the project, the researchers attempted to measure the diversity of the foundations’ staff and trustees in relation to gender, ethnicity and disability, but there was so little data available that they abandoned this ambition.
They found that, on staff, only four foundations – Barrow Cadbury, Power to Change, Wellcome and Comic Relief – publish a diversity breakdown; on trustees, the only foundation to do so was the Rhodes Trust, which publishes an ethnic breakdown. Only 14 trusts had a diversity plan for staff and, of these, only Wellcome provided any targets within its plan. Ten foundations had a diversity plan for their boards, with Esmée Fairbairn Foundation the only one to include targets for improvement. By contrast, 48% of FTSE 250 firms publish a board diversity policy.
The researchers said: “Though many foundations publicly affirm their commitment to equality, diversity and inclusion and provide statements indicating a willingness to improve, few of those statements contain clear targets or goals about how a foundation intends to improve its diversity over time. A statement is not a plan.”
The diversity pillar also included questions relating to accessibility, such as whether a foundation’s website met accessibility guidelines and whether the organisation offered different ways for people to get in touch or to apply for grants. The study found that standards generally fell below those required of the public sector and suggested that more foundations should proactively work with disabled people to review their websites and practices to ensure they are more inclusive.
Transparency
A total of 51 foundations scored an A for transparency but, despite some very good practice in this area, there was also some disappointingly poor practice. Just over a quarter of foundations didn’t even have a website; among some that did, the site was either too sparsely populated or too cluttered, making information hard to find.
The project team acknowledged that some foundations do not have websites, or do not disclose certain information, because of the nature of their work. “Some foundations which fund human rights work want to avoid attracting attention, particularly to their grantees, because that may imperil them.”
Rachel Hicks, head of marketing and communications at UK Community Foundations, said that one of the community foundations assessed for the Rating was in the midst of relaunching its website, and told her that the criteria questions had helped enormously to inform its design and content. “So the Rating is already meeting its targets of encouraging people to think about their practice,” she said.
Accountability
Eighteen foundations scored an A for accountability, but far more participants scored Cs and Ds than As or Bs. The researchers found that few trusts offered an obvious complaints mechanism, and some provided no email address or phone number.
Only around a third published any analysis of their own effectiveness, even though most ask this of the organisations they fund.
Looking to the future
FPF expects that the sample in future years will again comprise the 10-strong funders group and the five largest foundations, along with a fresh random but stratified sample of other foundations, which may or may not include those assessed this year. The project team acknowledged concerns raised by some foundations that, if they are not reassessed in future years, they will not have the opportunity to demonstrate improvement, and said they were considering how to tackle this. But they feel it is important for all foundations to realise that they could be included, so that every one of them has an incentive to improve. Rating a wider range of trusts would also provide a more faithful picture of progress across the whole sector.
For continuity, the criteria are likely to remain exactly the same as in this first year.
The full report, and a list of how each foundation performed, can be downloaded at www.foundationpracticerating.org.uk
Figure 1: Project funders.
The 10 project funders:
- Friends Provident Foundation
- Barrow Cadbury Trust
- The Blagrave Trust
- Esmée Fairbairn Foundation
- John Ellerman Foundation
- Joseph Rowntree Reform Trust
- Joseph Rowntree Charitable Trust
- Lankelly Chase Foundation
- Paul Hamlyn Foundation
- Power to Change
Figure 2: The criteria used for assessment.
Broadly, the principles set out for the three pillars were:
Diversity: The extent to which a foundation reports on the diversity of its staff and trustees; the extent to which a foundation reports on its plans to improve its diversity; and how well it caters for people who prefer or need to communicate in different ways, ie how accessible it is.
Accountability: Is it possible to examine the work or decisions of a foundation after the event, and to communicate with that foundation about these?
Transparency: Does a potential grantee have access to the information that it needs in order to contact the foundation, decide whether to apply for funding, or learn about it more generally in advance of any grant?
Download the figures from this article here.