Performance Ratings for Charities
[Image: checklist boxes for Excellent, Very good, Good, Average, and Poor, with Excellent checked. Photo by Dominik Gwarek from FreeImages]

Here are my thoughts on MoneySense’s latest attempt to rate charities. Overall, I think the approach is better than others we’ve seen in Canada because they went beyond the information available in the T3010 government return and asked the charities to complete a survey on topics such as governance, privacy and transparency. They also did not rank charities; they simply rated them. So, congratulations on a good effort.

Problems with the Ratings

However, in spite of the numerous cautions expressed in the ratings article, which I think are fully warranted, the rating system still gives a final grade as an overall assessment. That final grade undermines an otherwise fine attempt at rating charities. Few people will have the inclination to do any of the more nuanced follow-up work that MoneySense suggests, because (human nature being what it is) they will take the path of least effort and simply rely on the final grade. (How’s that for a pessimistic assessment of humanity! Sorry, but I think it’s true.)

How to Improve Charity Ratings

Here are some ideas for improving the rating system:

  • When examining fundraising costs, use a multi-year period instead of a single year at a time. While direct mail fundraising, for example, should raise immediate dollars, major gifts and deferred gifts can take a year or sometimes decades (in the case of bequests) to materialize. Capital campaigns usually take three years to convert pledges into cash receipts. So while the revenue comes in over an extended time, the costs to get that revenue tend to be incurred upfront. This makes older charities look better than younger charities because they are now receiving donations from work that was paid for years before. Using a three-year rolling average would be a more realistic way of determining fundraising costs as a percentage of money raised (see the brief sketch after this list). It’s still not perfect, but it’s better than using only one year.
  • For both overhead costs and fundraising costs, any rating system assumes that all charities allocate their expenses the same way. While I think charities are getting better at this due to changes in accounting rules and CRA guidelines, I suspect there is still wide variation in how costs are allocated, so the percentages that raters are so eager to calculate are probably not as solid as they believe. Even if everyone reported on the same allocation criteria, a number is just a number until it is compared to something useful. When assessing charities, the focus should not be on inputs such as administration and fundraising, but on the outcomes that those expenditures generate. A higher administration expense could lead to better oversight or better-quality staff, potentially achieving much greater social good. The real evaluation of a charity should be based on its ability to use its inputs for the greatest possible social good. So make outcomes the primary focus (effectiveness), and treat the cost of achieving those outcomes as an important but secondary consideration (efficiency). I’m not taking the time to hit the books while writing this post, but there is a good body of literature on measuring the effectiveness and efficiency of intangible missions such as many charities have.
  • Sarah Efron, the originator of this particular rating system, quite correctly acknowledges that there are complicating factors that make a pure number-crunching exercise inadequate on its own:
    • Some causes are more popular than others, so it is easier to raise money,
    • Some charities are household names while others are brand new and have to do a lot more promotion to get noticed,
    • Some charities make greater use of volunteers and thus have lower overhead costs,
    • Some charities work only locally and others nationally or internationally, adding to their oversight costs,
    • Some fundraise nationally, incurring greater costs, while others are closer to their donors because they only fundraise locally, and
    • Some charities operating in the same ‘business’ receive government funding and some do not.
    It therefore makes sense to highlight these differences in the report.
  • It would be more helpful if, in the governance rating section, the report mentioned whether or not a charity responded to the survey. A charity may score poorly because its governance is genuinely weak, or simply because it did not respond to inquiries like this. Both reflect negatively, but they are very different from each other. Poor governance is much more of a concern to me than not disclosing information. (I can think of no good reason, by the way, why the information should not have been provided.)
  • To be fair, charities should be sent a draft report and given a chance to add their comments to explain any results that they feel do not fairly reflect their operations.
  • The underlying assumption of a rating system for charities is that donors are interested in the “return on investment.” But when investment analysts assess corporate investments (stocks and bonds), they do not rely only on financial reports and checklists; they also visit the companies, interview their CEOs and dig into their strategies. Without this sort of analysis (suitably adapted for the charitable sector), any charity rating system is deficient. But who would pay for such a system? Foundations normally do their own due diligence, which includes these extra assessments, but they work with only so many charities. Is it practical to think that 85,000 charities can be rated? If you rate only the largest ones, are you penalizing the smaller charities?
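
To illustrate the rolling-average idea from the first point above, here is a minimal sketch with entirely hypothetical numbers (the figures and the small Python helper are my own illustration, not MoneySense’s methodology):

    # Hypothetical example: a campaign whose costs land up front but whose
    # revenue arrives over three years.
    fundraising_costs = [300_000, 50_000, 50_000]    # heavy year-1 campaign spending
    funds_raised      = [200_000, 400_000, 600_000]  # pledges convert to cash over time

    def cost_ratio(costs, revenue):
        """Fundraising cost as a percentage of money raised."""
        return 100 * sum(costs) / sum(revenue)

    # Single-year snapshots swing wildly and make year 1 look terrible.
    for year, (c, r) in enumerate(zip(fundraising_costs, funds_raised), start=1):
        print(f"Year {year}: {cost_ratio([c], [r]):.0f}% of funds raised")

    # A three-year rolling figure smooths out the timing mismatch.
    print(f"Three-year average: {cost_ratio(fundraising_costs, funds_raised):.0f}%")

In this made-up example, the single-year figures swing from 150% down to about 8%, while the three-year figure of roughly 33% better reflects what the campaign actually cost relative to what it raised.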

Standards Are the Better Way

Maybe we wouldn’t need a charity rating service if we combined the external validation that comes from standards such as those CCCC provides for Christian ministries with greater transparency from charities that make information freely and easily available on their websites. Our Accreditation assures donors that a third party has validated a charity’s compliance with a set of standards, and by posting information on a website, a charity lets donors find out for themselves whether what it is doing is effective, efficient, and what they want to support. Christian charities should speak for themselves rather than depend on rating services, posting on their websites the sort of information the public wants, including:

  • complete financial statements,
  • annual reports,
  • policies such as privacy, fundraising, executive compensation, etc.,
  • governance information such as that requested by MoneySense,
  • a logic model that supports their mission statement,
  • a definition of success, and
  • a discussion of outcomes, effectiveness and efficiency (if not covered in the annual report).

It seems to me that providing this information is something a charity would want to do, because if it is performing well, the information will be a persuasive communication to current and potential supporters. Doing the work to produce the information would also be very beneficial for the charity itself. For example, I am thinking a lot about effectiveness and efficiency at CCCC, and by getting my thinking written down, I am forcing myself to think the issues through rigorously.

A lot of the information that MoneySense wants to see is posted on our website (financial and governance info here, and some sample program evaluations here), but I haven’t posted a logic model or definition of success yet. We’re in the process of rethinking our mission statement, which will determine our logic model, definition of success, and our outcomes measurement criteria, so those will be posted once we have done that work.

So, those are my thoughts. I’m especially interested in hearing from charity staff what they think of charity rating systems and how they think they can best be transparent with the public. If you don’t like the rating systems, what do you suggest instead?
