In 2020, Luminate commissioned the Center for Effective Philanthropy (CEP) to conduct the Grantee Perception Survey with our partners. The findings and key action items are included in this blog from our CEO, Stephen King. In the spirit of transparency and accountability, we also published our full Grantee Perception Report (GPR), as well as the memo of key findings and recommendations.
In an effort to pull back the curtain and share some lessons learned, we’ve compiled our reflections from this process. We imagine this blog will be of interest to any funder seeking feedback from partners or considering the GPR in the future, as well as to civil society organisations that invest time in responding to such surveys.
How Luminate approached the GPR:
- After receiving the results, we invited CEP to facilitate several of our internal meetings to ensure we were interpreting the findings properly. As a team, we had a tendency to “over-interpret” the findings (for example, to perceive major differences between our regional teams where the differences weren’t statistically significant), so CEP helped us rein in that impulse.
- We ensured that every team member could understand and internalise the findings. To do so, we conducted substantive deep-dives into the GPR, both in discussions with our full team and Senior Management Team and in smaller workshops.
- Our director of learning & impact facilitated workshops in which teams reflected on surprises, takeaways, remaining questions, and tensions, and created action plans covering what they would stop, start, and continue doing, as well as how to communicate the findings to partners.
What we’d like funders to know about our lessons learned on the GPR:
- Valuable, with buy-in: This type of feedback is extremely valuable, both as a baseline for a relatively new organisation and as a check on progress for a more mature one. Having leadership and team buy-in, however, is critical. We were lucky to have a highly engaged Senior Management Team, as well as supportive colleagues eager to learn, which made all the difference.
- Use partners’ languages: Investing in translation is important. For example, we ensured that our Latin America partners could read and respond in Spanish and Portuguese. 95% of these organisations took advantage of this option, which strongly suggests that English-only newsletters, proposals, and surveys prevent partners from engaging in their preferred language.
- Cater to your curiosity: Adding customised questions tailored to your interests can be valuable. This year, we were particularly interested in learning more about partners’ thoughts on diversity, equity, and inclusion, and about how they feel about non-monetary support. There are nuggets in these responses that we wouldn’t have received had we used only the standard survey.
- Carve out time: CEP delivers 100+ pages of quantitative and qualitative data. It takes time to process, so make sure you and your team set aside time to read all of the comments, suggestions, and rankings that your partners have invested precious time in submitting.
- Triangulate your data: We benefited from human-centred research conducted by Simply Secure in 2018, which provided a lot of qualitative data on what it was like to work with our team. It was powerful to view the two sets of findings alongside one another to discern trends and to underscore major areas for improvement.
What we’d like CSOs (especially our partners) to know about our lessons learned on the GPR:
- Your responses are safe: This process truly is confidential. We cannot see how individual organisations responded, or even which organisations responded. Identifying information is deleted from comments, and CEP does not provide data analysis for any regional or programmatic team with fewer than six respondents.
- Your responses are met with interest and curiosity: There were some surprises, as well as tensions, in the rankings and comments we received. For instance, we understand that our proposal and decision-making process can feel long and burdensome, but partners also ranked the process as more helpful than most funders’ processes, and reported receiving more funding per hour invested. The written comments are particularly memorable, and many of us keep referring back to quotes from our partners.
- Your high response rate enables this feedback to be representative: We were thrilled to have a 71% response rate, higher than CEP’s typical response rate (65%) and above the threshold CEP considers representative (>50%). We thank our partners for participating at a rate that allows us to treat the findings as a true representation of our partners, rather than the views of just a small slice of them.
- Your responses are valuable: We truly listened and continue to refer back to your comments, suggestions, and rankings, and we are acting on them. Thank you.
- We appreciate ongoing dialogue: The GPR provided a very rich snapshot of this moment in time, but we hope that our partners will feel comfortable continuing to let us know what’s working and what isn’t in our partnership. Please share any questions, concerns, or comments with your Funding Leads.
We believe feedback is a gift, and we value the insights and rankings our partners provided. We will likely commission this kind of research again in two to three years, and we hope in the meantime that all of our partners continue to share their thoughts on how we can build stronger relationships and help support positive and lasting impact toward just and fair societies.