Feedback about a feedback course


Yesterday, I had the opportunity to participate in a day-long workshop run by Feedback Labs, a D.C.-based group that helps organizations close feedback loops in their work, with the goal of more authentically engaging end users, whoever those people might be.

Before I joined, I took their organizational feedback quiz (which you can take too). The results were insightful, if not surprising: P2PU excels at dialogue and at designing good feedback practices, and we could use improvement in analyzing this feedback and course correcting. As much as I’d like to chalk this up to being inevitable in an organization of our size, I knew that there were probably some practices and habits that we could refine to better implement feedback from our community, and I was excited to learn more about this. I was also excited to have the opportunity to reflect on the values behind our evaluation process, which I wrote about last year.

P2PU’s results from Feedback Labs’ diagnostic organizational feedback quiz.

By the end of the day, I picked up both specific, implementable ideas as well as some new guiding principles to which we should aspire. My worry going into the event was that I’d leave with lots of big ideas and no plan for implementation, but luckily that wasn’t the case!

Some bigger picture thoughts I left with:

  • Sharing data that we’ve collected about people with them can be more than just a legal requirement; it’s an opportunity to say “hey, this is our very partial picture of who you are: can you tell us where we are wrong to help us better do xyz?”.
  • We should never ask anything that we don’t plan to answer or implement. People give more open feedback when they trust that it will be cared for, and in today’s “please rate your experience”-heavy world, it is an uphill battle to gain people’s trust. A logical extension of this point is that participants should be involved in co-designing whatever results from the feedback that they gave.
  • We should always aim to give more than we extract when we ask for feedback from learning circle facilitators and participants. How can our request for dialogue help them achieve their own goals?
  • During the day, I started wondering whether our role is less to gather feedback ourselves, and more to help facilitators gather the feedback that they think is important. To this point, if a facilitator doesn’t use our software tools to collect signup info, why don’t we just link from our site to whatever tool they are already using? Surely that is better than nothing.
  • When soliciting feedback, sometimes it’s helpful to speak on behalf of the institution, and sometimes it’s better to approach people as an individual. I think I should be more mindful of when I write on behalf of P2PU, and when I write as Grif, who works at P2PU. I think the difference there is potentially non-trivial.
  • We can be more patient: oftentimes, when I review survey feedback from a training workshop or a learning circle, I feel like I need to come up with a solution to the problem in the moment. No. Reviewing feedback is merely the first step. I should do my own analysis to set up a dialogue with community members about what we should do with this feedback, and only after that implement any changes.
  • It didn’t come up during the workshop, but I kept finding myself thinking of the old mantra “nothing about us without us”. It’s one of those slogans that can so easily become hackneyed but can also serve as a pretty constant paradigm to aspire towards: we can always be doing more to ensure that our work is developed by those who it affects.

And some things I want to implement immediately: 

  • Map all our existing feedback mechanisms and triage everything that isn’t providing immediate value.
  • Develop stronger hypothesis testing with our community and collectively work to solicit feedback in order to test our assumptions together.
  • Data analysis is a popular learning circle topic, but there isn’t a great open access course: we should look into developing one and using anonymized learning circle feedback as the dataset. A final project could be asking the group how they would improve some element of P2PU. Taking it a step further, how cool would it be if, instead of getting a survey from P2PU at the end of a learning circle, learning circle participants heard from a fellow learning circle participant who was studying Data Analysis?
  • Regarding the workshop itself: I found it very helpful to have a printed workbook. It was just 20 pages or so, with lots of fill-in-the-blanks. I could have easily just followed the slides on the projector and taken notes in my notebook, but I’m struck by how the workbook coerced my note-taking into something that seems far more usable 24 hours later than the scrawl that I usually leave workshops with. I think we should start outlining what a similar workbook would look like for learning circle training workshops, and maybe even learning circles themselves!

The cover image here is a photo of an installation in the New York Times Building called Moveable Type. (This is where the feedback workshop was held.) The installation algorithmically pulls headlines from the NYT’s 150-year archive to present a perspective on the news that I found to be both bizarre and meditative.

Thank you to Megan from Feedback Labs for running a high-quality workshop, and thank you to the Siegel Family Endowment for getting me there.
