The Art of Prioritizing What (and Who) Matters in Evaluation

Here we are—email four in our Evaluation Design series. We’ve got two more design-focused topics coming your way before we shift into the nitty-gritty of data collection: strategies, tools, and what to do when things don’t go as planned.

This summer, much like the rest of the year, has been full of contradictions. Just last week, my family and I traveled to South Carolina to visit relatives I hadn’t seen in years. The garden is lush, the kids are home from school, and we’re squeezing every bit of joy out of long days and late bedtimes. And yet, back home in Minnesota, we’re mourning a tragic act of political violence. That kind of emotional whiplash is becoming all too familiar—the coexistence of beauty and heartbreak, growth and grief. If you’re feeling that too, know that I’m holding space for you.

Now on to today’s topic: prioritization.

So far, we’ve talked about what evaluation questions are, how to define them clearly, and how to design them collaboratively. But what happens when you have too many questions? It’s easy to get excited and want to explore everything—but if you try to learn everything, you often end up learning nothing well.

Let’s get practical.

Step 1: Revisit your list.
You might have three questions. Or fifteen. Or thirty. No shame—this is where many of us start.

Step 2: Filter by accountability.
Ask: Which of these questions are you required to answer, whether for funders, your board, or other partners?

Step 3: Filter by decision-making.
Ask: Which of these questions will actually inform a decision your organization needs to make?
Now take a look—what’s left?

Step 4: Look for overlap.
Some questions may be addressed (at least partially) through others. Are there sub-questions that can be rolled into bigger ones? Can one strong question carry more weight with a few added prompts or measures?

This should get you to a manageable 3–5 core questions.

Still have more than that? No problem. Think in phases or sprints. Design your evaluation efforts in 6-month windows. What can you answer now, and what can wait?

And don’t do this alone.
Prioritization shouldn’t happen in a vacuum. Involve your team, your partners, and—when possible—your community. Different stakeholders bring different definitions of what matters. The best evaluations reflect that diversity.

You’ve got this. And if you don’t want to do it alone, you don’t have to. I offer coaching packages that span 6, 9, or 12 months—think of it as having a thought partner in your back pocket to help you sort through priorities, guide your process, and build internal capacity as you go. If that sounds like what you need, get in touch here.

Thanks for being here. If you found this helpful, please consider forwarding it to someone else who’s in the weeds of evaluation design right now. And if you’ve got a tricky prioritization question or want feedback on your evaluation questions, hit reply—I’d love to hear from you.

Until next time,

P.S. If you missed the earlier issues in this series, here’s what we’ve covered so far: what evaluation questions are, how to define them clearly, and how to design them collaboratively.

📩 Reply to this email or reach out to learn more!
Organizations partner with me to provide a comprehensive measure of their community impact and demystify the program evaluation process. In my newsletter, I share practical tips for enhancing your evaluation practice.