
Finding mission clarity in fuzzy metrics


Some would say I drew the short straw when assigned the session on selecting and tracking performance metrics. And it’s true, sort of. Looking over the topics for an upcoming seminar to which ministry CEOs and board members are invited (and where I’m one of three faculty members), I see sexier-sounding titles on the list.

But save your pity. Assessment is an organizational issue about which I feel passionate and preach often.

For the most part, the folks with whom I work (leaders in faith-based nonprofits) agree that metrics matter, and the majority do a fair job of tracking the basics. You know, things like number of clients served, dollars raised, and projects completed.

Shift the focus to impact, however, and the head scratching begins. How do we measure transformation? Life change? New perspectives? The power of the Gospel message on human hearts?

The irony of assessment and the metrics that support it is this: the factors that count most to mission fulfillment don’t fit neatly on a spreadsheet. As a result, most of what’s included on organizational dashboards and in reports to boards (important and useful as the information is) doesn’t provide a clue as to whether the ministry is fulfilling its mission.

Writing in the Stanford Social Innovation Review (Spring 2014), authors Lehn Benjamin and David Campbell offer four key considerations when developing methods for measuring outcomes that defy easy measurement. I encourage you to read the article in full. In the meantime, here’s some of what I’ll share from the article with the seminar group referenced at the beginning of this blog.

DOWN WITH THE COUNT

Honor relationships. An outcome measurement framework should take into account the pivotal role that relationship work plays in the transformation process. Establishing such relationships can be (in fact, almost always is) an end as well as a means for many ministries.

Allow variation. The ebb and flow of participants’ lives, the necessarily improvisational nature of frontline adjustment work, and the desire to facilitate creative problem-solving all point to the need for outcome measurement models that give space for variation. Identify short-term milestones that indicate whether the organization is headed in the right direction or, if off track, needs to change course ASAP.

Respect agency. This one is all about humility. Instead of treating participants as objects of intervention, an outcome evaluation framework should acknowledge the co-determination work that participants pursue in collaboration with staff members. Outcome models should account for the self-efficacy of the persons served by the ministry.

Support collaboration. To paraphrase the old adage, no organization is an island. An outcome measurement framework should focus on how a program fits into the life of a participant – not how the participant fits into the program. By recognizing that a program is one factor among many, we can better understand how a whole range of factors works (or fails to work) together.

Putting the four points into practice won’t be a snap. It’ll take intentionality and creativity. But when mission effectiveness is the goal, it’s worth the effort. As Peter Greer and Chris Horst write in their new book Mission Drift: The Unspoken Crisis Facing Leaders, Charities, and Churches (a must-read for ministry leaders):

“ . . . to achieve the full aim of your mission, you have to be deliberate in what you evaluate. Mission True organizations find a way of stating and measuring what they believe matters most, even something as ‘fuzzy’ as outstanding customer service. . . To remain on mission, we need a deeper definition of success and a more thoughtful approach to metrics. . . Metrics help us remain accountable for the work that God has placed in our hands.”

Which explains my passion and frequent preaching on the topic. Are you with me?

For more on boards and metrics, see:

There are no small data, just small users.

Donors may not care about results, but boards should