Design-driven evaluation and the mindset of innovation

Design-driven evaluation views the evaluative act as part of the service offering to innovation, not just a means of assessing its outputs, processes, and outcomes. It’s not a shift in method, but in mindset — and it can make all the difference.

Evaluation is the innovator’s secret advantage. Any sustained attempt to innovate is driven by good data and systems to make sense of that data. Some systems are better than others and sometimes the data collected is not particularly great, but look at any organization that consistently develops new products and services that are useful and attractive and you’ll see some commitment to evaluation.

Innovation involves producing something that adds new value, and evaluation is the means of assessing what that value is. Design-driven evaluation takes that process one step further, viewing data collection, sensemaking, decision-making, and action as part of the value chain of a service or product line.

It’s not a new way of evaluating things; it’s a new mindset for how we understand the utility of evaluation and its role in supporting sustained innovation and a culture of innovation within an organization. It does this by viewing evaluation as a product in its own right and as a service to the organization.

In both cases, the way we approach this kind of evaluation is the way we would approach designing a product and a service. It’s both.

What does an evaluation of something produce? What is the product?

The simple, traditional answer is that an evaluation generates material for presentations or reports based on the merit, worth, and significance of what is being evaluated. A utilization-focused or developmental evaluation might suggest that the product is data that can be used to make decisions and learn.

Design-driven evaluation can do both, but extends our understanding of what the product is. The evaluation itself — the process of paying attention to what is happening in the development and use of a product or service, selecting what is most useful and meaningful, and collecting data on those activities and outcomes — has distinctive value on its own.

Viewed as a product, an evaluation can serve as part of the innovation itself. Consider the tools that we use to generate many of our innovations, from Sharpie markers and Post-it Notes to whiteboards and wheelie chairs, to software like Figma or Adobe Illustrator, to the MacBook Pro or HP Envy PC that we type on. The best tools are designed to serve the creative process. There are many markers, computers, software packages, and platforms, but the ones we choose are the ones that serve our purpose well for what we need and what we enjoy (and that includes factoring in constraints) — they are well-designed. Why should an evaluation — a tool in the service of innovation — be any different?

Just as the reams of sticky notes filled with ideas serve as a product of the process of designing something new (innovating), so can an evaluation serve this same function.

These products are not just functional; they are stable and often carry a positive emotional appeal (e.g., they look good, feel good, and help you feel good). Exceptional products do this while being sustainable, accessible, (usually) affordable, and culturally and environmentally sensitive to the environments in which they are deployed. The best products combine it all.

Evaluations can do this. A design-driven evaluation generates something that is not only useful and used, but attractive. It invites conversation and use, and it showcases what is done in the service of creating an innovation by design.

The principles of good product design — designing for use, attraction, interaction, and satisfaction — are applied to an evaluation using this approach. This means selecting methods and tools that fit this function and aesthetic (and don’t divorce the two). It means treating the evaluation design and what it generates (e.g., data) as a product.

The other role of a design-driven evaluation is to treat it as a service and, thus, to design it as such.

Service design is a distinct area of practice within the field of design that focuses on creating optimal experiences through service.

Designers Marc Stickdorn and Jakob Schneider suggest that service design should be guided by five basic principles: it should be user-centred (experienced through the user’s eyes), co-creative (involving all stakeholders), sequenced (visualized as a series of interrelated actions), evidenced (made tangible through physical artifacts), and holistic (considering the entire environment of the service).

If we consider these principles in the scope of an evaluation, what we’ll see is something very different from just a report or presentation. Designing evaluation as a service means making a more concerted effort to identify present and potential future uses of an evaluation, understanding the user at the outset, and designing for their needs, abilities, and preferences.

It also involves considering how evaluation can integrate into or complement existing service offerings. For innovators, it positions evaluation as part of the process of making innovation happen and as a means of making that process better and more useful.

This goes beyond A/B testing or other forms of ‘testing’ innovations to position evaluation as a service to those who are innovating. In developmental evaluations, this means designing evaluation activities — from data collection through to the synthesis, sensemaking, application, and re-design efforts of a program — as a service to the innovation itself.

Design-driven evaluation requires the mindset of an innovator and designer with the discipline of an evaluator. It is a way of approaching evaluation differently, one that goes beyond simple use to true service. Nor is it an approach that is needed for every evaluation. But if you want to generate better use of evaluation results, contribute to better innovations and decision-making, and generate real learning (not just learning artifacts), then a mindset that treats evaluation with the same care and attention that goes into all the other products and services we engage with matters a great deal.

If we want better, more useful evaluations (and evaluation designs), we need to think and act like designers.

Photo credits: Cameron Norman, Mark Rabe on Unsplash, and Carli Jeen on Unsplash.

Cameron Norman is a designer, psychologist, educator, and strategist focused on innovation in human systems. He is the Principal and President of Cense Ltd., a human services design consultancy that focuses on learning and impact through innovation and evaluation.
