6 Questions On Measuring Creativity & Other Abstractions

by Grant Wiggins, Ed.D., Authentic Education

Oh, you can’t measure that

I hear this in almost every conversation about assessment of long-term educational goals. “Critical and creative thinking – oh, you can’t measure that.” Why are educators so quick to say things like this? On its face, the claim is a bit odd: if we can measure AP art portfolios, the quality of an Olympic gymnastics routine, or music performance in New York State through its NYSSMA competitions, then we can surely measure critical and creative thinking – or any other goal typically found in school and district mission statements.

Recently, I read a great book that might be of interest to anyone who wants to get beyond a knee-jerk reaction about what can and can’t be measured. The book makes the point from the get-go, in its title: How to Measure Anything: Finding the Value of Intangibles in Business, by Douglas Hubbard. Don’t let the “in Business” part throw you off. Almost everything in the book speaks to educational outcomes.

So, where does he begin? By tackling our prejudices:

“For those who believe something to be immeasurable, the concept of measurement – or rather the misconception of measurement – is probably the most important obstacle to overcome…. The error is to assume that measure = certainty. The mere reduction of uncertainty will suffice for [most] measurements.”

Hubbard addresses the key misconception via his definition of measurement: “Measurement: a set of observations that reduce uncertainty, where the result is expressed as a quantity.” The phrases “reduce uncertainty” and “expressed as a quantity” make clear that the point is improved precision and reduced uncertainty about how we’re doing in some important area. The goal is to get a useful measure of our key goals (critical thinking, creative thinking), not a perfect, infallible one. In fact, there are no infallible measures of anything worth measuring.

Hubbard offers a set of questions that everyone in educational reform and accountability ought to be considering when designing institutional feedback mechanisms – but usually aren’t:

  1. What are you trying to measure? What is the real meaning of the alleged ‘intangible’?
  2. Why do you care? What are the decisions that arise from this goal?
  3. How much do you know now? What ranges or probabilities represent your uncertainty about this? (What degree of uncertainty is tolerable?)
  4. What is the value of the information? What are the consequences of being wrong and the chance of being wrong?
  5. Which information would confirm or eliminate different possibilities?
  6. How do you conduct the measurement to account for various types of avoidable but common errors?

He cites many examples of people who, once they addressed these questions, discovered they had been measuring the wrong things, or measuring things far more elaborately than was worth the effort. This is arguably the current reality of assessment in education! We either oversimplify complex aims or fail to measure something that matters because we think our measures must be perfect.

The Impossible-to-Measure Problem

A core tactic in developing sound measures is to do what we always do in workshops: ask people to identify agreed-upon indicators of success or failure at the goal in question using a simple T-chart. What are examples and non-examples of the goal – critical thinking, creative thinking, persistence, etc.? What, then, are indicators of critical vs. uncritical thinking in the same situation? Of creative vs. uncreative thinking? The answers quickly make clear that the goal is measurable.

If we struggle to develop indicators and encounter people who persist in saying that something can’t be measured in quantitative terms, Hubbard offers a devilishly simple yet highly effective strategy: can you come up with your own 90% confidence interval for any “impossible-to-measure” goal? For example, if I asked you how confident you are about results from a multiple-choice test of creative thinking, you would likely retort: “Oh, you can’t measure creative thinking with such questions!”

But I persist, using his approach. I show you a 25-question test of creative thinking and ask: is there some number of wrong answers on this test that might convince you the test-taker is likely to be uncreative? You might then haltingly respond that anyone who gets “most” of them wrong probably isn’t as likely to be creative as someone who gets them all right.

What if I then ask: how many wrong answers would it take before you are 90% sure? You might say: most = more than 20. Yes, you say, if you produce weak answers to that many questions, it is highly likely that you are less creative than someone who produces clever, imaginative answers to each question. But then you are acknowledging that the test measures creative thinking in some way, to an adequate level of confidence. Again, the goal is reduced uncertainty in numerical terms, not perfection. (No perfection in measurement exists, not even on a vocabulary quiz as a test of control of vocabulary.)
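If it helps to see Hubbard’s “reduced uncertainty” point in concrete terms, here is a minimal sketch – not from his book, and using entirely made-up error rates – of how each observed wrong answer on a hypothetical 25-question test can push our confidence past a 90% threshold. The specific cutoff it prints depends wholly on the assumed rates; the point is only that observations narrow uncertainty without the test having to be perfect.

```python
# A toy illustration of "measurement as uncertainty reduction."
# All numbers are assumptions for the sake of the sketch: suppose a highly
# creative thinker answers each question wrong with probability 0.2, and a
# less creative thinker with probability 0.7. Starting from a 50/50 prior,
# how many wrong answers must we see before we are at least 90% sure the
# test-taker belongs to the "less creative" group?

from math import comb

N_QUESTIONS = 25
P_WRONG_CREATIVE = 0.2       # assumed error rate for a creative thinker
P_WRONG_LESS_CREATIVE = 0.7  # assumed error rate for a less creative thinker
PRIOR_LESS_CREATIVE = 0.5    # start maximally uncertain
TARGET_CONFIDENCE = 0.9

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k wrong answers out of n, given error rate p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

for wrong in range(N_QUESTIONS + 1):
    likelihood_creative = binomial_pmf(wrong, N_QUESTIONS, P_WRONG_CREATIVE)
    likelihood_less = binomial_pmf(wrong, N_QUESTIONS, P_WRONG_LESS_CREATIVE)
    # Bayes' rule: updated probability of "less creative" after seeing the score
    posterior_less = (likelihood_less * PRIOR_LESS_CREATIVE) / (
        likelihood_less * PRIOR_LESS_CREATIVE
        + likelihood_creative * (1 - PRIOR_LESS_CREATIVE)
    )
    if posterior_less >= TARGET_CONFIDENCE:
        print(f"{wrong} wrong answers -> {posterior_less:.0%} sure")
        break
```

With these invented rates the threshold lands at a dozen or so wrong answers; change the assumptions and the threshold moves. That is exactly Hubbard’s point: the test need not be infallible to tell us something, it only needs to shift our uncertainty by a known amount.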

Other Notes

Note, therefore, that this argument should cause us to doubt the wisdom of many current scoring schemes used on conventional tests and quizzes. In far too many cases we arbitrarily say a 60 is passing but a 59 is not – not because we validly set the cut score by examining the qualitative differences between the answers, but on the basis of utterly arbitrary arithmetic.

Be forewarned in reading the book: once you get into the middle and late chapters, a comfort level with statistical methods is required to follow his arguments and examples. But the book pays dividends for those readers willing to persist. Highly recommended for anyone with a responsibility for educational results and/or measuring them. (Hint: that’s all of us.)

Here’s a link on the publisher’s site that gives you a free sample of the opening chapter:

http://media.wiley.com/product_data/excerpt/99/04705393/0470539399.pdf

Image attribution: flickr user vancouverfilmschool. This article first appeared on Grant’s personal blog; follow Grant on Twitter.