Societally engaged universities – like the members of the Aurora network – put great value on the extent to which the research produced by their academics is not only academically excellent but also societally relevant and impactful. But how can a university – or a ministry on behalf of the taxpayer – make sure that this societal impact indeed does exist?
Universities always tend to look for some kind of system, a mechanism that ends up with a simple arithmetic number that lends itself to comparison – and governments do so even more. But any such mechanism, any system, is basically an invitation to ‘gaming’ and inevitably leads to perversion: it evokes not the desired behaviour but undesired behaviour that nevertheless satisfies all the system’s requirements.
With the societal impact and relevance of research, the problem is even harder because of two inherent complications of research impact: time and attribution. In plain English: by the time practice in society has really and significantly changed, it is hard to attribute specific portions of that change to individual research achievements.
Let’s return to the gaming and perversion aspect of the problem: a mechanistic solution of giving ‘impact points’ doesn’t work, but apparently we also can’t do without one.
Why not try a combination of perception and facts – following in the footsteps of Robert Pirsig, who in “Zen and the Art of Motorcycle Maintenance” tried to bridge the gulf between ‘classical’ and ‘romantic’ outlooks on life and the world?
Why not say that we accept there is a societal impact of research if:
- the people out there (public, media, politicians) believe that there is.
Our marketers have been telling us for years now that this is the way to go: find captivating stories of research that are visual enough to make it to the Children’s News. Don’t fret that this may fail to disclose all the fabulous research findings that are less easily transformed into simple images and compelling soundbites.
Because it is only one of the requirements.
- the facts confirm the people’s belief.
Facts are never conclusive, and they may be manipulated or carefully chosen. But research universities that take themselves seriously need to make a serious effort to collect all the data and benchmarking information that help to corroborate (or falsify) the identification of societally impactful research.
And the proof of the pudding is in the third requirement.
- we ourselves truly and honestly believe it.
Self-respecting research universities need to muster sufficient self-critical and self-reflective capacity to be able to tell – and say out loud – when research isn’t really that good, even when the public cheers it and the (manipulated) data sing its praises.
So that’s my note for this week: a suggestion to build the narrative of ‘societal impact and relevance of research’ from these three ingredients – two of perception, one of fact.
And that is a fact!