Damn Sociology II: The Contradiction of Neoliberal Statistics, Research Time, 24/03/2015

Continued from last post . . . There is a second level of depth in Road to Serfdom’s rejection of sociology and the corresponding rejection of sociological knowledge in new liberal politics generally. It has to do with how particular kinds of knowledge gathering affect how you understand the world and humanity generally. Or at least, how you presume such knowledge to affect sociologists.

Sociology as a scientific discipline emerged from the creation of statistical mathematics, and from the application of that mathematics to the massive stores of data on citizens that European states had been gathering since the first government population survey, conducted in Sweden in 1749. Statistical analysis discovers general trends of behaviour or social activity, evidence for what the population wants and needs.

[Image caption: The Harper government's original reason for removing the mandatory long-form census was that its questions invaded the privacy of Canadian citizens. Apparently, that duty is for the security services, not StatsCan.]
Extrapolating a population’s tendencies for wants and needs, their desires, from statistical data is the cardinal affront to individual liberty, from which sociology can’t be redeemed. This is my theatrical way of expressing the irreparable hostility right-wing neoliberal politicians seem to have for sociology. Road to Serfdom is the expression in political rhetoric of the revival in liberal philosophy that developed from the Austrian School’s ideas in economics.*

* I’m indebted to Steve Fuller for filling in some of the ideas behind the hostility of Austrian School economists to statistics in sociology and statistical analysis in economic modelling of desire. There was a brief mention of Hayek in his new book Knowledge, but asking him about it directly for our long-running review/dialogue would be too much of a divergence.

Fuller’s interpretation of the Austrian School conception of statistics runs like this. A person’s preferences are unique to them, so each person’s reasons for performing the same action as another person are inevitably different. That divergence can’t be expressed in a quantitative measure of the action that several individuals performed. The sociologist only measures that everyone performed the same action.

Everyone’s reasons for coming to the same outcome never reach the quantitative sociologist. So the sociologist would see a false commonality and uniformity of thought, where there was really just the momentary convergence of irreconcilably different individuals. 

Worse, the sociologist imparts a kind of collective consciousness to the group of individuals. They perceive a false necessity in the organic movement of a culture instead of the true contingent agreement of unique individuals. That’s why sociologists advocate for state policies that manage a population of individuals on the presumption that their convergence in action indicates unity in everyday thought. Sociologically informed policy therefore results in government activity that ignores the individuality, and therefore the basic liberty, of the population.

[Image caption: An elderly Friedrich Hayek, fighting the same fights he did fifty years before.]
Now anyone who knows anything about the current theoretical mainstream of sociology knows that no one seriously believes that a measured trend in opinion is the expression of a collective consciousness. That’s ridiculous.

Nonetheless, it fits the strange intellectual heritage of popular libertarian thinking, where an essentialist conception of a phenomenon from its founding era is carried into the present. Hayek’s was a time of world-shattering conflicts with militarized collectivist states, the end of an era when societies were popularly conceived much more organically than we now know them to be. Libertarians are fighting a battle that they’ve already won.

What’s more, the dehumanizing power of statistical knowledge is more frequently used in institutions that are run entirely by new liberal political principles, where public institutions are run as if they were private industries. Take the university system, for example.

I came across an article in the London Review of Books last week, where Marina Warner, an instructor at the University of Essex, described how the management priorities of her institution were being shaped by business norms that were improper to education. Much of this was similar to what we experience here in North America. 

She describes a university bureaucracy whose priorities were set by administrators who did not and had not ever worked as researchers or instructors themselves. Professors were informed that they were to spend their time, not writing books or articles or actually carrying out the research to do so, but writing grant proposals to public or private institutions to fund their research. They had to fill out timesheets to be submitted to administration, attesting that they were actually doing research.

The biggest problem for the university sector in the UK, according to Warner’s piece, is the means by which researchers’ performance is evaluated in the first place. The Research Excellence Framework (REF) is a statistical guide to account for research impact. Nothing wrong on the face of it, but it measures only those impacts that occur within academic disciplines, when humanities work is most impactful in its direct popular reception. Like Hayek’s Road to Serfdom.

[Image caption: The Albert Sloman Library at the University of Essex.]
Research products in the humanities are most driven by an individual’s identity. A humanities scholar negotiates a field of study and concurrent research done by others, advancing ideas that they develop individually. It’s where research accords most with individual expression. The best products of humanities research are books and essays that develop perspectives and interpretations which are stimulating and informative to their readers, but which would have been completely different if someone else had written them. 

Warner gives the example of a literature scholar who splits what would have been a book about Shakespeare, Blake, or Moore into four academic journal articles to satisfy immediately the demands of REF impact assessment, which concentrates on the number of research products and their placement in professional, paywalled publication venues. The book could have achieved a genuine social impact and popular reception, but it is devalued precisely because of that greater popular accessibility.

Grant money received is also a measure of prestige in the REF, though humanities research rarely receives large grants. The scientific research wing of a military contract could win an enormous grant because such research is extremely expensive. A humanities research project requires only a library and time. It doesn’t need the money, but it’s devalued on official impact metrics because it doesn’t receive money it could never justify spending in the first place.

University research is being removed from the public interest and literally dehumanized because its institutional priorities have been reshaped along a model of the private sector. It was private sector norms, not state governance and support, that ushered in a regime that marginalizes and delegitimizes the most singularly individual research in the name of scoring high on statistical impact frameworks. 

But the new liberal right wing whose ideas dominate so much of our political conversation doesn’t perceive this hypocrisy. There’s an even stranger paradox in new liberal thinking, which I discovered reading Hayek, and of which this hypocrisy is an expression: the neoliberal outrage at the idea that the production of knowledge can itself be understood sociologically. To be continued . . .
