Being measured and reasonable is no way to make serious money if you're a public intellectual, but Chris Anderson's recent article 'The End of Theory' is so exaggerated as to become a monument worthy of analysis.
Anderson believes that the presence of unprecedented quantities of data on everything imaginable, combined with Google's ability to identify frequent internal linkages within that data, fundamentally alters the nature of cognition. From now on, there is no requirement for hypotheses, speculation or theory regarding the world, given that Google will answer all questions on the basis of commonalities of data. "Correlation supersedes causation," he argues, "and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all." His gravest threat relates to social science:
Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
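Before going further, it is worth seeing how little 'the numbers speak for themselves'. Any two quantities that happen to trend in the same direction will correlate almost perfectly, however causally unrelated they are. A minimal sketch, using two entirely hypothetical (made-up) series and plain Python:

```python
# Two unrelated quantities that both happen to trend upward over time.
# Neither explains the other, yet they correlate almost perfectly.
years = range(10)
cheese_consumption = [20 + 1.5 * t for t in years]  # hypothetical series
phd_awards = [100 + 7 * t for t in years]           # hypothetical series

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Prints a correlation of ~1.0: perfect correlation, zero causation.
print(pearson(cheese_consumption, phd_awards))
```

Without a model of the mechanism, the data cannot tell you whether the correlation is meaningful or an artefact of two things merely drifting together - which is exactly the work 'theory' does.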
There is so much wrong with this, intellectually and politically, that it's difficult to know where to start. One thing is worth noting at the outset, namely that Anderson is at least performatively consistent with his own argument - like Google, he clearly 'knows' very little about the various phenomena he discusses, but simply notes that many of them happen to cluster together. Yes, it is true that sociologists and philosophers speak to each other, share certain categories and concerns, dabble in each other's territory. But this doesn't mean they can be lumped together as 'theory'. Yes, it is true that the philosophy of science and the history of science are informed by each other, and occasionally merge, as in the work of Ian Hacking or, arguably, Hegel. But this doesn't mean they are an integrated thing that can be lazily titled 'theory'.
It is on these misunderstandings that Anderson's argument depends. Crucially, Anderson misses the difference between those engaged in the Kantian epistemological project of seeking to understand how knowledge is possible as a cognitive artifice, and those engaged in the Weberian sociological project of seeking to understand how knowledge is handled using technological artifices. (A Latourian may well argue that we can do without this distinction, but you'd never know if they were joking or not). Ignorant of this distinction, Anderson comes to view epistemological and bureaucratic questions as interchangeable. He spots the fact that both are concerned with the artificial nature of reality, and that correlation is as much confirmation as he requires.
With this elision complete, 'theory' comes to appear dispensable, a 20th century relic not dissimilar to the fax machine, say. In our post-theory future, Google's mathematical algorithms perform the function that Kantian epistemology had granted cognition, namely of creating synthetic unities out of an overwhelming mass of sense data.
Of course this is nonsensical. Theory never ceases, because it is existential not technological. It resides in the fact of self-consciousness and the distinctly human ability (or desire) to view the world in a non-immediate, non-idiotic fashion. We have no option to abandon theory, only the option to pretend as much, as Anderson advocates here, and this is where the politics comes in.
I would suggest that Anderson is extending the Chicago School project of selectively dismantling the bases for authoritative knowledge claims. Chicago economics renders social knowledge so fragile and polluted with self-interest that it becomes impossible to produce a better model for society than that of the unimpeded market. Again, there is a sleight of hand at work here - man's epistemological condition leads not just contingently but logically to the technological solution of the market. Readers of the Becker-Posner blog may note how often they get entangled in philosophical questions, and immediately reach for either technical or common-sense answers. I guess it's partly an outcrop of American pragmatism. And why is this a 'selective' dismantling? Because it doesn't dismantle the authority of the economist, nor of the aggregate knowledge contained in the market, but attacks all other forms of centralised expertise.
For Anderson it is not the market that comes to our rescue, but the world wide web. What the market can do for material resources, the web can do for knowledge. In each case, we are relieved of the political and theoretical burden of trying to produce a good, coherent model for society, and put ourselves in the hands of an ignorant, amoral mechanism - price in the case of material resources, algorithm in the case of immaterial ones. Obviously there are borderline cases between the two, which is why copyright is such a controversial issue for neo-liberals. But the myth of both Chicago School price theory and Anderson's algorithm theory is that the problems of political theory needn't be solved so much as evaporated, thanks to tools of knowledge aggregation.
The problem for both price theorists and (what I now term) algorithm theorists is that people are incapable of becoming as stupid and apolitical as these strategies require. Neo-liberal economists rage against the fact that the market is constantly distorted by the state, and is thereby prevented from confirming the claims they make for it. Likewise, the neo-liberal technologists will no doubt rage against the fact that Google is constantly distorted by professional scientists, who attempt to stipulate the facts, drawing on bogus authority.
The analogy is a helpful one for exploring the threat Anderson poses. For instance, what would the equivalent of 'shock therapy' be for algorithm theorists? Perhaps developing countries would be advised to bulldoze all of their universities in one go, so as to move immediately to a Google-mediated society.
Finally, markets and Google both suffer from one severe defect: they are highly effective at identifying two separate things as the same, but very bad at specifying the difference between them. A market can tell you how many apples 'equals' one newspaper, but has nothing to say about the difference between biting into a Braeburn and reading the news. Similarly, Google can unify a book-seller, a beer and a deceased Labour leader under the single category of 'John Smith', but where the differences become more subtle, it offers little assistance. Anderson may be right, if he is arguing that we have less work to do nowadays in performing acts of identification; but if he has forgotten that intelligence consists also in differentiation, then he is not just a bad social scientist and philosopher, but a sociopath. Then again, fifty years ago, people probably thought the same about Gary Becker...
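The point about identification versus differentiation can be put in code. A toy inverted index (invented documents, nothing like Google's actual machinery) happily files three unrelated entities under one key, and nothing in the structure records what makes them different:

```python
# A toy inverted index: excellent at identifying matches,
# silent on the differences between the things it matches.
from collections import defaultdict

documents = {
    1: "John Smith, antiquarian book-seller, Glasgow",
    2: "John Smith, brewer of a Yorkshire bitter",
    3: "John Smith, leader of the Labour Party 1992-1994",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().replace(",", " ").split():
        index[word].add(doc_id)

# The query identifies all three as 'the same' John Smith...
print(index["john"] & index["smith"])  # → {1, 2, 3}
# ...but the index has no vocabulary for telling them apart.
```

The aggregation mechanism performs the act of identification effortlessly; the act of differentiation is left entirely to the human on the other side of the search box.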
I should first admit my vested interest: I'm a fully-paid up physicist by trade.
One issue I always have with these arguments is their mechanisms for refuting current scientific practice. Yes - the LHC is expensive and doesn't generate much for the marketplace; yes - genome research is expensive and doesn't generate much either (my guess). But there are huge tranches of research that do - semiconductor research gives the theory and the recipes for faster CPUs for your laptops, blue LEDs for every electronic device in your car/home/office, GMR storage for every iPod out there. I expect a similar situation could be painted for drug research. You cannot 'rip and replace' these classic research activities and try to correlate the best nanometre-thick insulator for silicon (hafnium-based - how would that get correlated?). You still need theory in order to develop the insight.
Posted by: MarkD | July 08, 2008 at 01:58 PM
The quintessential Chris Anderson phrase in the Wired piece is, 'The Petabyte age is different, because more is different.' Only he does it without the comma, as if to emphasise its utter meaninglessness. Then again, he thinks that computers are good at translating languages; so who needs meaning anyway?
Posted by: Kate | July 09, 2008 at 10:47 AM
Quite. But as I say - if somebody makes an argument that stupid is the new smart, it's hardly all that surprising when they linger in stupidity.
Posted by: Will Davies | July 09, 2008 at 11:18 AM
Interesting article. On your last para:
"Finally, markets and Google both suffer from one severe defect: they are highly effective at identifying two separate things as the same, but very bad at specifying the difference between them."
You give the example of distinguishing price and experience (markets) & matching and meaning (Google). The market will never be able to do the former (or at least only to the degree that people value the experience differently), but Google are getting better at the latter. E.g. search for 'Turkey', and ten months of the year in the US and UK, you'll get results back about the country. But for two months of the year, you'll get recipes for cooking poultry. The matching is the same, but the meaning is different. Similarly with spelling mistakes - it ignores the matching, and tries to find the meaning. Their system is hardly perfect, but it is getting better.
Posted by: Richard | July 11, 2008 at 07:57 AM
"...people are incapable of becoming as stupid...as these strategies require."
ROTFLMAO
Are you sure? Have you been out in the world recently?
Posted by: pete | July 16, 2008 at 07:27 PM
Pete - how do you imagine post-theoretical, post-political humanity? I can't imagine it at all. In my view, Anderson et al are just further examples of our inability to relinquish theory and politics.
Posted by: Will Davies | July 17, 2008 at 12:22 AM