We live in the age of the present participle. Twittering introduces a new mode of writing, in which I am doing something. The Facebook 'status update' had a mildly static element to it - 'I am on holiday' - but has been replaced by the irritatingly invasive invitation to report 'what's on your mind?'. In the brewing anxiety about where this is all leading, nobody has yet begun a campaign to save the past participle. Will today's children be able to either write or read novels? Not without the past participle they won't.
In a sociological chain that even he can't entirely understand, Nigel Thrift writes of how the collapse of fixed currency exchange rates in the 1970s led to the rise of the present participle in capitalism. Capitalism is now constant becoming, striving, arriving, nearly departing. A waste collection van will now have the words "making London cleaner" plastered down the side. Policies are all 'making Britain' into something else. Twitter might be successfully implemented here, so as to reveal this becoming in real time.
But what of getting something done? I recently happened to eavesdrop on two workplace discussions, as follows:
[Cafe, Bloomsbury. Two mobile knowledge workers are discussing a PowerPoint presentation they are to give to a potential client. One appears to be mildly more senior than the other]
"he's not going to read this properly... the main thing is he realises that we've thought about it"
"shall we put a link in?"
"no... but the follow-up could be this - which is the link."
"transaction to interaction - DASH... I love the wording. The formatting needs work. But they must be able to see this, it's obvious... I think they will see it."
"Couple of concluding points... we believe we are the user experience, permeating a lot of other clusters"
"by fuelling... by supplying... (by expanding the access?) technologies with data - something like that? By creating reusable web resources... "
[Seating in publicly-funded cultural institution. Two members of staff are speaking loudly next to me]
"OK, well I'm glad we had this conversation."
"The value is in the use..."
"OK, well glad we had this conversation"
"I agree. I just thought..."
"I don't know. I think it's good that we discuss this"
"Yes. We should"
"Well I'm glad we discussed this. I just thought."
Contemporary work involves a desperate striving to put things in the past, to make them complete. In the first example, machismo and self-assurance are used to convince the practitioners that their task is becoming complete, but evidently this is sheer bravado. Such a task need never be complete. In the second, the conversation could go round and round in circles until I punched one of them. Only unseemly Taylorist power strategies - "time to stop talking!" - can save such employees from the tyranny of the present participle.
The problem is that language and grammar are both the form and the content of so much 21st century industry. We struggle to recognise the distinction between the means and the ends, the production method and the product. So we just keep on talking.
Throw ubiquitous real-time networks into the mix, and we could soon be completely fucked. Our capacity to produce a product is being dissolved in a miasma of producing. Why do the designers of technology not care? Why is the critical economic distinction between action and outcome, subject and object, not being defended by our digital architects? Are they all too busy twittering to notice? Allow me to set the ball rolling with some principles:
- Asynchronicity is a feature: Cory Doctorow's advice on 'writing in the age of distraction' is worth reading for anyone suffocated by the present participle. On reading it I removed three different forms of 'alert' from my desktop, each of which potentially broke my concentration. Previously I had also cancelled email alerts from Facebook. The book Distracted contains some frightening data about how switching between different forms of communication weakens our capacity to engage with any of them intelligently. So if synchronicity is harmful to productivity, why is nobody examining ways of deliberately designing asynchronicity into new digital technologies? Why can I not opt out of the synchronous aspects of Facebook, say, but retain the events and information services? A particular form of Taylorism is required here, simply to cut up chunks of working time. But current trends seem to be all about the reverse.
- Eradication is a feature: Gmail's most brilliant feature is 'archive', which equates to 'done' (or 'not my problem'). The most important consequence of limitless data storage is the greater capacity to remove data. Got rid, deleted, removed, archived, eliminated. These are satisfying and important past participles. They signal the mortality of data in an age of immortal projects. Let's produce information, nurture it, then murder it. Another geek tip: Danny O'Brien discovered that most geeks organise their work using a lowly text file with a to-do list on it. I do exactly this for my own work, precisely so I can delete stuff from it when it's over (the key is to frame the tasks in ways that invite a past participle, e.g. 'go to dentist', not 'deal with dental hygiene'). This is very different from a Chris Anderson ideology, in which the achievement of our age is to have everything, everywhere, all the time. Let's have more murder weapons built into our computers.
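For the geeks in the audience, the lowly text-file to-do list is simple enough to sketch in a few lines of Python. This is only an illustration of the workflow described above, not anyone's actual tool; the filename and function names are my own invention. The point is in the `done` function: completion means deletion, not archiving.

```python
# A minimal sketch of the text-file to-do list: tasks framed so they
# can be finished, and murdered (deleted) once they are.
from pathlib import Path

TODO = Path("todo.txt")  # hypothetical filename, my own choice

def add(task: str) -> None:
    """Append a task, worded to invite a past participle ('go to dentist')."""
    with TODO.open("a") as f:
        f.write(task + "\n")

def done(task: str) -> None:
    """The murder weapon: remove the finished task's line entirely."""
    lines = TODO.read_text().splitlines()
    TODO.write_text("\n".join(l for l in lines if l != task) + "\n")

add("go to dentist")   # completable; not 'deal with dental hygiene'
done("go to dentist")  # gone - past participle achieved
```

Note the design choice: there is no 'archive' folder and no history. The task either exists or it has been eradicated.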
Marx viewed capital as a vampire - the dead labour of previous generations, rising from its grave to oppress the living. The problem with language in the age of infinite computing is that it never dies at all. We never leave it behind, mourn it or escape from it. Instead, we have to limit it and kill it. In the analogue age, this was unpopular for various reasons. Limiting access sounds like censorship; killing information sounds like book-burning. But failure to come up with a new critique of digital communication means we can get stuck rearranging fonts on PowerPoint slides, agreeing that we need to have the conversation we're currently having, and twittering about twitter.