The disadvantages of online conversations are well-known and can largely be reduced to the fallout that attends a lack of physical presence — no reliable way to convey tone, an over-aggressiveness that physical presence would automatically allay, a tendency to overreact in a setting where false impressions can only be dispelled after the person laboring under them has already written a great deal based on those impressions and invested a great deal of emotional energy in them, etc., etc. Yet the (potential) advantages are manifold as well: the ability to respond at greater length than live conversation allows, the possibility of time-lags wherein one can actually think, etc.
Less often discussed in assessments of the relative merits of online conversation are the very real disadvantages of in-person conversation, to wit: the necessity of communicating in relatively short bursts, the lack of clarity that attends impromptu formulations of ideas, the influence of physical presence in causing everyone to avoid sharp disagreement and maintain an artificial comity that keeps conversations from advancing, etc., etc. And there’s also the fact that in order to benefit from these conversations, you have to be physically present at a given time and place. In-person conversations are, in short, no utopia! They can be good, but they can also be a waste of time — or at the very least dissatisfying, as social pressures of various kinds keep people from getting to the heart of the issue.
One of the most amazing innovations to occur in recent years is the microblogging platform Twitter, which quickly became a way for academics to exchange ideas. What is so remarkable about this technology is the way that it rigorously combines the worst features of both online and in-person communication without any of their benefits — and adds new deficits of its own. On the side of online discourse, the 140-character limitation is well-known and obviously militates against clear expression and adequate conveyance of tone; the necessity of wasting characters on “@” references to one’s conversation partners only exacerbates this effect. As a result, many conversations devolve into clarification, apology, etc., much more quickly than blog comment threads — something I previously thought to be impossible.
On the side of in-person conversations, Twitter conversations are nearly impossible to follow unless one is “there” (i.e., checking Twitter very frequently) while the conversation is going on — but the innovation Twitter adds here is that even if you are “there,” you can’t “see” or “hear” everyone involved in a given conversation unless you follow them (or are permitted to see the feeds of people who have private accounts). It’s like going to the pub and trying to converse with people who are also conversing with ghosts. And if you’re reconstructing the conversation after the fact, you wind up reading it in reverse, meaning that you are likely to hit the recriminations and apologies before you get to the substance they were attempting (apparently unsuccessfully) to discuss. This is not very promising in terms of motivation to pursue the issue further.
In short, it seems to me that Twitter is rigorously the worst possible venue to attempt to have a conversation of any but the most rudimentary and logistical kind. This is understandable, given that the service was designed as an easy way to send mass text messages — for instance, about your location or plans for the evening. That it has been forced into other uses is unfortunate; that it is ill-suited for these uses is unsurprising. In a perfect world, academics would be using it to coordinate pub meet-ups at conferences, not to discuss how much OOO sucks or whatever else.
That being said, I am an avid user of Twitter for two purposes:
- General procrastination — which, if we’re going to be honest, is a necessary component of academic work of all kinds
- Isolated witticisms — essentially, I use it as an outlet for half-formed ideas that I attempt to present in a clever or striking way, to avoid the temptation to clutter the blog with posts on topics that “aren’t ready for prime time”
[Note: This post grew out of a conversation I had, in person, with Ryan Krahn.]