The Usenet Legacy
A decade+ ago, most “online” comments were conceived and birthed in feature-rich, fat-client applications. These were tools that generally offered a rich gamut of functionality: spell-checking, automatic intelligent threading, offline composition, selective content blocks (such as plonking disliked trolls or censoring expletives), automatic notification of certain keywords or topics, alongside a wide breadth of additional capabilities.
You could read and participate in conversations on a massive array of topics, from law and order, to product support forums for a particular vendor’s database product, to the seedier side of the alt hierarchy. All using the same client application that you were comfortable with, configured just the way you liked it.
After authoring your brilliant, convincing argument (or your question about what video card to buy or how to call a certain API function) and hitting send, the application would queue it up much like an outgoing email, and when the opportunity arose (when you dialed up to your local BBS), it would send it to your local server via a standard protocol, where it would be shared with a decentralized universe of servers.
Usually your brilliant literary gem would be immediately visible to the world — limited only by the rate of propagation — though a small number of newsgroups had post moderation that required each new addition to first be approved.
The Advantage of Standards
This standardized protocol, message format, and distribution mechanism allowed for rich client functionality without reinventing it for every single newsgroup. Imagine how absurd it would have been if you had used a different set of tools, with a different set of functions, to interact with comp.sys.ibm.pc.hardware.video than you did with comp.sys.ibm.pc.hardware.sound.
Just as importantly, the standard message format and transport protocol allowed for very easy indexing and archiving — easily searchable across time and space by whichever search vendor did the best job. This is how we got the incredible functionality of DejaNews (which was later purchased by Google and rebranded as Google Groups), which managed to reach its indexing fingers back to conversations from a decade before it was even imagined.
If you do software development, you’ve probably found newsgroups to be by far the most useful resource to search when looking for answers: while a normal web search will yield thousands of noise responses and pay sites begging for money to see the answer (one they usually ripped from a Usenet newsgroup), a quick tab over to the groups will usually immediately find the archive of someone who faced a similar question or problem, and the helpful replies.
Of course Usenet is still around and very much alive, and some sites still use NNTP. Unfortunately the quantity of useful answers has been declining, or at least that’s my impression, as more and more conversations are being siphoned off into poorly structured, often unindexed islands of information.
Why is every new web app creating yet another terrible reinvention of a container for discussions? Why are we functionally stepping back 20 years with every single new forum? (See Digg, YouTube, Reddit, and others for examples of colossally broken discussion systems that people interact with despite their enormous failures, having no alternatives. There are a few, Slashdot for instance, that are moderately evolved, but it took half a decade to achieve a somewhat usable system, and even then the failings are numerous.)
Worse still, why are so many sites storing conversations and threads in isolated silos of data, stored and communicated in completely non-standardized ways? I can easily find and reference threads that I remember reading on a Usenet newsgroup 14 years ago (usually for “I told you so!” purposes), yet it’s often impossible to find a thread or comment on a modern web forum even if I remember seeing it a month ago.
This isn’t an argument for a return to the days of yore, and I’m candy-coating the history and usability of Usenet, but it does seem like a lot of people are continually reinventing the wheel, ignoring the lessons of the past.
It does seem like the value of each additional piece being added to the global solution set is being diminished or completely lost. Where once we had clearly defined domains of information, clearly delimited and indexed by topic, with a clear threading organization and metadata structure (author, date, which other comment entry it’s a reply to, and so on) that could easily be interpreted by anyone who understood the NNTP spec, now we’re at the point where search engines have to try to interpret a million variations of rendering engines, inevitably losing most context and metadata — and that’s only if they happen to crawl across the conversations in the first place.
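The metadata described above — author, date, and which entry a comment replies to — lives in standard Netnews headers, which share the classic RFC 822 header syntax, so any generic parser can recover a thread. A minimal sketch using Python’s standard-library email parser; the article text, newsgroup, and Message-IDs here are invented for illustration:

```python
from email import message_from_string

# A minimal Usenet-style article. The author, group, and
# Message-IDs below are invented examples.
article = """\
From: jane@example.com (Jane Doe)
Newsgroups: comp.sys.ibm.pc.hardware.video
Subject: Re: Which video card?
Date: Mon, 3 Mar 1997 10:15:00 GMT
Message-ID: <reply-2@example.com>
References: <root-1@example.com> <reply-1@example.com>

Get the one with more VRAM.
"""

msg = message_from_string(article)

# Author, date, and subject come straight out of standard headers --
# no per-site scraping logic required.
author = msg["From"]
subject = msg["Subject"]

# Threading: by convention the first Message-ID in References is the
# thread root, and the last is the direct parent of this article.
refs = msg["References"].split()
parent, root = refs[-1], refs[0]

print(author)   # jane@example.com (Jane Doe)
print(parent)   # <reply-1@example.com>
print(root)     # <root-1@example.com>
```

Because every article carries the same headers, one small function like this can rebuild the reply tree for any newsgroup — precisely the property the million ad-hoc rendering engines lack.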
Somehow we need to find a happy medium, taking from the past while incorporating modern technology. Perhaps a new embedded commenting structure should be an addition to Firefox 3.
The Ultimate Goal
- Standard message structure and accessibility for archiving and indexing. DejaNews provided an incredible example of the value this brought to the table.
- Standardized authoring tools and structure – a threaded discussion forum has almost exactly the same needs as every other threaded discussion forum. Users spend so much time authoring comments that it is remarkable that we haven’t long had a rich <comment></comment> HTML element as a supersized TEXTAREA, supporting all of the nuances and features shared by virtually every conversation site.
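To make the first goal concrete: the standard message structure already exists in the Usenet world, and a web comment maps onto it cleanly. A sketch using Python’s standard-library email.message — the group name, addresses, and Message-IDs are invented, and mapping a web comment onto Netnews headers this way is my illustration, not something the post prescribes:

```python
from email.message import Message

# Sketch: what a standardized comment submission could look like if
# web forums reused the existing Netnews message format instead of
# ad-hoc silos. All identifiers below are invented for illustration.
comment = Message()
comment["From"] = "alice@example.org (Alice)"
comment["Newsgroups"] = "web.forums.example"   # hypothetical group
comment["Subject"] = "Re: Reinventing discussion containers"
comment["Message-ID"] = "<comment-42@example.org>"
# Reply linkage: thread root first, direct parent last.
comment["References"] = "<thread-root@example.org> <parent-post@example.org>"
comment.set_payload("Standard headers make every comment indexable.")

# The wire form is plain text: headers, a blank line, then the body --
# trivially archivable and searchable by any indexer.
wire = comment.as_string()
print(wire)
```

Any site emitting comments in this form would get DejaNews-style archiving and cross-site search for free, because the indexer needs no knowledge of the site that produced them.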