Thursday, April 24, 2008

The information can find us...

but is it what we want?

Michael Wesch gave an amazing keynote at the NITLE conference in early April. He's an Assistant Professor of Cultural Anthropology at Kansas State University, but has become somewhat well known outside of academia for a series of YouTube videos describing both information in the Web 2.0 world and the students who live there. See:

Web 2.0 ... The Machine is Us/ing Us
Information R/evolution
A Vision of Students Today

He is simply an amazing speaker, one of the few who can truly make you sit up, watch an entire PowerPoint, and enjoy it. These skills clearly translate into classroom popularity as well: his courses have long waiting lists, and he makes students apply for his upper-division courses. He's thought deeply about the way that Web 2.0 applications change some of the information dynamics in modern education: information is not hard to find anymore. Information does not need to be indexed by experts. Information and its physical form have no relation. Information can even find us, in the form of RSS feeds and bots.

Fascinating stuff. But step back and look a bit more: I think the experts might have the trump card after all.

Yes, information is trivial to find. Google has seen to that. Information can find us: RSS, Atom and the rest can feed us a constant stream of info on everything from current politics to fly fishing. The public as a whole will index the material, not experts. But is that really new?
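As an aside, the feeds doing that delivery are nothing mysterious: an RSS feed is just an XML document, and pulling the headlines out of one takes a few lines of Python. Here's a minimal sketch using only the standard library, with a made-up feed inlined as a string (the titles and example.com links are invented for illustration; real code would fetch the document from a feed's URL first):

```python
import xml.etree.ElementTree as ET

# A tiny inline RSS 2.0 document standing in for a real feed.
# Titles and links are made up for illustration.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Stories in the News</title>
    <item>
      <title>Concrete fatigue and bridge inspections</title>
      <link>http://example.com/concrete</link>
    </item>
    <item>
      <title>Lead paint found in imported toys</title>
      <link>http://example.com/lead</link>
    </item>
  </channel>
</rss>"""

def extract_items(rss_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in extract_items(SAMPLE_FEED):
    print(title, "->", link)
```

An aggregator just runs something like this over a list of feed URLs on a timer and merges the results into one page, which is all "information finding us" really amounts to under the hood.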

Even in the Web 2.0 world, information still has a place. It's not sitting on a bookshelf; it's sitting on a spinning hard disk in a server farm somewhere. It still has an address: not a call number, but a URL. The public can create links to it and index it for searching, but that's not really new: back in the good old days of Paper 1.0 we called those references. There's a reason scholarly papers and books have hundreds of footnotes; they're the Paper 1.0 version of linking. Reference works like the Science Citation Index are a public tagging facility for scholars.

Was information hard to find? Not really: any college grad should have the skills to search a journal for articles and to follow a reference/citation chain. Has Google made it faster and easier? Of course. The other day I managed to find a paper on response rates in online course evaluations in an obscure British journal in about 20 minutes using Google (Brit. J. Educ. Tech., V38, N6, 2007, 1085-1101), but it's not as if it wasn't possible before. It just would have taken a few more hours of search time, then a wait for my library to request a copy of the article from another library. Slower, but not impossible. (I'm still not going back to Paper 1.0.)

What's changing is the democratization of this process. To get a journal article or book published is a significant undertaking, involving things like credential checks, peer review and the like. This is gradually getting replaced by thousands of people tagging, digging and otherwise marking any given article with their own set of keywords. Democratization is good, right?

I'm teaching a new course in the fall: the history of the world through materials science. Like Wesch, I can set up a home page that will feed interesting stories about materials in our world to a single location for my students. This will bring in lots of interesting topics to discuss: lead paint in toys, concrete fatigue in bridges, how mercury in vaccines causes autism and so forth. This will certainly help my students tremendously.

Except that study after study shows that mercury doesn't cause autism. There is an active, very loud group that, despite all the scientific evidence (and there are mountains of it), believes this link to be true and has written thousands of web pages detailing their beliefs. A Google search for mercury will find these pages near the top: lots of people link to them, driving up their PageRank, they are often dugg, and so on. Yet as far as serious research can show, it's simply not true. But people searching for this information won't find the careful, peer-reviewed studies, because those are long, hard to read, and full of jargon. Even the newspaper articles that cite the journal articles are fairly dull; it's much more interesting to read about poor Stevie, who suddenly changed personality after getting a vaccination, and the trials his parents endure to care for him.

So the blogosphere gets this wrong, as it does many science questions where a handful of loudmouths can distort the issue to make it appear that there is controversy where none really exists. How do we handle this properly? One way is to rely on experts to make the call about how an argument should be tagged, but then we're back to the Paper 1.0 days, where the indexing is controlled by a small elite.

The information is out there. But which information is actually good?
