One of the most disruptive developments of Web 2.0 has been Wikipedia – displacing the Encyclopedia Britannica as the go-to online reference, forging strategic partnerships, and building – for better and worse – an editorial community.
It has become one of the more dependable sources of information on the Internet, and while imperfect, the editors have collectively been a part of an evolution of verification and quality control that has made Wikipedia a staple.
It has apparently also been part of the training data of the large language models we have come to know over the past months, such as ChatGPT and Google’s Bard – which is interesting given how much volunteer work went into creating Wikipedia, something that makes me wonder if Wikimedia could be a part of the lawsuit.
This is pure speculation on my part, but given how much collective effort has gone into Wikimedia’s many projects, and given that its mission is clearly about bringing free educational content to the world, large language models charging subscribers for access to that content is something that might be worth a bit of thought.
On a conference call in March that focused on A.I.’s threats to Wikipedia, as well as the potential benefits, the editors’ hopes contended with anxiety. While some participants seemed confident that generative A.I. tools would soon help expand Wikipedia’s articles and global reach, others worried about whether users would increasingly choose ChatGPT — fast, fluent, seemingly oracular — over a wonky entry from Wikipedia. A main concern among the editors was how Wikipedians could defend themselves from such a threatening technological interloper. And some worried about whether the digital realm had reached a point where their own organization — especially in its striving for accuracy and truthfulness — was being threatened by a type of intelligence that was both factually unreliable and hard to contain.
One conclusion from the conference call was clear enough: We want a world in which knowledge is created by humans. But is it already too late for that?
John Gertner, “Wikipedia’s Moment of Truth”, New York Times Magazine, July 18th, 2023, updated July 19th, 2023.
It is a quandary, that’s for sure. Speaking for myself, I prefer having citations on a Wikipedia page that I can research on my own – at least some of us still trample our way through footnotes – and large language models cite nothing, which is the main problem I have with them.
In the facts category, I would say Wikipedia should win.
Unfortunately, time and again, the world has demonstrated that facts are sometimes a liability when selling a story, so the concern I have is real.
Yet it could be useful to combine the two somehow.
When I was a kid, we had Encyclopedia Britannica salesmen come to our homes. We had the massive set (both the adult and children’s versions). The problem was that it became dated quickly. That’s one grand thing about the internet: you can update fast and fix what is wrong. But, yes, citing references is key.
I had those Encyclopedia Britannicas at home as well – I used to get in trouble for reading them instead of doing my homework! 🙂