How AI-generated content is increasing the workload of Wikipedia editors


mvi" srcset="mvi 640w, 9Tq 300w, dUg 768w, GH9 630w, 89f 681w, lAT 1024w" sizes="(max-width: 640px) 100vw, 640px" alt="" title="GettyImages-2174856928.jpg"/>

As AI-generated slop takes over ever-larger swaths of the user-generated web thanks to the rise of large language models (LLMs) like OpenAI's GPT, spare a thought for Wikipedia's editors. On top of their usual job of weeding out bad human edits, they now have to spend a growing share of their time removing AI filler.

404 Media spoke with Ilyas Lebleu, an editor at the crowdsourced encyclopedia, who helped found the "WikiProject AI Cleanup" project. The group is trying to develop best practices for detecting machine-generated contributions. (And no, before you ask, AI is useless for this.)
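Since automated AI detectors are unreliable, the sort of detection that does work tends to be far more mundane: editors look for the boilerplate phrasing LLMs habitually leave behind. A minimal sketch of that idea follows; the phrase list and the flag_suspect_text helper are hypothetical illustrations, not actual WikiProject AI Cleanup tooling.

```python
# Illustrative sketch: flag contributions containing stock phrases
# that LLM output tends to include. The phrase list is an assumption
# for demonstration purposes, not a vetted or exhaustive catalog.

TELLTALE_PHRASES = [
    "as an ai language model",
    "as of my last knowledge update",
    "i hope this helps",
    "it is important to note that",
]

def flag_suspect_text(text: str) -> list[str]:
    """Return any telltale phrases found in a contribution."""
    lowered = text.lower()
    return [phrase for phrase in TELLTALE_PHRASES if phrase in lowered]

if __name__ == "__main__":
    edit = "As of my last knowledge update, the fortress was built in 1466."
    hits = flag_suspect_text(edit)
    if hits:
        print(f"Possibly machine-generated; matched: {hits}")
```

A heuristic like this only catches the laziest copy-pastes, of course, which is why the cleanup work remains largely manual.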

A particular problem with AI-generated content in this context is that it is almost always improperly sourced. The ability of LLMs to instantly produce reams of plausible-sounding text has even led to entire fake entries being uploaded in a bid to sneak hoaxes past Wikipedia's human experts.




Y4f">Source link
