
How AI-generated content is increasing the workload of Wikipedia editors


As AI-generated slop takes over ever-larger swaths of the user-generated Web thanks to the rise of large language models (LLMs) like OpenAI's GPT, spare a thought for Wikipedia editors. On top of their usual work of reverting bad human edits, they now have to spend a growing share of their time removing AI filler.

404 Media spoke with Ilyas Lebleu, an editor of the collaborative encyclopedia and a founding member of the "WikiProject AI Cleanup" project. The group is trying to establish best practices for detecting machine-generated contributions. (And no, before you ask, AI is useless for this.)

A particular problem with AI-generated content in this context is that it is almost always poorly sourced. The ability of LLMs to instantly produce reams of plausible-sounding text has even led to entire fake entries being uploaded in an attempt to slip hoaxes past Wikipedia's human experts.
