Wikipedia Bans Use Of AI-Generated Content
Wikipedia published new guidelines prohibiting editors from using LLMs for writing or rewriting content, with two exceptions.
Wikipedia recently published guidelines prohibiting the use of AI to generate or rewrite articles, with two exceptions related to copyediting and translation. The guidelines acknowledge that AI-generated content can't be reliably identified by style signals alone, and they offer no further guidance on how LLM-based content will be detected.
Violation Of Wikipedia's Core Content Policies
The new guidelines state that the use of LLMs violates several of Wikipedia's core content policies, without actually naming them. A look at those policies makes it reasonably clear which are being alluded to: verifiability, the prohibition on original research, and possibly the requirement for a neutral point of view.
The policy on verifiability requires that any content likely to be challenged must be attributable to a reliable, published source that other editors can check. LLMs generate text without explicitly citing sources, and they also tend to hallucinate facts.
The policy on original research states:
“Wikipedia does not publish original thought: all material in Wikipedia must be attributable to a reliable, published source. Articles may not contain any new analysis or synthesis of published material that serves to advance a position not clearly advanced by the sources.”
LLMs, by design, generate a synthesis of published sources. As for neutral point of view, an LLM can place more weight on dominant viewpoints at the expense of minority ones. Most SEOs are aware that asking an LLM about SEO consistently produces answers reflecting the dominant, but not necessarily the most correct, point of view.
The new guidance makes two exceptions:
“Editors are permitted to use LLMs to suggest basic copyedits to their own writing, and to incorporate some of them after human review, provided the LLM does not introduce content of its own. Caution is required, because LLMs can go beyond what you ask of them and change the meaning of the text such that it is not supported by the sources cited.

Editors are permitted to use LLMs to translate articles from another language’s Wikipedia into the English Wikipedia, but must follow the guidance laid out at Wikipedia:LLM-assisted translation.”

As for identifying AI-generated content, the new Wikipedia AI guidelines suggest considering how well the content complies with the core content policies and auditing recent posts by the editor whose edits are under suspicion.
Featured Image by Shutterstock/JarTee
SEJ STAFF Roger Montti, Owner, Martinibuster.com