I disagree. The focus of the article is misinformation. It’s literally the first sentence:
The latest model of ChatGPT has begun to cite Elon Musk’s Grokipedia as a source on a wide range of queries, including on Iranian conglomerates and Holocaust deniers, raising concerns about misinformation on the platform.
Yes, Grokipedia is right-wing. It was literally created to alter reality and spread lies that agree with their worldview! But the real problem is that it can't be edited with sourced, fact-based information; instead, AI generates everything. I think the article did explore the fact that it's one LLM depending on another…
The whole article is fixated on Grok being far right and never seems to care that an LLM is citing another LLM instead of an actual source.
If there was ever a difference between being far-right and being disinformation, there isn’t one anymore.
Half the web is going to be another LLM soon.