Exactly. Even if people don't publish information for money, a lot of them do it for "glory" for lack of a better term. Many people like being the "go to expert" in some particular field.
LLMs do away with that. 95% of folks aren't going to feel great if all of the time spent producing content is then just "put into the blender to be churned out" by an LLM with no traffic back to the original site.
ChatGPT puts trillions of tokens into human heads per month, and collects extensive logs of problem solving and the outcomes of ideas tested there. This is becoming a new way to circulate experience in society: an experience flywheel. We don't need blogs; we get more truthful and aligned outcomes from human-AI logs.
Blogs have the enormous advantage of being decentralized and harder to manipulate and censor. We get "more truthful and aligned outcomes" from centralized control only so long as your definition of "truth" and "alignment" match the definitions used by the centralized party.
I don't have enough faith in Sam Altman or in all current and future US governments to wish that future into existence.
But wouldn't it disincentivize those who create knowledge?
AFAIK, most highly specific knowledge comes from small communities, where a shared goal and socialization with like-minded individuals are the incentives to keep acquiring and describing knowledge for fellow community members.
Would it really be helpful to put an AI between them?
Second issue: who decides the weights of the sources? This is the reason why every nation must have culturally aligned AIs defending its way of living in the information sphere.