Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
wrote on 23 June 2025, 11:13 (last edited):
Elon Musk, like most pseudo-intellectuals, has a very shallow understanding of things. Human knowledge is full of holes, and those holes cannot simply be resolved through logic, as Mush the dweeb imagines.
-
Just because Wikipedia offers a list of references doesn't mean that those references reflect what knowledge is actually out there. Wikipedia is trying to be academically rigorous without any of the real work. A big part of doing academic research is reading articles and studies that are wrong, or that merely prove the null hypothesis. That's why we need experts, not just an AI to regurgitate information. Wikipedia is useful if people understand its limitations; I think a lot of people don't, though.
wrote on 23 June 2025, 11:18 (last edited):
For sure. Wikipedia is for the most basic subjects, or the first step of doing any research (it can still offer helpful sources). It's for basic stuff, or a quick glance at something for conversation.
-
That was my first impression, but then it shifted into "I want my AI to be the shittiest of them all".
wrote on 23 June 2025, 11:33 (last edited):
Why not both?
-
asdf
wrote on 23 June 2025, 11:35 (last edited):
So what would you consider to be a trustworthy source?
-
wrote on 23 June 2025, 11:37 (last edited):
He's been frustrated for years by the fact that he can't make Wikipedia "tell the truth". This will be his attempt to replace it.
-
wrote on 23 June 2025, 12:04 (last edited):
I never would have thought it possible that a person could be so full of themselves as to say something like that.
-
The plan to "rewrite the entire corpus of human knowledge" with AI sounds impressive until you realize LLMs are just pattern-matching systems that remix existing text. They can't create genuinely new knowledge or identify "missing information" that wasn't already in their training data.
wrote on 23 June 2025, 12:16 (last edited):
To be fair, your brain is a pattern-matching system.
When you catch a ball, you're not doing the physics calculations in your head - you're making predictions based on an enormous quantity of input. Unless you're being very deliberate, you're not thinking before you speak every word - your brain's predictive processing takes over, and you often literally speak before you think.
Fuck LLMs - but I think it's a bit wild to dismiss the power of a sufficiently advanced pattern-matching system.
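For what "pattern matching that remixes existing text" looks like in its crudest possible form, here is a toy sketch (hypothetical and purely illustrative - a bigram chain, nothing like a real LLM): it can only ever emit word pairs it has already seen, which is the simplified sense in which the comment above calls these systems remixers.

```python
# A toy bigram "remixer", purely illustrative: it can only ever output word
# pairs it has already observed in its tiny training text. It is obviously
# not how a real LLM works; it just makes the "remix" idea concrete.
import random
from collections import defaultdict

corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Record which words have been observed to follow which.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def remix(start="the", length=8, seed=42):
    """Stitch together a short string using only observed word pairs."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        options = following.get(out[-1])
        if not options:  # dead end: this word was never followed by anything
            break
        out.append(random.choice(options))
    return " ".join(out)

print(remix())  # every adjacent word pair in the output already exists in `corpus`
```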
-
I keep a partial local copy of Wikipedia on my phone and backup device with an app called Kiwix. Great if you need access to certain items in remote areas with no access to the internet.
wrote on 23 June 2025, 12:26 (last edited):
They may laugh now, but you're gonna kick ass when you get isekai'd.
-
For sure. Wikipedia is for the most basic subjects, or the first step of doing any research (it can still offer helpful sources). It's for basic stuff, or a quick glance at something for conversation.
wrote on 23 June 2025, 12:29 (last edited by warl0k3@lemmy.world):
This very much depends on the subject, I suspect. For math or computer science, Wikipedia is an excellent source, and the credentials of the editors maintaining those areas are formidable (to say the least). Their explanations of the underlying mechanisms are, in my experience, a little variable in quality, but I haven't found one that's even close to outright wrong.
-
wrote on 23 June 2025, 13:32 (last edited):
Unironically Orwellian.
-
wrote on 23 June 2025, 13:57 (last edited):
I'm just seeing it bake the lies in.
-
Wikipedia is not a trustworthy source of information for anything regarding contemporary politics or economics.
Wikipedia presents the views of reliable sources on notable topics. The trick is what sources are considered "reliable" and what topics are "notable", which is why it's such a poor source of information for things like contemporary politics in particular.
wrote on 23 June 2025, 14:30 (last edited by aaron@infosec.pub):
asdf
-
wrote on 23 June 2025, 14:33 (last edited by aaron@infosec.pub):
asdf
-
I never would have thought it possible that a person could be so full of themselves as to say something like that.
wrote on 23 June 2025, 14:37 (last edited):
An interesting thought experiment: I think he's full of shit, you think he's full of himself. Maybe there's a "theory of everything" here somewhere. E = shit squared?
-
wrote on 23 June 2025, 15:04 (last edited):
I remember when I learned what "corpus" meant too.
-
Elon Musk, like most pseudo-intellectuals, has a very shallow understanding of things. Human knowledge is full of holes, and those holes cannot simply be resolved through logic, as Mush the dweeb imagines.
wrote on 23 June 2025, 15:12 (last edited):
Uh, just a thought. Please pardon me - I'm not an Elon shill, I just think the phrasing of your argument is off.
How would you know there are holes in understanding without logic? How would you remedy gaps in human knowledge without applying logic to check whether things are consistent?
-
asdf
wrote on 23 June 2025, 15:12 (last edited):
Again, read the rest of the comment. Wikipedia very much repeats the views of reliable sources on notable topics - most of the fuckery is in deciding what counts as "reliable" and "notable".
-
wrote on 23 June 2025, 15:15 (last edited by aaron@infosec.pub):
asdf
-
Uh, just a thought. Please pardon me - I'm not an Elon shill, I just think the phrasing of your argument is off.
How would you know there are holes in understanding without logic? How would you remedy gaps in human knowledge without applying logic to check whether things are consistent?
wrote on 23 June 2025, 15:31 (last edited by andros_rex@lemmy.world):
You have to have data to apply your logic to.
If it is raining, the sidewalk is wet. Does that mean if the sidewalk is wet, that it is raining?
There are domains of human knowledge that we will never have data on. There’s no logical way for me to 100% determine what was in Abraham Lincoln’s pockets on the day he was shot.
When you read real academic texts, you'll notice there is always a "this suggests that," "we can speculate that," and so on. The real world is not straight math and binary logic. The closest fields might be physics, and chemistry to a lesser extent, but even then, theoretical physics must be backed by experimentation and data.
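To spell out the sidewalk example with numbers, here is a quick Bayes calculation (the probabilities below are made up purely for illustration): even if rain always wets the sidewalk, a wet sidewalk on its own is only weak evidence of rain.

```python
# Toy Bayes calculation with made-up numbers, just to spell out the point:
# even if rain *always* wets the sidewalk, a wet sidewalk does not imply rain,
# because other causes (sprinklers, street cleaning) also wet sidewalks.
p_rain = 0.10                 # assumed prior probability that it rained
p_wet_given_rain = 1.00       # rain always wets the sidewalk
p_wet_given_dry = 0.30        # assumed: sprinklers, washing, spills, etc.

p_wet = p_wet_given_rain * p_rain + p_wet_given_dry * (1 - p_rain)
p_rain_given_wet = p_wet_given_rain * p_rain / p_wet

print(f"P(rain | wet sidewalk) = {p_rain_given_wet:.2f}")  # 0.27 with these numbers, not 1.0
```

With these made-up numbers the answer is about 0.27, far from certainty, which is the point: the direction of the implication matters, and you need data about the other causes to say more.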
-
wrote on 23 June 2025, 15:32 (last edited):
The thing that annoys me most is that there have been studies done on LLMs showing that, when trained on subsets of their own output, they produce increasingly noisy output.
Sources (unordered):
- What is model collapse?
- AI models collapse when trained on recursively generated data
- Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
- Collapse of Self-trained Language Models
Whatever nonsense Muskrat is spewing, it is factually incorrect. He won't be able to successfully retrain any model on generated content. At least, not an LLM if he wants a successful product. If anything, he will be producing a model that is heavily trained on censored datasets.
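As a rough illustration of the loop those papers describe, here is a toy sketch with a Gaussian standing in for a model (an assumption made purely for illustration, not a claim about Grok or any real LLM): repeatedly refitting on your own samples compounds estimation error from generation to generation.

```python
# A cartoon of the self-consuming training loop described in the sources above:
# fit a simple model (here just a Gaussian) to data, sample synthetic data from
# the fit, refit on those samples, and repeat. Because each generation trains on
# a finite sample of the previous generation's output, estimation error compounds
# and the fitted distribution tends to drift and lose the tails of the original
# data. This is a toy, not a claim about any real LLM.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=200)   # generation 0: "human" data

for generation in range(10):
    mu, sigma = data.mean(), data.std()           # "train" on the current data
    print(f"gen {generation}: mean={mu:+.3f}  std={sigma:.3f}")
    data = rng.normal(mu, sigma, size=200)        # next generation trains on model output
```

Run long enough, the fitted spread tends to wander and shrink, a crude analogue of the degradation the model-collapse papers report in far more rigorous settings.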
-