Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok
-
Thanks I've never heard of data. And I've never read an academic text either. Condescending pos
So, while I'm ironing out your logic for you, "what else would you rely on, if not logic, to prove or disprove and ascertain knowledge about gaps?"
You asked a question, I gave an answer. I’m not sure where you get “condescending” there. I was assuming you had read an academic text, so I was hoping that you might have seen those patterns before.
You would look at the data for gaps, as my answer explained. You could use logic to predict some gaps, but not all gaps would be predictable. Mendeleev was able to use logic and patterns in the periodic table to predict the existence of germanium and other elements, which data confirmed, but you could not logically derive the existence of protons, electrons and neutrons without the later experiments of, say, J. J. Thomson and Rutherford.
You can’t just feed the sum of human knowledge into a computer and expect it to know everything. You can’t predict “unknown unknowns” with logic.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

"We'll fix the knowledge base by adding missing information and deleting errors - which only an AI trained on the fixed knowledge base could do."
-
Wasn't he the children's author who published a book about talking animals learning the value of hard work or something?
That'd be esteemed British author Georgie Orrell, author of such whimsical classics as "Now the Animals Are Running The Farm!", "My Big Day Out At Wigan Pier" and, of course, "Winston's Zany Eighties Adventure".
-
Seconds after the last human being dies, the Wikipedia page is updated to read:
Humans (Homo sapiens) or modern humans were the most common and widespread species of primate
And then 30 seconds after that it'll get reverted because the edit contains primary sources.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

Spoiler: He's gonna fix the "missing" information with MISinformation.
-
Spoiler: He's gonna fix the "missing" information with MISinformation.
She sounds Hot
-
She sounds Hot
She unfortunately can't see you because of financial difficulties. You gotta give her money like I do. One day, I will see her in person.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

I wonder how many papers he's read since ChatGPT's release about how bad it is to train AI on AI output.
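The "training AI on AI output" problem (often called model collapse) has a standard toy illustration: repeatedly fit a distribution to samples drawn from the previous generation's fit, and watch estimation noise compound until the tails vanish. A minimal sketch, using a toy Gaussian rather than any real training pipeline (all names and parameters here are made up for illustration):

```python
import random
import statistics

def collapse_demo(generations=200, sample_size=5, seed=42):
    """Toy 'model collapse' illustration: each generation is a Gaussian
    fitted only to samples from the previous generation's Gaussian.
    Small-sample estimation noise compounds, and the spread decays
    toward zero, i.e. the distribution's tails disappear."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: stands in for the "real data"
    history = [sigma]
    for _ in range(generations):
        # "Train" the next model only on the previous model's output.
        samples = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        history.append(sigma)
    return history

hist = collapse_demo()
print(f"sigma: start={hist[0]:.3f}, end={hist[-1]:.3e}")
```

The shrinking sigma is the point: even with an unbiased estimator at each step, self-training drifts away from the original distribution, which is why retraining a model on its own "corrected" corpus is a questionable plan.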
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

Hmm... this doesn't sound great.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

Delusional and grasping for attention.
-
That's not what I said. It's absolutely dystopian how Musk is trying to tailor his own reality.
What I did say (and I've been doing AI research since the AlexNet days...) is that LLMs aren't old-school ML systems, and we're at the point where simply scaling up to insane levels has yielded results that no one expected, but it was the lowest-hanging fruit at the time. Few-shot learning -> novel space generalization is very hard, so the easiest method was just to take what is currently done and make it bigger (à la ResNet back in the day).
Lemmy is almost as bad as reddit when it comes to hiveminds.
You literally called it borderline magic.
Don't do that? They're pattern recognition engines, they can produce some neat results and are good for niche tasks and interesting as toys, but they really aren't that impressive. This "borderline magic" line is why they're trying to shove these chatbots into literally everything, even though they aren't good at most tasks.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::

Grandiose delusions from a ketamine-rotted brain.
-
So just making shit up.
Don't forget the retraining on the made up shit part!
-
I mean, this is the same guy who said we'd be living on Mars in 2025.
In a sense, he's right. I miss good old Earth.