
CBDC Explained: Can your money really expire?

Technology
  • 110 votes
    31 posts
    0 views
    Thanks. Well, I don’t think OpenAI knows how to build AGI, so that’s false. Otherwise, Sam’s statement there is technically correct, but kind of misleading - he talks about AGI and then, in the next sentence, switches back to AI. Sergey’s claim that they will achieve AGI before 2030 could turn out to be true, but again, he couldn’t possibly know that. I’m sure it’s their intention, but that’s different from reality. Elon’s statement doesn’t even make sense. I’ve never heard anyone define AGI like that. A thirteen-year-old with an IQ of 85 is generally intelligent. Being smarter than the smartest human definitely qualifies as AGI, but that’s just a weird bar. General intelligence isn’t about how smart something is - it’s about whether it can apply its intelligence across multiple unrelated fields.
  • France considers requiring Musk’s X to verify users’ age

    Technology
    142 votes
    20 posts
    0 views
    TBH, age verification services already exist. If this becomes law, integrating one shouldn't be much harder than integrating an OIDC login, so everyone should be able to do it. Depending on the service, you might not even need to give a name at all, or, because the verifier is a separate entity, you at least don't give your name to the platform that uses it (rough sketch below). Other parts of the regulation are more difficult, like the "upload filters" that have to figure out whether something shared via a service violates any copyright before it is made available.
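    Not from the original comment, just a rough sketch of what that could look like, assuming a hypothetical verifier at verifier.example that speaks a standard OIDC-style authorization-code flow and returns nothing but an "over 18" claim; the endpoints, client ID, and claim name are all made up for illustration:

```python
# Hypothetical sketch: age verification through a separate OIDC-style provider.
# "verifier.example", "age_over_18", and the client details are assumptions,
# not a real service or API.
import secrets
import urllib.parse

import requests

AUTH_ENDPOINT = "https://verifier.example/authorize"  # hypothetical
TOKEN_ENDPOINT = "https://verifier.example/token"     # hypothetical
CLIENT_ID = "my-platform"
REDIRECT_URI = "https://my-platform.example/age-callback"


def build_authorization_url() -> tuple[str, str]:
    """Step 1: redirect the user to the verifier; the platform never sees a name."""
    state = secrets.token_urlsafe(16)  # anti-CSRF value, checked on the callback
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "age_over_18",  # request only the age attribute, nothing else
        "state": state,
    }
    return f"{AUTH_ENDPOINT}?{urllib.parse.urlencode(params)}", state


def exchange_code(code: str) -> bool:
    """Step 2: trade the returned code for the single claim that was requested."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
        },
        timeout=10,
    )
    resp.raise_for_status()
    # The verifier asserts the claim; the platform only ever stores a boolean.
    return bool(resp.json().get("age_over_18", False))
```

    Whether the claim comes back in the token response or in an ID token depends on the provider; the point is only that the platform ends up with a yes/no answer rather than personal data.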
  • The FDA Is Approving Drugs Without Evidence They Work

    Technology
    507 votes
    69 posts
    10 views
    Now you've got me curious too. This was my source on Texas: https://www.texasalmanac.com/place-types/town Also, the total number of towns is over 4,000, with only about 3,000 of them unincorporated, so I did get the numbers wrong even for Texas. I had looked at Wikipedia but could not find totals, only lists.
  • My AI Skeptic Friends Are All Nuts

    Technology
    12 votes
    31 posts
    8 views
    I did read it, and my comment is referencing exactly the attitude of the author, which is "it's good enough, so you should use it." I disagree, and say it's another dumbass shortcut to cash-grab on a less-than-stellar ecosystem and product. It's training wheels for failure.
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes
    22 posts
    2 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was a demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when it began to clearly show its high potential for profitability, AMD was near bankrupt and was very hard pressed to finance development of GPUs and datacenter compute. AMD really tried the best they could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development system CUDA was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. After a decade of trying to push ARM off the mobile crown, investing billions, actually the equivalent of ARM’s total revenue, they never managed to catch up to ARM even though they had the better production process at the time. This was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips. One of their boldest efforts was a monstrosity of a cluster of Celeron chips, which of course performed laughably badly compared to Nvidia, because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; in fact, with increased profits, they only grew bolder in their efforts, making it even harder to catch up. Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. So for AMD to really compete with Nvidia, they have to be better to attract customers. That’s a very tall order against Nvidia, which simply seems to never stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei and China in general from using both AMD and Nvidia AI chips. And the chips will probably be made by Chinese SMIC, because they are also prevented from using advanced production in the West, most notably TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips. Now China will soon compete directly on both production and design too.
  • 41 votes
    5 posts
    2 views
    paraphrand@lemmy.world
    Network Effects.
  • 82 votes
    3 posts
    2 views
    sfxrlz@lemmy.dbzer0.com
    As Star Wars yellow text: "In the final days of the Senate, Senator Organa…"
  • 6 votes
    9 posts
    3 views
    Ah yeah, that doesn't look like my cup of tea.