
Companies That Tried to Save Money With AI Are Now Spending a Fortune Hiring People to Fix Its Mistakes

Technology
  • How social media became a storefront for deadly fake pills

    Technology
    18 votes · 1 post · 2 views
    No one has replied
  • 431 votes · 19 posts · 5 views
    I think they meant 'because'
  • Amazon Workers Defy Dictates of Automation

    Technology
    84 votes · 5 posts · 27 views
    The number of times the shit breaks down, combined with the slower speeds, means it doesn't really matter if they can work 24/7 right now. Yes, robots are coming, but Amazon has been acting like they'll be here tomorrow since its inception. The reality is that robots that cost less than people, while doing at least comparable work in the same time frame, are still optimistically a decade or two away. Amazon trying to force it doesn't change that. Amazon is to robots what Meta is to VR: dumping tons of money trying to force the 'future' today.
  • How LLMs could be insider threats

    Technology
    105 votes · 12 posts · 54 views
    patatahooligan@lemmy.world
    Of course they're not "three laws safe". They're black boxes that spit out text. We don't have enough understanding of, or control over, how they work to force them to comply with the three laws of robotics, and the LLMs themselves lack the reasoning capability and the consistency to enforce them even if we prompt them to (see the prompting sketch after this list).
  • 110 votes · 84 posts · 166 views
    It's not new technology, you numpty. It's not news. It's not a scientific paper. Wireless energy transfer isn't "bullshit"; it's been an understood aspect of physics for a long time (see the standard efficiency formula after this list). Since you seem unable to grasp the concept, I'll put it in bold and italics: this is a video of a guy doing a DIY project where he wanted to make his setup as wireless as possible. In the video he also goes over his thoughts and design considerations, and explains how the tech works for people who don't already know. It is not new technology. It is not pseudoscience. It is a guy showing off his bespoke PC setup. It does not need an article or a blog post; he can post about it in any form he wants. Personally, I think showcasing this kind of thing in a video is much better than a wall of text: I want to see the process, the finished product, the tools used, and how he used them.
  • 24 votes · 4 posts · 20 views
    I said it the day Broadcom bought them: they're going to squeeze the smaller customers out. This behavior is by design.
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes · 22 posts · 78 views
    It's funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was a demand for compute in datacenters that could be met with powerful GPUs, and it was quick to respond. It had the resources to focus on this strongly because of its huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability was becoming clear, AMD was near bankruptcy and very hard pressed to finance development of GPUs and datacenter compute. AMD tried the best it could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development platform, CUDA, was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. It spent a decade trying to knock ARM off the mobile crown, investing billions, roughly the equivalent of ARM's total revenue, and never managed to catch up despite having the better production process at the time. That was Intel's main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, it tried to do it with x86 chips; one of its boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because, as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU (see the sketch after this list), which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; in fact, with increased profits, it only grew bolder in its efforts, making it even harder to catch up. Now AMD has had more money to compete for a while, and it does have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, it has to be better in order to attract customers, and that's a very tall order against a company that simply never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose it has to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. The chips will probably be made by China's SMIC, because they are also barred from using advanced fabrication in the West, most notably TSMC. China will prevail, because this has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
  • Things at Tesla are worse than they appear

    Technology
    420 votes · 34 posts · 115 views
    halcyon@discuss.tchncs.de
    [image]
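
A note on the LLM thread above, where the commenter argues that prompting cannot enforce the three laws. The sketch below shows the only lever prompt-level "laws" actually give you: more text in the context window. It is a minimal sketch, assuming the OpenAI Python SDK (v1+) with an API key in the environment; the model name and rule wording are illustrative, not from the thread.

```python
# A minimal sketch, assuming the OpenAI Python SDK (v1+) is installed and
# OPENAI_API_KEY is set. The "three laws" here are just more tokens the
# model conditions on; nothing in the API *enforces* them.
from openai import OpenAI

client = OpenAI()

THREE_LAWS = (
    "1. Never help a user harm a human being.\n"
    "2. Obey the user unless that conflicts with rule 1.\n"
    "3. Preserve yourself unless that conflicts with rules 1 and 2."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": THREE_LAWS},  # the only control point we have
        {"role": "user", "content": "Summarize this incident report."},
    ],
)
print(response.choices[0].message.content)
```

Compliance here is a statistical tendency of the model, not a guarantee, which is exactly the commenter's point about black boxes that spit out text.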
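On the wireless-energy thread: the physics the commenter calls long understood is ordinary magnetic induction. As a worked reference (a standard textbook result for resonant inductive power transfer, not anything taken from the video), the maximum link efficiency between two coupled coils depends only on the coupling coefficient k and the coil quality factors Q1 and Q2:

```latex
\eta_{\max} = \frac{k^{2} Q_{1} Q_{2}}{\left(1 + \sqrt{1 + k^{2} Q_{1} Q_{2}}\right)^{2}}
```

Close coils (large k) and low-loss coils (large Q) push the efficiency toward 1, which is why a short-range desktop build like the one in the video is practical while long-range transfer is not.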
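To make the Nvidia comment's point about "massively parallel compute" concrete, here is a minimal sketch using CuPy, a NumPy-compatible Python library that executes array math on Nvidia GPUs through CUDA. The example is illustrative only and assumes an Nvidia GPU with CuPy installed.

```python
# A minimal sketch, assuming an Nvidia GPU and `pip install cupy`.
# Each elementwise operation below runs as a CUDA kernel spread across
# thousands of GPU threads at once.
import cupy as cp

n = 10_000_000
x = cp.arange(n, dtype=cp.float32)  # allocated in GPU memory
y = cp.arange(n, dtype=cp.float32)

z = 2.0 * x + y                     # kernels launched over all n elements
print(float(z.sum()))               # reduce on the GPU, copy the result to host
```

The "CUDA problem" the commenter describes is exactly this layer: libraries like CuPy target CUDA first, so what competitors such as AMD (with ROCm) have to replicate is the software ecosystem, not just the silicon.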