
Half of companies planning to replace customer service with AI are reversing course

Technology
171 votes
102 posts
0 views
  • Science and Technology News and Commentary: Aardvark Daily

    Technology technology
    2
    7 votes
    2 posts
    0 views
    What are you on about with this? Last news post 2013?
  • 57 votes
    5 posts
    0 views
    avidamoeba@lemmy.ca
    [image: c1b6d049-afed-4094-a09b-5af6746c814f.gif]
  • Unionize or die - Drew DeVault

    Technology technology
    3
    75 votes
    3 posts
    3 views
    And hopefully also elsewhere. As Drew said in the first part, tech workers will be affected by billionaires’ decisions even outside of work, on multiple fronts. We must eat the rich, or they will eat us all alive.
  • How the US is turning into a mass techno-surveillance state

    Technology technology
    66
    1
    483 votes
    66 posts
    16 views
    Are these people retarded? Did they forget Edward Snowden?
  • Whatever happened to cheap eReaders? – Terence Eden’s Blog

    Technology technology
    72
    1
    125 votes
    72 posts
    5 views
    This is a weirdly aggressive take that doesn’t consider the variables. Almost petulant, even. 6” readers are relatively cheap no matter the brand, but cost goes up with size: $250 to $300 is what a 7.8” or 8” reader costs, and there isn’t a single 6” model I know of at that price. There are also 10” and 13” models. Are you saying they should all cost the same as a Kindle?

    Not to mention that Amazon spent years building the Kindle brand while selling the devices at cost, or possibly even at a loss, because they make their money on book sales. Companies that can’t do that tend to charge more.

    Lastly, it’s not “feature creep” to improve the devices over time; many of the changes are quality-of-life improvements. Larger displays for those who want them. Frontlit displays, and later the addition of warm lighting. Displays essentially doubled their resolution, allowing crisper fonts and custom fonts to render well. Higher-contrast displays with darker blacks for text. More recently, color displays as an option. This is all progress, but it isn’t free. Also, inflation is a thing, generally running at 2% to 3% annually during “normal” times, and we’ve hardly been living in normal times over the last decade and a half.
  • Why doesn't Nvidia have more competition?

    Technology technology
    22
    1
    33 votes
    22 posts
    2 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market. AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high profit potential started to become clear, AMD was near bankrupt and very hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and were moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development platform, CUDA, was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM’s total revenue), and never managed to catch up despite having the better production process at the time. That was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips; one of their boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because, as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; with increased profits they only grew bolder in their efforts, making it even harder to catch up. Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better in order to attract customers, and that’s a very tall order against a company that never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. The chips will probably be made by the Chinese foundry SMIC, because China is also prevented from using advanced fabrication in the West, most notably at TSMC. China will prevail, because this has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on production and design too.
  • You Can Choose Tools That Make You Happy

    Technology technology
    1
    1
    30 votes
    1 post
    2 views
    Nobody has replied
  • Stack overflow is almost dead

    Technology technology
    5
    0 votes
    5 posts
    2 views
    ineedmana@lemmy.world
    Re “students”: When I was a student, I despised the idea of typeless var in C#. Then, a few years later at my day job, I fully embraced C++ auto. I understand the frustration, but unfortunately being wrong is part of learning.
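    A minimal sketch of the kind of type deduction the comment refers to, using C++ auto (C# var behaves analogously); the container and variable names here are purely illustrative:

    ```cpp
    #include <map>
    #include <string>
    #include <vector>

    int main() {
        std::map<std::string, std::vector<int>> scores{{"alice", {90, 85}}};

        // Without auto, the iterator type must be spelled out in full.
        std::map<std::string, std::vector<int>>::iterator it1 = scores.find("alice");

        // With auto, the compiler deduces exactly the same type from the
        // initializer, so the declaration stays correct even if the
        // container type changes later.
        auto it2 = scores.find("alice");

        // Both variables are still statically typed; "typeless" only means
        // the type is written by the compiler rather than the programmer.
        return (it1 == it2) ? 0 : 1;
    }
    ```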