
Trump Team Has Full Meltdown Over CNN Story on ICE-Tracking App

Technology
  • The effects of AI on firms and workers

    Technology
    9 votes
    4 posts
    0 views
    brobot9000@lemmy.world
    Your response is: want to be more productive? Replace the CEO and pointless middle management with AI! Imagine how much money the shareholders would save!
  • 0 votes
    1 post
    3 views
    No one has replied
  • Resurrecting a dead torrent tracker and finding 3 million peers

    Technology
    321 votes
    58 posts
    43 views
    "donating online" Yeah, I suppose any form of payment that you have to keep secret for some reason is a reason to use crypto, though I struggle to imagine needing that if you're not doing something dodgy.
    "avoiding scams for p2p transactions" Wat. Crypto is not good at solving that; it's in fact much, much worse than traditional payment methods. There's a reason scammers always want to be paid in crypto.
    "boycotting the banking system" What specifically are you boycotting? The money that backs your crypto (i.e. what you bought it with) still sits in a bank account somewhere and continues to support the banks. All you're boycotting, then, are payments, but those are usually free for consumers (many banks lose money on them), so you're not exactly "sticking it to the man" by not using them. Even if you were somehow hurting banks by using crypto, if you think the people who benefit from you using crypto (crypto exchange owners, billionaires who own crypto, etc.) are less evil than government-regulated banks, you're deluded.
    "What about avoiding international payment fees?" You'll spend more money using crypto for that, not less.
  • 284 votes
    134 posts
    37 views
    I'm not afraid of that at all. But if you draw shit tons of power from a crappy socket, things start to heat up real quick. Like getting really fucking hot, as in burn your house down hot.
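The "heat up real quick" point is just Joule heating at the socket contact: the power dissipated in the contact itself is P = I²R, so a worn or corroded contact with higher resistance turns the same current into much more heat. A minimal sketch, with illustrative (assumed, not measured) resistance values:

```python
# Resistive heating in a power-socket contact: P = I^2 * R.
# The resistance figures below are illustrative assumptions, not measurements.

def contact_heat_watts(current_a: float, contact_resistance_ohm: float) -> float:
    """Heat dissipated in the socket contact itself, in watts."""
    return current_a ** 2 * contact_resistance_ohm

# A sound contact might be on the order of 0.01 ohm; a worn or corroded one
# can easily be 10x higher, and the heat scales linearly with the resistance.
good = contact_heat_watts(16.0, 0.01)  # 2.56 W: barely warm
bad = contact_heat_watts(16.0, 0.1)    # 25.6 W: soldering-iron power, inside plastic
```

Because the heat also scales with the square of the current, a high-draw appliance on a marginal socket concentrates tens of watts in a part designed to shed almost none.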
  • 180 votes
    13 posts
    5 views
    There is a huge difference between an algorithm that uses real-world data to produce a score a panel of experts then uses to make a determination, and using an LLM to screen candidates. One has verifiable, reproducible results that can be checked and debated; the other does not. The final call doesn't matter if a computer program using an unknown and unreproducible algorithm screens you out before it. This is what we are facing: pre-determined decisions that no human being is held accountable for. Is this happening right now? Yes it is, without a doubt. People are no longer making many of the healthcare decisions that determine insurance coverage; computers that are not accountable are. You may have some ability to disagree, but for how long? Soon there will be no way to reach a human about an insurance decision. This is already happening, and people should be very anxious. Hearing that United Healthcare has been forging DNRs and denying things like stroke treatment for elders is disgusting. We have major issues that are not going away, and we are blatantly ignoring them.
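The distinction the comment draws can be made concrete: a transparent scoring rule is deterministic and itemizable, so a review panel can recompute and debate every contribution to the score, whereas an opaque LLM screen offers neither property. A minimal sketch, using hypothetical criteria and weights:

```python
# A transparent, auditable scoring rule (criteria and weights are hypothetical).
# Deterministic: the same input always yields the same score, and every
# weighted contribution can be itemized for a review panel to debate.

WEIGHTS = {"years_experience": 2.0, "certifications": 1.5, "referrals": 1.0}

def audit_score(candidate: dict) -> float:
    """Sum of weighted criteria; each term is individually checkable."""
    return sum(WEIGHTS[k] * candidate.get(k, 0) for k in WEIGHTS)

applicant = {"years_experience": 4, "certifications": 2, "referrals": 1}
score = audit_score(applicant)  # 2.0*4 + 1.5*2 + 1.0*1 = 12.0
```

An LLM-based screen, by contrast, gives no per-criterion breakdown to check, and its output can vary across runs on identical input, which is exactly the reproducibility gap the comment is about.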
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes
    22 posts
    20 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was a demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when it began to clearly show high potential for profitability, AMD was near bankrupt and very hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development platform CUDA was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. It spent a decade trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM’s total revenue), and never managed to catch up despite having the better production process at the time. That was Intel's main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, it tried to do it with x86 chips; one of its boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; in fact, with increased profits, they only grew bolder in their efforts, making it even harder to catch up.

    Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better in order to attract customers, and that's a very tall order against an Nvidia that never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. And the chips will probably be made by the Chinese foundry SMIC, because China is also barred from advanced production in the West, most notably at TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have massive talent and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
  • 12 votes
    7 posts
    12 views
    Sure, he wasn't an engineer, so no, Jobs never personally "invented" anything. But Jobs at least knew what was good and what was shit when he saw it. Under Tim Cook, Apple just keeps putting out shitty, unimaginative products. Cook is allowing Apple to stagnate, a dangerous thing to do when they have under 10% market share.
  • *deleted by creator*

    Technology
    0 votes
    4 posts
    13 views
    I feel like I'm back in the "You really want a 3D TV, right? Right? 3D is what you've been waiting for, right?" years all over again, but with a different technology. It will be VR's turn again next. I admit I'm really rooting for affordable, real-world, daily-use AR though.