SpaceX's Starship blows up ahead of 10th test flight

Technology
  • 226 votes · 52 posts · 0 views
    Yeah, hence the qualifications haha
  • 57 votes · 5 posts · 2 views
    avidamoeba@lemmy.ca
  • Using A Videocard As A Computer Enclosure

    Technology
    86 votes · 5 posts · 3 views
    Back in the day there was a pic floating around where someone had put a micro-ATX board and PSU into a standard PSU chassis inside a standard PC case, for a spectacular "empty case" mod
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes · 22 posts · 6 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it heavily because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when it began to clearly show high potential for profitability, AMD was near bankruptcy and very hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development platform CUDA was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile crown, investing billions, roughly the equivalent of ARM’s total revenue, and never managed to catch up despite having the better production process at the time. That was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips; one of their boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD. Yet despite the lack of competition, Nvidia did not slow down; with increased profits they only grew bolder in their efforts, making it even harder to catch up.

    Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better in order to attract customers. That’s a very tall order against Nvidia, which simply seems to never stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei and China in general from using both AMD and Nvidia AI chips. The chips will probably be made by China’s SMIC, because they are likewise barred from advanced production in the West, most notably at TSMC. China will prevail, because it’s become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
  • 13 votes · 22 posts · 6 views
    You might enjoy this blog post someone linked in another thread earlier today https://www.wheresyoured.at/the-era-of-the-business-idiot/
  • 148 Stimmen
    8 Beiträge
    2 Aufrufe
    L
    Whenever these things come up you always hear "then the company won't survive!" CEOs and managers make bank somehow, but it doesn't matter that the workers can't live on that wage. It's always so weird how, when workers actually take a pay cut, the business gets used to it. If CEOs lost their bonuses, they would have to get used to that too.
  • Meta Reportedly Eyeing 'Super Sensing' Tech for Smart Glasses

    Technology
    34 votes · 4 posts · 4 views
    I see your point, but I also just genuinely don't have a mind for that stuff. Even with my own close friends and family, it never pops into my head to ask about the vacation they just got back from or what their kids are up to. I rely on social cues from others, mainly my wife, to sort of kick-start my brain.

    I just started a new job. I can't remember who said they were into fishing and who didn't, and now it's anxiety-inducing to try to figure out who is who. Or they ask me a friendly question and I get caught up answering, and when I'm done I forget to ask it back to them (because frequently asking someone about their weekend or kids or whatever is their way of getting to share their own life with you, but my brain doesn't think that way).

    I get what you're saying. It could absolutely be used for performative interactions, but for some of us, people drift away because we aren't good at being curious about them or remembering details like that. And I have to sit through awkward lunches at work where no one really knows what to talk about or ask about, because outside of work we are completely alien to one another.

    And it's fine. It wouldn't be worth the damage it does. I have left behind all personally identifiable social media for the same reason. But I do hate how social anxiety and ADHD make friendship so fleeting.
  • 0 votes · 3 posts · 5 views
    I deleted Snapchat now.