
Intel faces investor backlash for selling 10% stake to Trump admin at discount

Technology
  • 410 votes
    78 posts
    126 views
    My simple rule is that if it uses a neural network model of some kind, then it can be accurately called AI.
  • A letter to the CalyxOS community

    Technology
    39 votes
    5 posts
    28 views
    01189998819991197253@infosec.pub
    So they're conducting security audits in line with best practices after the executive terminations, and, as a result, security updates are being delayed by up to six months. They're also recommending alternatives in the meantime for anyone who may be targeted (human rights activists, for example). Considering the reasoning, I can't hold this against them. And considering that many people are still rocking Android 10 or iOS 15, this update pause (hopefully just a pause) isn't the worst thing in the world.
  • 739 votes
    439 posts
    5k views
    But are FOSS spirit and asshole users the same thing? On that, I disagree.
  • VMware’s rivals ramp efforts to create alternative stacks

    Technology
    77 votes
    10 posts
    139 views
    I don't use any GUI... I use Terraform in the terminal or via CI/CD. There is an API (see the sketch below) and also a Terraform provider for Proxmox, and I can use those together with Ansible and shell scripts to manage VMs, but I was looking for k8s support. Again, it works fine for small environments with a bit of manual work and human intervention, but for larger ones I need a bit more, so I moved away from a few VMs acting as k8s nodes to k8s as a service (at work).
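
    The Proxmox API mentioned above lends itself to scripting. A minimal sketch in Python, assuming the third-party proxmoxer client; the host, node name, and credentials are placeholders, not anything from the comment:

    ```python
    # Minimal sketch: manage Proxmox VMs over its REST API via proxmoxer.
    # "pve.example.com", "pve1", and the credentials are hypothetical.
    from proxmoxer import ProxmoxAPI

    proxmox = ProxmoxAPI(
        "pve.example.com",       # hypothetical Proxmox host
        user="automation@pam",   # hypothetical API user
        token_name="ci",         # hypothetical API token name
        token_value="...",       # token secret goes here
        verify_ssl=True,
    )

    # List QEMU VMs on one node and start any that are stopped.
    for vm in proxmox.nodes("pve1").qemu.get():
        if vm["status"] == "stopped":
            proxmox.nodes("pve1").qemu(vm["vmid"]).status.start.post()
    ```

    The same calls slot into Ansible or a CI/CD job just as easily as the Terraform provider does; what none of them give you out of the box is the k8s layer the commenter was after.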
  • The Decline of Usability: Revisited | datagubbe.se

    Technology
    0 votes
    2 posts
    30 views
    2xsaiko@discuss.tchncs.de
    Just saw this article linked in a ThePrimeagen video. I didn't watch the video, but I did read the article, and it says exactly what I'm always saying when I complain about current UI trends and why I'm so picky about the software I use, and about the tools I use to write software. I shouldn't have to be picky, but it seems like developers (professional and hobbyist alike) don't care anymore, and users don't have standards.
  • 0 votes
    1 post
    18 views
    No one has replied
  • 45 votes
    7 posts
    80 views
    I still get calls, but I can't see details (e.g. I see only the phone number, not the caller's name).
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes
    22 posts
    279 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability was becoming clear, AMD was near bankruptcy and hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and were moderately successful from a technology perspective, but Nvidia already had a head start, and their proprietary development system, CUDA, was an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM’s total revenue), yet never managed to catch up even though they had the better production process at the time. That was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they did it with x86 chips; one of their boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because, as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU (see the sketch below), which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; with increased profits they only grew bolder in their efforts, making it even harder to catch up. AMD has had more money to compete for a while now, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better in order to attract customers, which is a very tall order against a company that simply seems to never stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. And the chips will probably be made by China’s SMIC, because they are also barred from advanced production in the West, most notably at TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have a massive talent pool and vast resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
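
    To make "massively parallel" concrete: a GPU runs the same small function across thousands of threads at once, which is exactly the programming model CUDA exposes. A minimal sketch in Python, assuming Numba's CUDA support and a CUDA-capable GPU; the array size and launch configuration are illustrative, not from the comment:

    ```python
    # Minimal sketch: a CUDA kernel launched over ~1M elements in parallel.
    # Requires the numba package and a CUDA-capable GPU (assumptions,
    # not anything stated in the comment above).
    import numpy as np
    from numba import cuda

    @cuda.jit
    def vector_add(a, b, out):
        i = cuda.grid(1)          # this thread's global index
        if i < out.size:          # guard against out-of-range threads
            out[i] = a[i] + b[i]

    n = 1_000_000
    a = np.ones(n, dtype=np.float32)
    b = np.ones(n, dtype=np.float32)
    out = np.zeros(n, dtype=np.float32)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    vector_add[blocks, threads_per_block](a, b, out)  # one thread per element
    ```

    Each of the million additions gets its own GPU thread; a CPU would loop over them a handful at a time, which is the gap the comment describes.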