
Grok AI to be available in Tesla vehicles next week, Elon Musk says

Technology
  • 13 votes
    3 posts
    20 views
    tal@lemmy.today
    While details of the Pentagon's plan remain secret, the White House proposal would commit $277 million in funding to kick off a new program called "pLEO SATCOM" or "MILNET."

    Please do not call it "MILNET". That term's already been taken: https://en.wikipedia.org/wiki/MILNET

    "In computer networking, MILNET (fully Military Network) was the name given to the part of the ARPANET internetwork designated for unclassified United States Department of Defense traffic.[1][2]"
  • 47 votes
    13 posts
    74 views
    They don't treat their people like shit, they treat them like slaves, and in countries outside China at that. https://www.bbc.com/news/articles/c3v5n7w55kpo
  • Secure Your Gmail Now As Google Warns Of Password Attacks

    Technology
    53 votes
    9 posts
    42 views
    I tried to, but they wanted to force me to give them my phone number. Fuck them, they don't need it.
  • Authors petition publishers to curtail their use of AI

    Technology
    74 votes
    2 posts
    20 views
    I’m sure publishers are all ears /s
  • Firefox 140 Brings Tab Unload, Custom Search & New ESR

    Technology
    234 votes
    41 posts
    168 views
    Read again. I quoted something along the lines of "just as much a development decision as a marketing one" and I said it wasn't a development decision, so what's left?

    As for "Firefox released just as frequently before, they just didn't increase the major version that often": this does not appear to be true. Why don't you take a look at the version history instead of some marketing blog post? https://www.mozilla.org/en-US/firefox/releases/

    Version 2 had 20 releases within 730 days, averaging one release every 36.5 days. Version 3 had 19 releases within 622 days, averaging 32.7 days per release (see the sketch after this post for the arithmetic). But those releases were unscheduled; they shipped when they were done. Now releases are on a fixed 90-day schedule, whether or not anything worthwhile is complete, plus hotfix releases whenever necessary. That's not faster, just scheduled, and the major version is incremented even when no major change is included. That's what the blog post was alluding to.

    In the before times, a bump in the major version number indicated major changes. Now it doesn't, which means sysadmins still need to treat every release as a major release even if it doesn't contain major changes, because it might, and the version number no longer says anything about whether it does. It's nothing but a marketing change, moving from "version numbering means something" to "big number go up".
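    A minimal Python sketch of that cadence arithmetic, using only the release counts and day spans quoted above (the 90-day figure is likewise the claim made in the comment, not an official Mozilla number):

        # Average days between releases for the two series cited above.
        cadences = {
            "Firefox 2.x": (20, 730),  # 20 releases over 730 days
            "Firefox 3.x": (19, 622),  # 19 releases over 622 days
        }

        for series, (releases, days) in cadences.items():
            print(f"{series}: {days / releases:.1f} days per release on average")

        # For comparison: the fixed cycle length claimed in the comment.
        fixed_cycle_days = 90
        print(f"Fixed schedule: one release every {fixed_cycle_days} days")

    This prints roughly 36.5 and 32.7 days per release for the 2.x and 3.x series, matching the figures in the comment.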
  • 817 votes
    199 posts
    682 views
    It's clear you don't really understand the wider context and how historically hard these tasks have been. I've been doing this for a decade, and the fact that these foundational models can be pretrained on unrelated things and then jump that generalization gap so easily (within reason) is amazing. You only see the end result of corporate uses in the news, but this technology is used in every aspect of science and life in general (source: I do this for many important applications).
  • 353 votes
    40 posts
    27 views
    If AI constantly refined its own output, sure, unless it hits a wall eventually or starts spewing bullshit because of some quirk of training. But I doubt it could learn to summarise better without external input, just like a compiler won't produce a more optimised version of itself without human development work.
  • Napster/BitTorrent for machine learning?

    Technology
    27 votes
    3 posts
    25 views
    What would a use case look like? I assume that the latency will make it impractical to train something that's LLM-sized. But even for something small, wouldn't a data center be more efficient?