
Large Language Model Performance Doubles Every 7 Months

Technology
  • 109 Votes
    10 Posts
    7 Views
    Yeah, I agree. It's a great starting place. Recently I needed a piece of information that I couldn't find anywhere through a regular search. ChatGPT, Claude and Gemini all gave similar answers, but it was only confirmed when I contacted the company directly, which took about 3 business days to get a reply.
  • 111 Votes
    24 Posts
    36 Views
    Ingesting all the artwork you ever created by obtaining it illegally and feeding it into my plagiarism remix machine is theft of your work, because I did not pay for it. Separately, keeping a copy of this work so I can do this repeatedly is also stealing your work. The judge ruled the first was okay but the second was not, because the first is "transformative". Which sadly tells me that the judge, despite best efforts, does not understand how a weighted matrix of tokens works: while there may be some prevention steps in place now, early models showed the tech for what it was when they regurgitated text with only minor differences in word choice here and there. Current models have layers on top to try to prevent this, but escaping those safeguards is common, and it only masks the fact that the entire model is built off the theft of others' work.
  • No Internet For 4 Hours And Now This

    Technology
    6 Votes
    14 Posts
    22 Views
    nokturne213@sopuli.xyz
    My first set I made myself. The "blackout" backing was white. The curtains themselves were blue with horses I think (I was like 8). I later used the backing with some Star Wars sheets to make new curtains.
  • 118 Votes
    34 Posts
    54 Views
    A fairer comparison would be Eliza vs ChatGPT.
  • 880 Votes
    356 Posts
    136 Views
    communist@lemmy.frozeninferno.xyz
    Is that useful for completing tasks?
  • Building a slow web

    Technology
    175 Votes
    37 Posts
    30 Views
    Realistically, you don't need security; NAT alone is enough, since the packets have nowhere to go without port forwarding. But if you really want to build front-end security, here is my plan:
    ISP bridge -> WAN port of an OpenWrt-capable router with a DSA-supported switch (that is almost all of them).
    Set all switch ports to VLAN mirroring mode, bridge the WAN and LAN sides, and run a Fail2Ban IP block list in the bridge.
    LAN port 1 -> OpenWrt running inside a Proxmox LXC (NAT lives here) -> top-of-rack switch.
    LAN port 2 -> Snort IDS.
    LAN port 3 -> combined honeypot and traffic analyzer.
    Ports 2 and 3 detect malicious internet hosts and add them to the block list (a rough sketch of that step is below). Then there are multiple other OpenWrt LXCs running many, many VPN ports as alternative gateways; I switch a LAN host's internet address by changing its default gateway.
    I run no internal VLANs, all one LAN, because convenience is more important than security in my case.
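    For the step where ports 2 and 3 feed detections back into the block list, a minimal sketch of the glue could look like the Python below. It assumes Snort writes fast-format alerts to /var/log/snort/alert_fast.txt and that the bridge consults an nftables set named inet filter blocklist; those names, and the script itself, are illustrative placeholders rather than the poster's actual setup.

    ```python
    #!/usr/bin/env python3
    # Minimal sketch, assuming Snort fast-format alerts and an nftables block set.
    # ALERT_LOG and NFT_SET are placeholder names, not taken from the post above.
    import re
    import subprocess
    import time

    ALERT_LOG = "/var/log/snort/alert_fast.txt"   # assumed alert file location
    NFT_SET = ("inet", "filter", "blocklist")     # assumed family, table, set name

    # In a fast-format alert the source address sits just before the "->".
    IP_RE = re.compile(r"(\d{1,3}(?:\.\d{1,3}){3})(?::\d+)? ->")


    def block(ip, already_blocked):
        """Add one IPv4 address to the nftables set, skipping repeats."""
        if ip in already_blocked:
            return
        subprocess.run(["nft", "add", "element", *NFT_SET, f"{{ {ip} }}"], check=False)
        already_blocked.add(ip)


    def follow(path):
        """Yield lines appended to the log file, like `tail -f`."""
        with open(path) as log:
            log.seek(0, 2)                        # only react to new alerts
            while True:
                line = log.readline()
                if line:
                    yield line
                else:
                    time.sleep(1)


    if __name__ == "__main__":
        blocked = set()
        for alert in follow(ALERT_LOG):
            match = IP_RE.search(alert)
            if match:
                block(match.group(1), blocked)
    ```

    In practice you would also want to expire old entries and whitelist your own prefixes before anything lands in the set, but the shape of the loop stays the same.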
  • Why doesn't Nvidia have more competition?

    Technology
    33 Votes
    22 Posts
    38 Views
    It's funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond. They had the resources to focus on it hard because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the potential for profitability was becoming clear, AMD was near bankrupt and hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary CUDA platform was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile crown, investing billions, roughly the equivalent of ARM's total revenue, and never managed to catch up despite having the better production process at the time. That was Intel's main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips; one of their boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because, as it turns out, the way forward, at least for now, really is the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; with increased profits they only grew bolder, making it even harder to catch up. AMD has had more money to compete with for a while now, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete, they have to be better to attract customers, and that's a very tall order against a company that never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. The chips will probably be made by China's SMIC, since they are also prevented from using advanced production in the West, most notably TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have a massive pool of talent and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on production and design too.
  • YouTube tops Disney and Netflix in TV viewing

    Technology
    215 Votes
    96 Posts
    84 Views
    "Not Interested" is just free data for them to fill out your account's advertising profile.