
Adblockers stop publishers serving ads to (or even seeing) 1bn web users - Press Gazette

Technology
322 195 57
  • 738 votes
    67 comments
    354 views
    K
    Those have always been the two big problems with AI. Biases in the training data, intentional or not, will always bias the output. And AI is incapable of saying "I do not have sufficient training on this subject, or reliable sources for it, to give you a confident answer". It will always give you its best guess, even if it is completely hallucinating much of the data. The only way to identify the hallucinations, if it isn't just saying something absurd on its face, is to do independent research to verify them, at which point you may as well have just researched it yourself in the first place.

    AI is a tool, and it can be a very powerful tool with the right training and use cases. For example, I use it as a software engineer to help me parse error codes when googling isn't working, or to give me code examples for modules I've never used. There is no small number of times it has been completely wrong, but in my particular use case that is pretty easy to confirm very quickly: the code either works as expected or it doesn't, and code is always tested before releasing it anyway.

    In research, it is great at helping you find a relevant source across the internet or in a specific database. It is usually very good at summarizing a source so you can get a quick idea of it before diving into dozens of pages. It CAN be good at helping you write your own papers in a LIMITED capacity, such as cleaning up your writing to make it clearer, correctly formatting your bibliography (with actual sources you provide or at least verify), etc.

    But you have to remember that it doesn't "know" anything at all. It isn't sentient, intelligent, thoughtful, or any of the other personifications placed on AI. None of the information it gives you is trustworthy without verification. It can and will fabricate entire studies that do not exist, even while attributing them to real researchers. It can mix unreliable information in with reliable information because there is no difference to it. Put simply, it is not a reliable source of information... ever. Make sure you understand that.
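    A minimal sketch of that "easy to confirm very quickly" step (not from the original comment; the helper name, the log format, and the regex are assumptions for illustration). The idea is that an AI-suggested snippet gets a couple of cheap checks before it is trusted:

```python
import re

# Hypothetical helper, written the way an AI assistant might suggest it:
# pull the numeric status code out of a log line like "ERROR 504: upstream timed out".
def parse_error_code(log_line: str) -> int | None:
    match = re.search(r"ERROR\s+(\d+)", log_line)
    return int(match.group(1)) if match else None

# Quick sanity checks before trusting the suggestion; if the generated code
# is hallucinated or subtly wrong, these fail immediately and cheaply.
assert parse_error_code("ERROR 504: upstream timed out") == 504
assert parse_error_code("INFO: request completed") is None
print("suggested helper behaves as expected on these inputs")
```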
  • 70 votes
    5 comments
    46 views
    rimu@piefed.social
    Yep. It is a design choice to offer a news feed that combines verified news sources with tankie memes, interspersed with AI-generated photos. I've really tried to provide tools to tame the meme flood and have put them into effect on https://PieFed.social - compare that with the front page (or All feed) of any Lemmy instance (or most PieFed instances, to be fair). A gen-AI filter is coming.
  • This Is Why Tesla’s Robotaxi Launch Needed Human Babysitters

    Technology
    114 votes
    26 comments
    129 views
    H
    Karel es hone
  • 61 votes
    11 comments
    59 views
    K
    If you use LLMs as they should be used, i.e. as autocomplete, they're helpful. Classic autocomplete can't see me type "import" and correctly guess that I want to import a file that I just created, but Copilot can. You shouldn't expect it to understand code, but it can type more quickly than you and plug the right things in more often than not.
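    A small illustration of that point (hypothetical file and function names; the completion behaviour described is typical rather than guaranteed). A freshly created local module is something a classic completer knows nothing about, but an LLM completer with the file in its context will usually propose the full import line:

```python
from pathlib import Path

# Simulate "a file that I just created" so the example is self-contained:
# a tiny local module with one function.
module_path = Path(__file__).with_name("metrics.py")
module_path.write_text(
    "def rolling_mean(values, window):\n"
    "    return [sum(values[i:i+window]) / window\n"
    "            for i in range(len(values) - window + 1)]\n"
)

# Typing "from me" gives a classic completer nothing useful here, while an
# LLM-based completer that has seen metrics.py will typically suggest this
# whole line; it still gets reviewed and run like any other code.
from metrics import rolling_mean

print(rolling_mean([1.0, 2.0, 3.0, 4.0], 2))  # [1.5, 2.5, 3.5]
```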
  • 202 votes
    15 comments
    79 views
    A
    If you are taking business advice from ChatGPT that includes purchasing a ChatGPT subscription, or can't be bothered to look up how it works beforehand, then your business is probably going to fail.
  • 11 votes
    19 comments
    69 views
    E
    No, just laminated ones. Closed at one end. Easy enough to make or buy. You can even improvise the propellant.
  • The mystery of $MELANIA

    Technology
    25 votes
    13 comments
    72 views
    geekwithsoul@lemm.ee
    Archive
  • 298 votes
    9 comments
    53 views
    kolanaki@pawb.social
    Internet access should be a utility like electricity and water until all three, along with housing, medicine, and food, can be free to all.