
Windows 11 will soon be able to describe images on your screen using AI — and it'll all be done locally

Technology
  • Navigating the Skies: Growth and Challenges in the UTM Market

    Technology
    1 vote, 1 post, 0 views
    No one has replied
  • 43 votes, 10 posts, 57 views
    Deserved it. Shouldn't have been a racist xenophobe. Hate speech and incitement to violence are not legally protected in the UK. All those far-right rioters deserve prison.
  • Could Windows and installed apps upload all my personal files?

    Technology
    1 vote, 2 posts, 21 views
    rikudou@lemmings.world
    Yes, every application has access to everything. The only exceptions are the apps that use the universal app framework (or whatever that thing is called); those need to ask for permissions. But most of the apps on your PC have full access to everything. Windows does collect and upload a lot of personal information, and it could easily upload everything on your system. The same of course applies to the apps: they have access to everything except privileged folders (which usually contain system files, not your personal data). A sketch of what that access looks like in practice follows this item.
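    To make the claim concrete, here is a minimal sketch in Python (the language and the folder names are my own illustrative assumptions, not from the thread) of how any ordinary, non-sandboxed process running as the user can enumerate personal files without triggering any permission prompt; uploading them would only take one extra network call.

    ```python
    import os
    from pathlib import Path

    # Minimal sketch: a plain, non-sandboxed desktop process enumerating the
    # user's files. There is no permission prompt for this; only filesystem
    # ACLs apply. The folder names below are illustrative assumptions.
    profile = Path(os.environ.get("USERPROFILE", Path.home()))

    for folder in ("Documents", "Pictures"):
        root = profile / folder
        if not root.is_dir():
            continue
        total = 0
        for _dirpath, _dirnames, filenames in os.walk(root):
            total += len(filenames)  # os.walk silently skips unreadable dirs
        print(f"{root}: {total} files readable with no prompt")
    ```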
  • Role of Email Deliverability Consulting in ROI

    Technology
    0 votes, 1 post, 15 views
    No one has replied
  • Google’s test turns search results into an AI-generated podcast

    Technology
    5 votes, 4 posts, 27 views
    lupusblackfur@lemmy.world
    Oh, Google... Just eviler and eviler every day. Not only is it robbing creators of any monetization from clicks on their links, it's now blatantly stealing their content for an even more efficient theft model. FFS. I can't fucking wait to complete my de-googling project and get you completely the absolute fuck out of my life. I've developed a hatred for Google that actually rivals my hatred for Apple.
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes, 22 posts, 87 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and it was quick to respond. It had the resources to focus on that market strongly because of its huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability was becoming clear, AMD was near bankruptcy and hard pressed to finance development of GPUs and datacenter compute. AMD tried the best it could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary CUDA platform was an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. It spent a decade and billions (roughly the equivalent of ARM’s total revenue) trying to take the mobile crown from ARM, and it never caught up, despite having the better production process at the time. That was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, it did so with x86 chips; one of its boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down. In fact, with increased profits it only grew bolder in its efforts, making it even harder to catch up. AMD has had more money to compete with for a while now, and it does have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, it has to be better in order to attract customers, and that’s a very tall order against a company that seems to never stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose it has to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. The chips will probably be made by China’s SMIC, because they are also prevented from using advanced production in the West, most notably TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on production and design too.
  • AI cheating surge pushes schools into chaos

    Technology
    45 votes, 25 posts, 121 views
    Sorry for the late reply; I had to sit and think on this one for a little bit. I think there would be a few things going on when designing a course to teach critical thinking, nuance, and originality, and each has its own requirements.

    For critical thinking, the main goal is to give students a toolbelt for solving various problems, and then to instill the habit of always asking "does this match the expected outcome? What was I expecting?". So courses are usually set up so students learn about a tool, practice using the tool, and then complete a culminating assignment using all the tools. Ideally, the problems students face at the end require multiple tools to solve.

    Nuance mostly comes naturally with exposure to the material from a professional: the way a mechanical engineer describes building a desk will probably differ greatly from how a fantasy author would. You can also explain definitions and industry standards, but that's really dry, so I try to teach nuance by mixing the weird edge cases into the definitions as much as possible, with jokes.

    Then for originality: I've realized I don't actually look for an original idea, but for something creative. In a classroom setting, you're usually learning new things about a subject, so a student's knowledge of that space is very limited. An idea they've never heard of may be original to them but common to an industry expert. For teaching creativity, I provide time to be creative and think, plus open-ended questions as prompts to explore ideas. My courses that require originality usually make it part of the culminating assignment at the end, where students can apply their knowledge. I'll also set aside time when students can come to me with preliminary ideas and get feedback on whether they pass the creative threshold. Not all ideas are original, but I sometimes give a bit of slack if one is creative enough.

    The amount of course overhauling needed to get around AI really depends on the material being taught. In programming, for example, you teach critical thinking by always testing your code, even with parameters that don't make sense. For example: try to add 123 + "skibbidy" and see what the program does (a minimal sketch of that exercise follows this item).
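    As a concrete version of that exercise, here is a minimal sketch in Python; the language choice is an assumption on my part, since the commenter doesn't name one.

    ```python
    # The "nonsense input" exercise described above. In Python, adding an int
    # to a str raises TypeError; the point is to predict the outcome before
    # running the code, then compare the prediction with what actually happens.
    try:
        result = 123 + "skibbidy"
        print("Got:", result)  # never reached in Python
    except TypeError as exc:
        print("TypeError:", exc)  # unsupported operand type(s) for +: 'int' and 'str'
    ```

    In a language with implicit coercion, such as JavaScript, the same expression would instead yield the string "123skibbidy", which is exactly the kind of expectation-versus-outcome mismatch the exercise is meant to surface.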
  • 0 votes, 4 posts, 25 views
    I wish the batteries were modular and interchangeable. You could just pull into a station, remove the spent battery, and replace it with a full one; the spent one would then be recharged and stored at the station for the next user to swap in. You could even bring a few extras in the trunk for a long trip!