White House unveils sweeping plan to “win” global AI race through deregulation

Technology
  • The Decline of Usability: Revisited | datagubbe.se

    Technology
    67 votes · 8 posts · 43 views
    I blame the 00s-and-10s idea that computer UIs should have some "Zen", where "Zen" means doing things wrong while arrogantly telling users "you don't understand it". It's associated with Steve Jobs, but TBH Google as well. There's a related idea too: "you dummy talking about ergonomics can't be smarter than this big respectable corporation popping out stylish unusable bullshit". So what you get is:

    - A pretense of wisdom and taste under which crowd fashion is masked, and an almost aggressive preference for authority over people who might actually have some wisdom and taste from being genuinely interested in the subject.
    - Blind trust in whatever tech authority you chose for yourself. Remember, in the 00s it was still perceived as if everyone working on anything connected to computers was as cool as an aerospace or naval engineer, some kind of elite, including those making user applications.
    - An objective flaw (or upside) of the old, normal UIs: they are boring. That's why UIs in video games and in fashionable chat applications (like ICQ and Skype), never mind video and audio players, have always been non-standard.

    I think the solution would be per-application theming, not breaking paradigms; again, like ICQ, old Skype, and video games. I prefer it when boredom is fought with different applications having different icons and colors while the UI paradigm stays the same. I think there was a LOTR-themed IE browser, which I used (ok, not really, I used Opera) alongside ICQ, QuickTime player, and BitComet; all of them had a standard paradigm and a non-standard look.
  • I Counted All of the Yurts in Mongolia Using Machine Learning

    Technology
    17 votes · 9 posts · 54 views
    I'd say that when there's a policy and its goals aren't reached, that's a policy failure. If people don't like the policy, that's an issue, but it's a separate issue. It doesn't seem likely that people prefer living in tents, though.

    But to be fair, the government may be doing the best it can. Mongolia is ranked a "Flawed Democracy" by The Economist Democracy Index. That's really good, I'd say, considering the circumstances; it places slightly ahead of Argentina and Hungary.

    OP has this to say: Due to the large number of people moving to urban locations, it has been difficult for the government to build the infrastructure needed for them. The informal settlements that grew from this difficulty are now known as ger districts. There have been many efforts to formalize and develop these areas. The Law on Allocation of Land to Mongolian Citizens for Ownership, passed in 2002, allowed existing ger district residents to formalize the land they had settled, and allowed others to receive land from the government in the future. Along with the privatization of land, the Mongolian government has been pushing for the development of ger districts into areas with housing blocks connected to utilities. The plan for this was published in 2014 as the Ulaanbaatar 2020 Master Plan and Development Approaches for 2030. Although it has been slow (Choi and Enkhbat 7), progress is being made on building housing blocks in ger districts. Residents of ger districts sell or exchange their plots to developers, who then build housing blocks on them. Often this is in exchange for an apartment in the building, and often the value of the apartment is less than that of the land they originally had (Choi and Enkhbat 15).

    Based on what I've read about the ger districts, they have been around since at least the 1970s, and progress on developing them has been slow. When ineffective policy results in a large chunk of the populace generationally living in yurts on the outskirts of urban areas, it's clear that there is failure.

    Choi, Mack Joong, and Urandulguun Enkhbat. "Distributional Effects of Ger Area Redevelopment in Ulaanbaatar, Mongolia." International Journal of Urban Sciences, vol. 24, no. 1, Jan. 2020, pp. 50–68. https://doi.org/10.1080/12265934.2019.1571433.
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes · 22 posts · 88 views
    It's funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability was starting to show clearly, AMD was near bankrupt and very hard pressed to finance development of GPUs and datacenter compute. AMD really tried the best they could and were moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development platform CUDA was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile throne, investing billions, roughly the equivalent of ARM's total revenue, and never managed to catch up despite having the better production process at the time. That was Intel's main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips; one of their boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU (see the sketch below), which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down. In fact, with increased profits they only grew bolder in their efforts, making it even harder to catch up. Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better just to attract customers, which is a very tall order against a company that seemingly never stops progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. Their chips will probably be made by China's SMIC, because they are also barred from advanced production in the West, most notably at TSMC. China will prevail, because this has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on production and design too.
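    To make the "massively parallel" point concrete, here is a minimal CUDA vector-add sketch (an illustration of the programming model, not code from the article or this thread): each output element is computed by its own GPU thread. It also hints at why the CUDA moat matters: a kernel like this builds with Nvidia's toolchain, and moving it to AMD hardware means porting it to HIP/ROCm, which is exactly the switching cost that keeps CUDA entrenched.

        // Minimal sketch of the massively parallel GPU model described above:
        // a CUDA vector add where each GPU thread computes one output element.
        // Illustrative only; requires an Nvidia GPU and the CUDA toolkit.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x; // global thread index
            if (i < n) c[i] = a[i] + b[i];                 // one element per thread
        }

        int main() {
            const int n = 1 << 20;                    // ~1M elements
            const size_t bytes = n * sizeof(float);
            float *a, *b, *c;
            cudaMallocManaged(&a, bytes);             // unified memory, for brevity
            cudaMallocManaged(&b, bytes);
            cudaMallocManaged(&c, bytes);
            for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

            int threads = 256;
            int blocks = (n + threads - 1) / threads; // round up to cover all elements
            vectorAdd<<<blocks, threads>>>(a, b, c, n);
            cudaDeviceSynchronize();                  // wait for the GPU to finish

            printf("c[0] = %f\n", c[0]);              // expect 3.000000
            cudaFree(a); cudaFree(b); cudaFree(c);
            return 0;
        }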
  • Everyone Is Cheating Their Way Through College

    Technology
    170 votes · 23 posts · 115 views
    I can see this for essay writing: prior to AI, people would use prompts and templates on the same exact subject and work from there. And we hear about the odd situation where someone hired another person to do all their writing all the way through grad school (which is just as bad as ChatGPT); you will get caught in grad school or during your job interview. It might be different for specific questions in STEM, where the answer is more abstract.