
‘If I switch it off, my girlfriend might think I’m cheating’: inside the rise of couples location sharing

Technology
  • 485 votes
    101 posts
    608 views
    dyskolos@lemmy.zip
    I'd sniff a line of that hopium too, but I just don't see it becoming available in the foreseeable future. At least not at an affordable price.
  • 586 votes
    100 posts
    558 views
    B
    No, LCOE is an aggregated sum of all the cash flows, with the proper discount rates applied based on when each cash flow happens, complete with the cost of borrowing (that is, interest) and the changes in prices (that is, inflation). The rates charged to ratepayers (approved by state PUCs) are going to go up over time with inflation, but the effect of that on the overall economics will also be blunted by the time value of money and the interest paid on the up-front costs in the meantime. When you have to pay up front for the construction of a power plant, you have to pay interest on those borrowed funds for the entire life cycle, so steadily increasing prices over time are part of the overall cost modeling.
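The discounting described above can be sketched numerically. This is a toy illustration of the LCOE idea only, with made-up numbers (a hypothetical plant, not any real project's figures): each year's costs and each year's energy output are discounted back to present value, then divided.

```python
# Toy LCOE sketch: present value of all costs divided by present value
# of all energy delivered. All figures below are invented for illustration.

def lcoe(capital_cost, annual_cost, annual_mwh, years, discount_rate):
    """Levelized cost of energy in $/MWh: discounted costs / discounted output."""
    costs = float(capital_cost)  # paid up front, year 0, so not discounted
    energy = 0.0
    for t in range(1, years + 1):
        df = 1.0 / (1.0 + discount_rate) ** t  # discount factor for year t
        costs += annual_cost * df              # O&M paid in year t
        energy += annual_mwh * df              # energy delivered in year t
    return costs / energy

# Hypothetical example: $1M plant, $20k/yr O&M, 5,000 MWh/yr, 30 years, 7% rate
print(round(lcoe(1_000_000, 20_000, 5_000, 30, 0.07), 2))
```

Note how the up-front capital is the one cash flow that is never discounted, which is why a higher discount rate (cost of borrowing) raises the LCOE of capital-heavy plants: the future energy in the denominator shrinks while the capital in the numerator does not.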
  • 616 votes
    254 posts
    2k views
    N
    That’s a very emphatic restatement of your initial claim. I can’t help but notice that, for all the fancy formatting, that wall of text doesn’t contain a single line which actually defines the difference between “learning” and “statistical optimization”. It just repeats the claim that they are different without supporting it in any way. Nothing in there precludes the alternative hypothesis: that human learning is entirely (or almost entirely) an emergent property of “statistical optimization”. Without some definition of what the difference would be, we can’t even theorize a test.
  • Science and Technology News and Commentary: Aardvark Daily

    Technology
    7 votes
    2 posts
    22 views
    I
    What are you on about with this? The last news post was in 2013?
  • Copy Table in Excel and Paste as a Markdown Table

    Technology
    23 votes
    2 posts
    24 views
    ptz@dubvee.org
    That's based on https://github.com/jonmagic/copy-excel-paste-markdown. It would be awesome to see some Lemmy clients incorporate that. I've had it requested but haven't had a chance to really dig into it yet.
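The core trick behind the linked tool is small: Excel puts tab-separated text on the clipboard, and each tab-separated line maps to one Markdown table row. This is a simplified stand-in to show the idea, not the actual library's code (which also handles quoting and alignment):

```python
# Rough sketch of Excel-clipboard-to-Markdown conversion.
# Excel copies cells as tab-separated lines; we emit a Markdown table.

def tsv_to_markdown(tsv: str) -> str:
    rows = [line.split("\t") for line in tsv.strip().splitlines()]
    header, *body = rows
    out = ["| " + " | ".join(header) + " |"]
    # Markdown requires a delimiter row between header and body
    out.append("| " + " | ".join("---" for _ in header) + " |")
    for row in body:
        out.append("| " + " | ".join(row) + " |")
    return "\n".join(out)

print(tsv_to_markdown("Name\tScore\nAda\t95\nGrace\t90"))
```

The real project also strips embedded newlines inside quoted cells, which is where most of its complexity lives.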
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes
    22 posts
    88 views
    B
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and it was quick to respond. It had the resources to focus on the opportunity strongly because of its huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability began to show clearly, AMD was near bankruptcy and very hard pressed to finance development in GPUs and datacenter compute. AMD tried the best it could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary CUDA development system was already an established standard that was very hard to penetrate.

    Intel simply fumbled the ball from start to finish. After a decade of trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM’s total revenue), it never managed to catch up despite having the better production process at the time. That was Intel’s main focus, and Intel believed GPUs would never be more than a niche product. So when Intel tried to compete on datacenter compute, it did so with x86 chips; one of its boldest efforts was a monstrous cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD. Yet despite the lack of competition, Nvidia did not slow down; with increased profits it only grew bolder, making it even harder to catch up.

    Now AMD has had more money to compete for a while, and it does have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there, so to really compete AMD has to be better in order to attract customers. That’s a very tall order against an Nvidia that simply never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose it has to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. The chips will probably be made by China’s SMIC, since they are also barred from advanced production in the West, most notably at TSMC. China will prevail, because this has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
  • 1 vote
    4 posts
    32 views
    N
    That's probably not true. I imagine it was someone trying to harm the guy. A hilarious prank.
  • 24 votes
    2 posts
    24 views
    toastedravioli@midwest.social
    I'm all for making the traditional market more efficient and transparent, if blockchain can accommodate that, so long as we can also make crypto more like the traditional market. At least in terms of criminalizing things that would obviously be illegal to do with securities.