
Next-Gen Brain Implants Offer New Hope for Depression: AI and real-time neural feedback could transform treatments

Technology
  • I am not depressed, but I will never get a brain implant for any reason. The brain is the final frontier of privacy, it is the one place I am free. If that is taken away I am no longer truly autonomous, I am no longer truly myself.

    I understand this is how older generations feel about lots of things, like smartphones, which I am writing this from, and I understand how stupid it sounds to say "but this is different!", but like... really. This is different. Whatever scale smartphones, drivers licenses, personalized ads, the internet, smart home speakers.... whatever scale all these things lie on in terms of "panopticon-ness", a brain implant is so exponentially further along that scale as to make all the others vanish to nothingness. You can't top a brain implant. A brain implant is a fundamentally unspeakable horror which would inevitably be used to subjugate entire peoples in a way so systematically flawless as to be almost irreversible.

    This is how it starts. First it will be used for undeniable goods, curing depression, psychological ailments, anxiety, and so on. Next thing you know it'll be an optional way to pay your check at restaurants, file your taxes, read a recipe - convenience. Then it will be the main way to do those things, and then suddenly it will be the only way to do those things. And once you have no choice but to use a brain implant to function in society, you'll have no choice but to accept "thought analytics" being reported to your government and corporations. No benefit is worth a brain implant, don't even think about it (but luckily, I can't tell if you do).

    "You can chain me, you can torture me, you can even destroy this body, but you will never imprison my mind."

    -Mahatma Gandhi

  • No no no no no

    AI can't even do a Google search right.

    Keep that shit outta my head

    Don't worry, I am sure you will change your mind after you get the implants.

  • This post did not contain any content.

    So, their AI is confidently wrong over 60% of the time, and they thought implanting it into people's brains was a good idea?? Wtf???

  • I am not depressed, but I will never get a brain implant for any reason. The brain is the final frontier of privacy, it is the one place I am free. If that is taken away I am no longer truly autonomous, I am no longer truly myself.

    I understand this is how older generations feel about lots of things, like smartphones, which I am writing this from, and I understand how stupid it sounds to say "but this is different!", but like... really. This is different. Whatever scale smartphones, drivers licenses, personalized ads, the internet, smart home speakers.... whatever scale all these things lie on in terms of "panopticon-ness", a brain implant is so exponentially further along that scale as to make all the others vanish to nothingness. You can't top a brain implant. A brain implant is a fundamentally unspeakable horror which would inevitably be used to subjugate entire peoples in a way so systematically flawless as to be almost irreversible.

    This is how it starts. First it will be used for undeniable goods, curing depression, psychological ailments, anxiety, and so on. Next thing you know it'll be an optional way to pay your check at restaurants, file your taxes, read a recipe - convenience. Then it will be the main way to do those things, and then suddenly it will be the only way to do those things. And once you have no choice but to use a brain implant to function in society, you'll have no choice but to accept "thought analytics" being reported to your government and corporations. No benefit is worth a brain implant, don't even think about it (but luckily, I can't tell if you do).

    "What am I without my legs?" "What am I without my eyes?" "What am I without my arms?"

    What counts as "the real me" has been evolving for decades, if not centuries. I'm not volunteering for brain implants, but I'm not writing off the idea sometime in the future. As for AI, this is going to be more of the ML variety, not the LLM variety. Think more of "neurochemical levels have been trending in a certain direction for too long, release opposing neurochemicals to halt the spiral" and less of a little voice inside your head giving quite possibly incorrect answers to whatever you're thinking of.

    This is absolutely risky stuff, but less risky than recurring electroshock therapy? Hard for me to say. Note that the article is from nearly 2 decades ago, but there are articles in the news from just the last couple weeks.
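
    A minimal sketch of what "neurochemical levels have been trending in a certain direction for too long, release opposing neurochemicals to halt the spiral" could look like as a control loop. Everything here is hypothetical and purely illustrative: the window length, the threshold, and the read_biomarker / deliver_stimulation placeholders stand in for whatever a real device would do; this is not any actual implant's API.

    ```python
    # Illustrative closed-loop sketch (all names and numbers are made up):
    # watch a biomarker for a sustained downward trend and trigger a small
    # counter-stimulus once the drift has persisted for a whole window.
    from collections import deque

    WINDOW = 60              # number of recent samples to consider
    TREND_THRESHOLD = -0.5   # cumulative drift that counts as "trending too long"

    def read_biomarker() -> float:
        """Placeholder for a mood-related neural signal reading."""
        raise NotImplementedError

    def deliver_stimulation(intensity: float) -> None:
        """Placeholder for the device's corrective output."""
        raise NotImplementedError

    def control_loop() -> None:
        history = deque(maxlen=WINDOW)
        while True:
            history.append(read_biomarker())
            if len(history) == WINDOW:
                drift = history[-1] - history[0]   # crude trend estimate
                if drift < TREND_THRESHOLD:
                    # The signal has been sliding for the whole window:
                    # apply a proportional corrective stimulus, capped at 1.0.
                    deliver_stimulation(intensity=min(1.0, abs(drift)))
    ```

    The point is only that this kind of "AI" is a thresholded feedback loop over a measured signal, not a language model generating advice inside your head.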

  • This seems interesting, I'll read it fully after work if I don't forget.

    Something has me convinced I'm depressed, but the only time I ever had the possibility to look for help they sort of just worked me towards the door and cut me off asap.

    But they ended up giving me some sort of antipsychotic medication, which definitely allowed me to get back on my feet at the time. (Shit was dark: I fell into a hole of COVID, homelessness and unemployment all at once, together with my wife, and reached a point where I struggled so much I couldn't even get my ass to a job interview.)

    But I still don't know what the cause of my struggles is, only that they've been around as long as I can remember. Some form of psychotic whatever wouldn't surprise me either, looking at my mom and what she did. But from what I know (which isn't a lot, obviously) it seems more like depression.

    I likely had undiagnosed depression for decades before I got treatment, from a GP, no less, after being dismissed by a psychiatrist. If you have concerns about your health, keep trying to get help, as long as you're able.

  • "What am I without my legs?" "What am I without my eyes?" "What am I without my arms?"

    What counts as "the real me" has been evolving for decades, if not centuries. I'm not volunteering for brain implants, but I'm not writing off the idea sometime in the future. As for AI, this is going to be more of the ML variety, not the LLM variety. Think more of "neurochemical levels have been trending in a certain direction for too long, release opposing neurochemicals to halt the spiral" and less of a little voice inside your head giving quite possibly incorrect answers to whatever you're thinking of.

    This is absolutely risky stuff, but less risky than recurring electroshock therapy? Hard for me to say. Note that the article is from nearly 2 decades ago, but there are articles in the news from just the last couple weeks.

    Those are some good nuances that definitely required a nuanced response and forced me to refine my thinking, thank you! I'm actually not claiming that the brain is the sole boundary of the real me; rather, it is the majority of me, but my body is a contributor. The real me does change as my body changes, just in less meaningful ways. Likewise, some changes in the brain change the real me more than others. However, regardless of what constitutes the real me or not (and believe me, the philosophical rabbit hole there is one I love to explore), in this case I'm really just talking about the straightforward, immediate implications of a brain implant on my privacy. An arm implant would also be quite bad in this regard, but a brain implant is clearly worse.

    There have already been systems that can display very rough, garbled images of what people are thinking of. I'm less worried about an implant that tells me what to do or controls me directly, and more worried about an implant that has a pretty accurate picture of my thoughts and reports it to authorities. It's surely possible to build a system that can approximate positive or negative mood states, and in combination this is very dangerous. If the government can tell that I'm happy when I think about Luigi Mangione, then they can respond to that information however they want. Eventually, in the same way that I am conditioned by the panopticon to stop at a stop sign, even in the middle of a desolate desert where I can see for miles around that there are no cars, no police, no cameras - no anything that could possibly make a difference to me running the stop sign - the system will similarly condition automatic compliance in thoughts themselves. That is, compliance is brought about not by any actual exertion of power or force, but merely by the omnipresent possibility of its exertion.

    (For this we only need moderately complex brain implants, not sophisticated ones that actually control us physiologically.)
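
    To make the "approximate positive or negative mood states" worry concrete: the machinery needed for such an estimator is generic, off-the-shelf classification. The sketch below uses purely synthetic stand-in data and made-up feature counts; it reflects no real implant, dataset, or decoding method.

    ```python
    # Illustrative only: a generic binary classifier over neural feature vectors,
    # labelled "negative" vs "positive" affect. All data here is random noise.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Stand-in data: 200 recordings x 16 arbitrary neural features.
    X = rng.normal(size=(200, 16))
    y = rng.integers(0, 2, size=200)   # 0 = negative affect, 1 = positive affect

    model = LogisticRegression(max_iter=1000).fit(X, y)

    new_recording = rng.normal(size=(1, 16))
    label = model.predict(new_recording)[0]
    print("estimated affect:", "positive" if label else "negative")
    ```

    The classifier itself is trivial; the hard (and worrying) part is getting access to the signal in the first place.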

  • Those are some good nuances that definitely required a nuanced response and forced me to refine my thinking, thank you! I'm actually not claiming that the brain is the sole boundary of the real me; rather, it is the majority of me, but my body is a contributor. The real me does change as my body changes, just in less meaningful ways. Likewise, some changes in the brain change the real me more than others. However, regardless of what constitutes the real me or not (and believe me, the philosophical rabbit hole there is one I love to explore), in this case I'm really just talking about the straightforward, immediate implications of a brain implant on my privacy. An arm implant would also be quite bad in this regard, but a brain implant is clearly worse.

    There have already been systems that can display very rough, garbled images of what people are thinking of. I'm less worried about an implant that tells me what to do or controls me directly, and more worried about an implant that has a pretty accurate picture of my thoughts and reports it to authorities. It's surely possible to build a system that can approximate positive or negative mood states, and in combination this is very dangerous. If the government can tell that I'm happy when I think about Luigi Mangione, then they can respond to that information however they want. Eventually, in the same way that I am conditioned by the panopticon to stop at a stop sign, even in the middle of a desolate desert where I can see for miles around that there are no cars, no police, no cameras - no anything that could possibly make a difference to me running the stop sign - the system will similarly condition automatic compliance in thoughts themselves. That is, compliance is brought about not by any actual exertion of power or force, but merely by the omnipresent possibility of its exertion.

    (For this we only need moderately complex brain implants, not sophisticated ones that actually control us physiologically.)

    I absolutely think that privacy within your own mind should be inviolable (trusting corporations and even government to agree is laughable). Iain Banks' Culture series explores some of these implications, as well as who should be in control of your mental state. It's messy and hard, and is one of the reasons I currently wouldn't get a brain implant. I might change my mind if I had ALS, for instance.

  • So, their AI is confidently wrong over 60% of the time, and they thought implanting it into people's brains was a good idea?? Wtf???

    All LLMs are AI, but not all AI is an LLM.

  • Tell me you don’t understand what depression is without telling me you don’t understand what depression is. You can be depressed while living for free on a beach with no responsibilities. To suggest you can fix everyone’s depression with external changes is the height of “just go outside and you’ll feel better.”

  • Tell me you don’t understand what depression is without telling me you don’t understand what depression is. You can be depressed while living for free on a beach with no responsibilities. To suggest you can fix everyone’s depression with external changes is the height of “just go outside and you’ll feel better.”

    As someone who is literally living where others go on holiday, while depressed, let me tell you: my depression is very much a response to the world being a rotting shithole. I am not sad because my life sucks but because so many others are suffering and I feel powerless to change it. The narrative of 'chemical imbalance' is a very reductive and misleading one.

    The feeling of powerlessness and disconnect also points to the cure I find for myself. Instead of implanting experimental BS inventions into my brain, I try to be a force of connection, community and hope for others. There are very few things I can do as a single tiny person, but in these very small things lies the power of change for the better.

  • Depression replaced with horror?

    I'll take it.

    Horror might be more entertaining than depression, but the sheer idea of letting some techbro implant shit in my brain is so ridiculous, I'd probably try DIY lobotomy before I consider the AI option.

  • Chronic depression since a traumatic event trigger in 1989 here. They can shove those chips up their own arse.

    Depressed ever since puberty when I realised that Hollywood isn't real life.

  • Horror might be more entertaining than depression, but the sheer idea of letting some techbro implant shit in my brain is so ridiculous, I'd probably try DIY lobotomy before I consider the AI option.

    Lobotomy is a bit extreme. Try Trepanning first.

    Trepanning was sometimes performed on people who were behaving in a manner that was considered abnormal. In some ancient societies it was believed this released the evil spirits that were to blame.

  • Imagine withholding a medicine from a sick person, telling them it's the world that's broken. That's some Mother Teresa level evil.

    Wanting full control over the sick is also some Mother Teresa evil shit. It’s bad enough that life-saving medication is gate-kept by patents and pharmaceutical companies thriving on suffering (oh look, there she is again) but now people that suffer should give up access to what makes them them, their entire personhood, to some tech-bro ingrate? Is that truly the best option?

    I’d rather die.

  • This post did not contain any content.

    Nope, I'd definitely kill myself before letting an AI fuck with my brain.

  • Nope, I'd definitely kill myself before letting an AI fuck with my brain.

    In a sense, AI is already fucking with everyone's brain when it comes to mass-produced ads and propaganda.

  • This post did not contain any content.

    This sounds like an absolute nightmare. Listen to the techbro leaders talk about the general population, and imagine them owning what manages your brain... I mean electrically, as opposed to indirectly.

  • I likely had undiagnosed depression for decades before I got treatment, from a GP, no less, after being dismissed by a psychiatrist. If you have concerns about your health, keep trying to get help, as long as you're able.

    It's been something I've thought about a lot, but at the moment it feels manageable to the point that other things get priority.

  • As someone who is literally living where others go on holiday, while depressed, let me tell you: my depression is very much a response to the world being a rotting shithole. I am not sad because my life sucks but because so many others are suffering and I feel powerless to change it. The narrative of 'chemical imbalance' is a very reductive and misleading one.

    The feeling of powerlessness and disconnect also points to the cure I find for myself. Instead of implanting experimental BS inventions into my brain, I try to be a force of connection, community and hope for others. There are very few things I can do as a single tiny person, but in these very small things lies the power of change for the better.

    To be clear, I’m not claiming all depression can’t be influenced by external factors. And any BCI claiming to solve all depression should be treated like the plague. But there are types of depression that are solely a chemical imbalance and cannot be corrected through external means. And yeah, the “all depression is just a chemical imbalance” narrative is horribly flawed. But to claim none of them are a chemical imbalance is just as bad.

    Our current treatments for such types of depression are essentially still in the stone ages. Throw something at it, see what happens, adjust as needed. If a BCI can work around such a situation and offer a direct and targeted approach to the issue, and it goes through extensive testing, I don’t see why this wouldn’t be a good potential solution.

  • To be clear, I’m not claiming all depression can’t be influenced by external factors. And any BCI claiming to solve all depression should be treated like the plague. But there are types of depression that are solely a chemical imbalance and cannot be corrected through external means. And yeah, the “all depression is just a chemical imbalance” narrative is horribly flawed. But to claim none of them are a chemical imbalance is just as bad.

    Our current treatments for such types of depression are essentially still in the stone ages. Throw something at it, see what happens, adjust as needed. If a BCI can work around such a situation and offer a direct and targeted approach to the issue, and it goes through extensive testing, I don’t see why this wouldn’t be a good potential solution.

    Our current treatments for such types of depression are essentially still in the stone ages. Throw something at it, see what happens, adjust as needed.

    I know, and I guess watching a loved one being slowly destroyed by the trial and error that is 'modern' medication made me want to never consider it, no matter how bad I felt - so this AI thing seems to be an even more dangerous trial-and-error method, because it seems even more invasive and less tested than the medication that's available now.

    On the other hand, I've found self-medication with plant medicine (yes, it's weed, weed, and more weed, but also quite a few other herbs I collect myself) quite efficient and safe. I've managed to keep myself going for a few bad years and have now reached the point where I went off it cold turkey - something my loved one never managed to do once he was hooked on the meds. All done on my own terms, no doctor pretending they know better than me, giving myself the time I needed.

    So that's a true stone age method for you, and given that our bodies still work the same way they did in the stone age, I feel it might be safer than any novelty they've come up with in the last few decades. That's probably a controversial take, and I don't expect this to work for everybody (you need to have lots of time, to be able to afford to rest and relax, and to have access to unlimited amounts of plant medicine).
