
Reddit users in the UK must now upload selfies to access NSFW subreddits

Technology
  • u/spez was the lead moderator of r/jailbait, and when he was caught, he got rid of mod transparency. Ghislaine Maxwell was likely a lead moderator of news subreddits as well (u/MaxwellHill). Reddit has always been compromised.

    I'm not defending Spez, I think he's a piece of shit and he did edit other users' comments that were critical of him, which is fucked up, but I don't think he was actually involved with that sub. It was possible to appoint mods without their knowledge or consent, and since he's a huge target, someone probably added him as a joke.

  • Hm, I'm going to need some software engineers to critique an idea I have that could at least partially solve the fears people have about their personal details being tied to their porn habits.

    The system will be called the Adult Content Verification System (or Wank Card if you want to be funny). It's a physical card issued by the government with a unique key printed on it. Those cards are then sold by any shop that has an alcohol license (premises or personal). You go in, show your ID to the clerk, buy the card. That card is proof that you're over 18, but it is not directly tied to you; you just have to be over 18 to buy it. The punishment for selling a Wank Card to someone under the age of 18 is the same as if you sold alcohol to someone under 18.

    When you go to the porn site, it checks whether you're from the UK and whether you have a key associated with your account. If not, it asks for one. You provide the key to the site, the site makes an API call to https://wankcard.gov.uk/api/verify with the site's API key (freely generated, though you could even make the API public if you want) and the key on the card, gets a response saying "Yep! This is a valid key!" and hey presto, free to wank and nobody knows it's you. If you don't have an account, the verification would have to be tied to a cookie or something that expires after a while, for all you anonymous people.

    As a result, you can prove that you're over 18 (because you have the card), and some company over in San Francisco doesn't get your personal data, because it's never actually recorded anywhere. All anyone has is keys, and while yes, the government could record "this key was used to verify on this site", they'd have to know which shop the key was bought from, who sold it, and who bought it, which is a lot more difficult unless the shopkeeper keeps records of everyone he's ever sold to.

    So... Good idea? Bad idea? Better than the current approach anyway, I think.

    This would be better than most of the crap being proposed or implemented.

    But, since the keys are presumably reusable, they'll get borrowed and shared among minors almost immediately.

    There could be some "Netflix account sharing" style work to deter that, of course.
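A minimal, self-contained sketch of what the proposed verification service's logic might look like, with the card registry mocked as an in-memory dict. The https://wankcard.gov.uk/api/verify endpoint above is the commenter's hypothetical, and every name here is illustrative; it also includes the deactivation safeguard for leaked keys suggested further down the thread.

```python
# Sketch of the hypothetical Wank Card registry: issue, verify, and deactivate keys.
import secrets

# key -> record: which shop it was allocated to and whether it is still active.
CARD_REGISTRY: dict[str, dict] = {}

def issue_card(shop_id: str) -> str:
    """Mint a card key and remember which licensed shop it was allocated to."""
    key = secrets.token_urlsafe(16)
    CARD_REGISTRY[key] = {"shop_id": shop_id, "active": True}
    return key

def verify(card_key: str) -> bool:
    """What the /api/verify call would answer: is this a valid, active key?"""
    record = CARD_REGISTRY.get(card_key)
    return bool(record and record["active"])

def deactivate(card_key: str) -> str | None:
    """Kill a leaked key and return the selling shop so it can be investigated."""
    record = CARD_REGISTRY.get(card_key)
    if record is None:
        return None
    record["active"] = False
    return record["shop_id"]

key = issue_card("shop-042")
assert verify(key)        # site gets "Yep! This is a valid key!"
deactivate(key)           # key found posted online: pulled, selling shop identifiable
assert not verify(key)
```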

  • The solution to all of this “think of the children” stuff is that devices owned/used by children should have to be registered as a child’s device, which would enable certain content blockers.

    Forcing adults to verify their identity, rather than simply activating some broad based restrictions on devices being purchased for child use, is a waste of time. Kids will still find workarounds. Adult privacy will be compromised.

    It's also an easily enforceable policy to require registration of children's devices. You can hold the parents to compliance. You can hold the carriers to compliance. It's truly the simplest way to keep kids from accessing porn without having to mess with adult use of the internet whatsoever.

    Your solution is worse.

    As is, it is the responsibility of the content provider to make sure that they are distributing only to people who are legally allowed to have it.

    With age-verification the user has to prove that they are allowed to access the content, then the site can distribute it to them.

    Your approach is to distribute the content by default and only deny it to ChildDevices. In order for this to work at all, you have to mandate that children can only use ChildDevices. This is soooo much worse than simply requiring that adults who want to see certain content have to prove that they can legally access it. If adults have reservations about providing ID for pornography, losing access to that content seems like a much smaller harm than denying children Internet access. (Although I'm sure that Lemmings would disagree, for obvious reasons.)

  • But, since the keys are presumably reusable, they'll get borrowed and shared among minors almost immediately.

    Yeah I did consider that people are going to share keys, but people are going to share accounts too so that's always going to happen. The best thing you can do is stick some safeguards on the keys where if a key is found online, it can be deactivated and potentially investigated since you can tell which shop sold the key. If there's a shop out there just giving cards away to minors, well they're in for a world of trouble.

    Under the Licensing Act 2003, it's illegal to sell alcohol to an adult if you reasonably suspect that they will then be giving that alcohol to a minor. You can assume the same would apply to people selling Wank Cards.

  • It'll almost certainly be an AI model doing it.

    It'll almost certainly be an AI model backed by 1000s of "trainers" in 3rd world countries doing it, but only until the model is fully trained.

  • I keep thinking about some of the RPs I've done in my life. Hot, vile, smutty text-based RPs. I think about them and wonder if there will ever be a time when those words would be considered illegal and I would be arrested for posting them. This doesn't just protect minors. It tags deviance. Some of you may know the darker corners of Reddit. Imagine if an AI flagged your subs. The delete-rebuild cycle doesn't work anymore. Reddit will always know. If the law asks for suspects for some newly illegal thought crime, Reddit will be able to point to all the users in those dark corners. We are moving into a future where privacy doesn't matter, and I fear what that means for the kinky among us.

    I'm nowhere near as worried about this for kink stuff as I am about us LGBTQ living in the US.

  • This is what Facebook does to verify accounts; they also autoban if you try to register with a temp email.

    yep i remember seeing that, fuck reddit and facebook

  • Finally it seems the end of Reddit is near.

    That said, as someone who has posted stuff like that and had it spread without my consent, screw (very much not literally) anyone consuming that shit without taking the same risks as the people sharing what they get off to.

    I do think it's gross to require it for the other NSFW stuff. Drug forums are very important resources for harm reduction.

  • If the UK is going to require adult verification, it should be built into your internet contract. Yeah, I'm an adult. I'm paying my bills, of course I'm a fucking adult. I overpay for this garbage internet.

    Uploading a selfie? The AI is going to determine if you're over 18? Can the AI determine if the selfie is also AI?

    Just send an AI selfie, problem solved.

  • I'll never forget how he changed users' text without them knowing it before the 2016 election. Reddit was going downhill before, but that was a turning point.

    For those unaware, this isn't something like replacing a slur with "removed"; he edited users' comments, turning them into insults aimed at other users.

    I don't care that those original commenters were (likely) pieces of shit, and the people who he made the comments insult were definitely pieces of shit, putting words into people's mouths to make them fight each other is unforgivable. Even if you put out a shitty apology.

  • Problem is, how do we know that the company is reputable, audited, and so on?

    I've seen more places requiring verification, and each one of them seems to use a different verification company. How are there so many of these companies, and why aren't they more commonly known, the way Experian is for credit checks?

    Sure, it might sound good to keep them separate, but all that does is absolve the content host of liability for providing the adult content (somewhere) on their platforms and sites. Reddit doesn't want to get involved, and I'll bet they found the cheapest and easiest provider, or the first one in the search list, and thought "good enough".

    I think it's good that Reddit is trying to continue to allow adult content within the legal framework in which it must operate.

    I guess what I'm not clear on is what the legal framework is for verification services. Absent rules that require robust privacy protections, market forces will push a race to the bottom on cost, and data security will be the first thing to take a hit.

    I know this might seem weird, but I think this is one of those cases where a blockchain-based smart contract might be the best solution. I'm not exactly sure, as any system that allows one to consume content generally also allows one to copy it, but having a system defined in code, in a publicly auditable manner that cannot be changed without notice, seems to me to offer the most reassurance.

    I mean, I assume that all the verification company is doing now is verifying a person's age and then issuing a cryptographically signed authorization token that basically says "the owner of this cryptographic key is of age".
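A minimal sketch of that kind of signed age attestation, assuming an Ed25519 signature over a simple JSON claim. This is not any real verification vendor's protocol; it uses the third-party Python `cryptography` package, and all names and the claim format are illustrative. The provider signs a statement binding the holder's public key to an over-18 claim, and the site checks both the attestation and proof that the visitor holds the matching private key.

```python
# Hypothetical age attestation: provider signs "holder of this key is over 18",
# the site verifies the attestation plus possession of the attested key.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Keypairs for the age-check provider and for one user (generated separately in reality).
provider_key = Ed25519PrivateKey.generate()
provider_pub = provider_key.public_key()
user_key = Ed25519PrivateKey.generate()
user_pub = user_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)

# The provider signs the claim; note there is no identity attached, only a public key.
claim = json.dumps({"holder": user_pub.hex(), "over_18": True}).encode()
attestation = provider_key.sign(claim)

def site_accepts(claim: bytes, attestation: bytes, nonce: bytes, nonce_sig: bytes) -> bool:
    """Site-side check: the attestation is genuine AND the visitor holds the attested key."""
    try:
        provider_pub.verify(attestation, claim)
        payload = json.loads(claim)
        if not payload.get("over_18"):
            return False
        holder = Ed25519PublicKey.from_public_bytes(bytes.fromhex(payload["holder"]))
        holder.verify(nonce_sig, nonce)  # proves possession of the attested private key
        return True
    except InvalidSignature:
        return False

nonce = b"fresh-random-challenge-from-the-site"
print(site_accepts(claim, attestation, nonce, user_key.sign(nonce)))  # True
```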

  • Hm, I'm going to need some software engineers to critique an idea I have that could at least partially solve the fears people have about their personal details being tied to their porn habits.

    I'm a security dev and this is a good idea!

  • So, the UK sucks.

    Something similar is coming to Australia as well.

  • Google uses reddit for its AI training. Just saying.

    God help us all.

  • Finally it seems the end of Reddit is near.

    Years later, the leaked database will turn out to be full of photos of the then-teens' 80-year-old grandmas.

  • Yeah I did consider that people are going to share keys, but people are going to share accounts too, so that's always going to happen.

    people are going to share keys,

    get ahead of it and sell discounted bukkakeys

    you could probably even have a bundle called the "family plan" for the real sickos

    I should get into masturbation regulation marketing!

    Hungry for Adams Apples?
    Try our limp biscuits!

  • Hm, I'm going to need some software engineers to critique an idea I have that could at least partially solve the fears people have about their personal details being tied to their porn habits.

    How would you solve replay attacks? Like a million people, of age or not, sharing the same key?

  • This doesn't just protect minors. It tags deviance. If the law asks for suspects for some newly illegal thought crime, Reddit will be able to point to all the users in those dark corners.

    That's a subject many never talk about: it assumes we (1) have morality all figured out and (2) it's the same for everyone, everywhere.

  • How would you solve replay attacks? Like a million people, of age or not, sharing the same key?

    Maybe you could limit the number of verifications a key can make in a day? Say, 10 verifications per day. So if you're on Pornhub and have an account, you can have the key associated with the account and verified, and you don't need to re-verify. But if you go to 10 completely different sites and verify on each one, you can't verify after that 10th one within the same 24-hour period.

    You could maybe also include guidelines for integration where if a key is associated with an account, that key can't be used for any other account. You can include that under some requirement that says you have to make 'best efforts' to ensure that a key is only ever used by one account at a time. That way, if a million people are sharing the same key, you'd have to trust that all one million of them will never associate that key with their account because if they do, it invalidates that key for every use other than through that account on that site.
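A rough, self-contained sketch of those two safeguards under the assumptions above: a per-key cap of ten verifications per rolling 24 hours, plus a "one account per key per site" binding once a key is attached to an account. The registry structure, limit, and function names are illustrative, not a real API.

```python
# Sketch of per-key rate limiting and key-to-account binding for the card keys.
import time
from collections import defaultdict

DAILY_LIMIT = 10
verifications: dict[str, list[float]] = defaultdict(list)  # card key -> verification timestamps
bindings: dict[str, str] = {}                               # card key -> "site:account" it is bound to

def verify_with_limits(card_key: str, site: str, account: str | None = None) -> bool:
    now = time.time()
    # Keep only verifications from the last 24 hours, then enforce the daily cap.
    recent = [t for t in verifications[card_key] if now - t < 86_400]
    verifications[card_key] = recent
    if len(recent) >= DAILY_LIMIT:
        return False
    # Once a key is bound to an account, only that account on that site may reuse it.
    if account is not None:
        bound = bindings.setdefault(card_key, f"{site}:{account}")
        if bound != f"{site}:{account}":
            return False
    verifications[card_key].append(now)
    return True
```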

  • For those unaware, this isn't something like replacing a slur with "removed"; he edited users' comments, turning them into insults aimed at other users.

    Not only was the apology horrible, but for anyone who has been on that platform for YEARS, it obviously puts the thought in their head that spez could be changing their words by directly editing the DB and getting them put on a list for wrong-speak. Sure, that's possible with any DB, but he proved it was actually being done on that site. Given his role, that's a major red flag; this type of action would normally result in someone being fired.

    Reddit has since IPO'd and will probably do well as a stock because of all the information it harvests from users.

  • 721 votes
    67 posts
    234 views
    > All the research I am aware of - including what I referenced in the previous comment - is that people are honest by default, except for a few people who lie a lot. Boris Johnson is a serial liar and clearly falls into that camp.

    I believe that you believe that, but a couple of surveys are not a sufficient argument to prove the fundamental good of all humanity.

    > If honesty were not the default, why would we believe what anyone has to say in situations where they have an incentive to lie, which is often? Why are such a small proportion of people criminals and fraudsters when for a lot of crimes, someone smart and cautious has a very low chance of being caught?

    I think this is just a lack of imagination. I will go through your scenarios and provide an answer, but I don't think it's going to achieve anything; we just fundamentally disagree on this.

    > why would we believe what anyone has to say in situations where they have an incentive to lie, which is often?

    You shouldn't. Edit: you use experience, with this person or in general, to make a judgement call about whether or not you want to listen to what they have to say until more data is available. You continue to refine based on accumulated experience.

    > Why are such a small proportion of people criminals and fraudsters when for a lot of crimes, someone smart and cautious has a very low chance of being caught?

    A lot of assumptions and leaps here. Firstly, crime implies actual law, which is different in different places, so let's assume for now we are talking about the current laws in the UK. "Criminals" implies people who have been caught and prosecuted for breaking a law; I'm going with that assumption because "everyone who has ever broken a law" is a ridiculous interpretation. So, to encompass the assumptions: why have such a small proportion of people been caught and prosecuted for breaking the law in the UK, when someone smart and cautious has a very low chance of being caught? I hope you can see how nonsensical that question is.

    > The evolutionary argument goes like this: social animals have selection pressure for traits that help the social group, because the social group contains related individuals, as well as carrying memetically inheritable behaviours. This means that the most successful groups are the ones that work well together. A group first of all has an incentive to punish individuals who act selfishly to harm the group - this will mean the group contains mostly individuals who, through self interest, will not betray the group. But a group which doesn't have to spend energy finding and punishing traitorous individuals because it doesn't contain as many in the first place will do even better. This creates a selection pressure behind mere self interest.

    That's a nicely worded, very biased interpretation.

    > social animals have selection pressure for traits that help the social group, because the social group contains related individuals, as well as carrying memetically inheritable behaviours.

    This is fine.

    > This means that the most successful groups are the ones that work well together.

    That's a jump; working well together might not be the desirable trait in this instance. But let's assume it is for now.

    > A group first of all has an incentive to punish individuals who act selfishly to harm the group - this will mean the group contains mostly individuals who, through self interest, will not betray the group.

    Reductive and assumptive; you're also conflating selfishness with betrayal, and you can have one without the other, depending on perceived definitions of course.

    > But a group which doesn't have to spend energy finding and punishing traitorous individuals because it doesn't contain as many in the first place will do even better. This creates a selection pressure behind mere self interest.

    Additional reduction and a further unsupported jump: individuals are more than just a single trait. Selfishness might be desirable in certain scenarios, or it might be part of an individual whose other traits make up for it in a tribal context. The process of seeking and the focused attention might be a preferential selection trait that benefits the group.

    > Powerful grifters try to protect themselves yes, but who got punished for pointing out that Boris is a serial liar?

    Everyone who has been negatively impacted by the policies enacted and the consequences of everything that was achieved on the back of those lies. Because being ignored is still a punishment if there are negative consequences. But let's pick a more active punishment: protesting. Protest in a way we don't like, or about a subject we don't approve of, and it's now illegal to protest unless we give permission. That's reductive, but indicative of what happened in broad strokes.

    > Have you read what the current government has said about the previous one?

    I'd imagine something along the lines of what the previous government said about the one before?

    > As a society we generally hate that kind of behaviour. Society as a whole does not protect wealth and power; wealth and power forms its own group which tries to protect itself.

    Depends on how you define society as a whole. By population, I agree. By actual power to enact change (without extreme measures), less so. Convenient that you don't include the wealth and power as part of society, like it's some other separate thing.

    > You should care because it entirely colours how you interact with political life. "Shady behaviour" is about intent as well as outcome, and we are talking in this thread about shady behaviour, and hence about intent.

    See [POINT A]
  • A global environmental standard for AI | Mistral

    Technology
    80 votes
    3 posts
    48 views
    The way they show the equivalences is very useful, the water and materials especially. Though the GHG figure is a little odd, as streaming is in itself a complex web.
  • Why This Python Performance Trick Doesn’t Matter Anymore

    Technology
    68 votes
    4 posts
    60 views
    I'm surprised about the module lookup thing, since I assumed it was just syntax sugar for from ... import .... We use the from syntax almost everywhere, but I've been replacing huge import blocks with a single module import (e.g. constants) just to clean up the imports a bit and reduce git conflicts. Looks like I'll need to keep this in mind until we upgrade to 3.13.
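For anyone unfamiliar with the distinction that comment is reacting to: accessing a name through its module repeats an attribute lookup on every use, while `from ... import ...` binds the name once, which appears to be the "trick" the linked article argues no longer matters much on newer CPython releases. A small, self-contained illustration (timings are only indicative, and the gap depends on your interpreter version):

```python
# Compare repeated module-attribute lookup against a name bound by from-import.
import math
from math import sqrt
from timeit import timeit

def via_module_attribute(n: int = 100_000) -> float:
    total = 0.0
    for i in range(n):
        total += math.sqrt(i)   # attribute lookup on `math` each iteration
    return total

def via_from_import(n: int = 100_000) -> float:
    total = 0.0
    for i in range(n):
        total += sqrt(i)        # plain global lookup, no attribute access
    return total

print(timeit(via_module_attribute, number=20))
print(timeit(via_from_import, number=20))
```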
  • 37 votes
    2 posts
    32 views
    Idk if it’s content blocking on my end but I can’t tell you how upset I am that the article had no pictures of the contraption or a video of it in action.
  • No JS, No CSS, No HTML: online "clubs" celebrate plainer websites

    Technology
    772 votes
    205 posts
    7k views
    Gemini is just a web replacement protocol, with the basic things we remember from the olden-days Web but with everything non-essential removed, so that a client is doable in a couple of days. I have my own Gemini viewer, LOL.

    This for me seems a completely different application from torrents. I was dreaming of a thing similar to torrent trackers for aggregating storage and computation and indexing and search, with search, aggregation and other services' responses being structured and standardized, with cryptographic identities, and with some kind of market services to sell and buy storage and computation in a unified and pooled, but transparent way (scripted by buyer/seller), similar to MMORPG markets; the representation (what is a siloed service in the modern web) would live in the client's native application, and those services would allow building any kind of huge client-server system on them, globally. But that's more of a global Facebook/Usenet/whatever, a killer of platforms. Their infrastructure is internal, while their representation is public on the Internet. I want to make the infrastructure public on the Internet, and the representation client-side, sharing it for many kinds of applications. Adding another layer to the OSI model, so to say, between the transport and application layers.

    For this application: I think you could have some kind of Kademlia-based p2p with voluntarily joined groups (involving very huge groups) where nodes store replicas of partitions of the group's common data based on their pseudo-random identifiers and/or some kind of ring built from those identifiers, to balance storage and resilience. If a group has a creator, then you can have the replication factor propagated signed by them, and membership too signed by them. But if having a creator (even with cryptographically delegated decisions) and propagating changes through them is not OK, then maybe just using the whole data's hash, or its bittorrent-like info-tree hash, as a namespace that peers freely join can do. Then it may be better to partition not by parts of the whole piece, but by info tree?

    I guess making it exactly bittorrent-like is not a good idea; rather some kind of block tree, like for a filesystem, plus a separate piece of information to look up which file is in which blocks, if we are doing a directory structure. Then, with peers freely joining, there's no need for any owners or replication factors; I guess just pseudorandom distribution of hashes will do, with each node storing the partitions closest to its own hash. Now that I think about it, such a system would not be that different from bittorrent and could even be interoperable with it.

    There's the issue of updates, yes, hence I started with groups having a hierarchy of creators who can make or accept those updates. Having that, and the ability to gradually store one group's data to another group, it should be possible to do forks of a certain state. But that line of thought makes reusing bittorrent only possible for part of the system.

    > The whole database is guaranteed to be more than a normal HDD (1 TB? I dunno).

    Absolutely guaranteed, no doubt at all. 1 TB (for example) would be someone's collection of favorite stuff, and not a very rich one.
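A toy sketch of the placement rule that comment gestures at, under the assumption of Kademlia-style XOR distance between pseudo-random identifiers: each data partition is stored by the k nodes whose identifiers are closest to the partition's hash. Node names, the hash choice, and the replication factor are purely illustrative.

```python
# Assign each data partition to the k nodes closest to its hash by XOR distance.
import hashlib

def ident(name: str) -> int:
    """Pseudo-random 256-bit identifier for a node name or partition key."""
    return int.from_bytes(hashlib.sha256(name.encode()).digest(), "big")

def replica_nodes(partition_key: str, nodes: list[str], k: int = 3) -> list[str]:
    target = ident(partition_key)
    return sorted(nodes, key=lambda n: ident(n) ^ target)[:k]

nodes = [f"node-{i}" for i in range(10)]
print(replica_nodes("blocktree/partition/17", nodes))  # the 3 nodes that hold this partition
```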
  • 1 vote
    3 posts
    45 views
    They’re trash because the entire rag is right-wing billionaire propaganda by design.
  • If you value privacy, ditch Chrome and switch to Firefox now

    Technology
    7 votes
    3 posts
    50 views
    Why did Firefox kill PWA support on desktop?