UK Government responded to the "Repeal the Online Safety Act" Petition.

Technology
  • "It's such a big issue that I'm going to do absolutely nothing as a parent to stop it from happening!"

    • These people, probably

    A tale as old as time. Banning media you don't like is a lot easier than parenting your children.

  • Source.

    ::: spoiler Long Response

    I would like to thank all those who signed the petition. It is right that the regulatory regime for in scope online services takes a proportionate approach, balancing the protection of users from online harm with the ability for low-risk services to operate effectively and provide benefits to users.

    The Government has no plans to repeal the Online Safety Act, and is working closely with Ofcom to implement the Act as quickly and effectively as possible to enable UK users to benefit from its protections.

    Proportionality is a core principle of the Act and is in-built into its duties. As regulator for the online safety regime, Ofcom must consider the size and risk level of different types and kinds of services when recommending steps providers can take to comply with requirements. Duties in the Communications Act 2003 require Ofcom to act with proportionality and target action only where it is needed.

    Some duties apply to all user-to-user and search services in scope of the Act. This includes risk assessments, including determining if children are likely to access the service and, if so, assessing the risks of harm to children. While many services carry low risks of harm, the risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting small services from the Act would mean that services like these forums would not be subject to the Act’s enforcement powers. Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Once providers have carried out their duties to conduct risk assessments, they must protect the users of their service from the identified risks of harm. Ofcom’s illegal content Codes of Practice set out recommended measures to help providers comply with these obligations, measures that are tailored in relation to both size and risk. If a provider’s risk assessment accurately determines that the risks faced by users are low across all harms, Ofcom’s Codes specify that they only need some basic measures, including:

    • easy-to-find, understandable terms and conditions;
    • a complaints tool that allows users to report illegal material when they see it, backed up by a process to deal with those complaints;
    • the ability to review content and take it down if it is illegal (or breaches their terms of service);
    • a specific individual responsible for compliance, who Ofcom can contact if needed.

    Where a children's access assessment indicates a platform is likely to be accessed by children, a subsequent risk assessment must be conducted to identify measures for mitigating risks. Like the Codes of Practice on illegal content, Ofcom’s recently issued child safety Codes also tailor recommendations based on risk level. For example, highly effective age assurance is recommended for services likely accessed by children that do not already prohibit and remove harmful content such as pornography and suicide promotion. Providers of services likely to be accessed by UK children were required to complete their assessment, which Ofcom may request, by 24 July.

    On 8 July, Ofcom’s CEO wrote to the Secretary of State for Science, Innovation and Technology noting Ofcom’s responsibility for regulating a wide range of highly diverse services, including those run by businesses, but also charities, community and voluntary groups, individuals, and many services that have not been regulated before.

    The letter notes that the Act’s aim is not to penalise small, low-risk services trying to comply in good faith. Ofcom – and the Government – recognise that many small services are dynamic small businesses supporting innovation and offer significant value to their communities. Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.

    Ofcom has developed an extensive programme of work designed to support a smoother journey to compliance, particularly for smaller firms. This has been underpinned by interviews, workshops and research with a diverse range of online services to ensure the tools meet the needs of different types of services. Ofcom’s letter notes its ‘guide for services’ guidance and tools hub, and its participation in events run by other organisations and networks including those for people running small services, as well as its commitment to review and improve materials and tools to help support services to create a safer life online.

    The Government will continue to work with Ofcom towards the full implementation of the Online Safety Act 2023, including monitoring proportionate implementation.

    Department for Science, Innovation and Technology
    :::

    For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm.

    So they've identified a problem with this type of content, and the answer is to put it behind an age wall. So, according to the government, is it A-OK for anyone over 18 to be encouraged to self-harm or commit suicide?

  • Source.

    Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Wait until the government finds out they're gonna have to age-restrict playing outside. What a genuinely brain-dead stupid take.

  • Starmer is Blair 2.0

    Y'all really need a real Labour party. I mean, so do we Yanks, but y'all do too.

  • Source.

    Another "lol no" response to a petition. Didn't see that coming.

  • Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Wait until the government finds out they're gonna have to age-restrict playing outside. What a genuinely brain-dead stupid take.

    They'll need so many more cameras with built-in AI face and gait recognition, and footage uploading to China. Otherwise, how will they automatically fine any adult who goes within 5 meters of a child without written approval from both parents, signed by a notary?

  • For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm.

    So they've identified a problem with this type of content, and the answer is to put it behind an age wall. So, according to the government, is it A-OK for anyone over 18 to be encouraged to self-harm or commit suicide?

    This is most likely because of SaSu (a suicide forum). They've been fighting with the UK for a long time.

  • As an American, I'm worried that Starmer's legacy is going to be giving you PM Nigel Farage.

    Like Biden's legacy is fumbling the second term so the US got Trump 2.0?

  • Y'all really need a real Labour party. I mean, so do we Yanks, but y'all do too.

    Just as you saw happen with Mamdani, any genuine left candidate in the UK (Corbyn) gets smeared and fought even by their own side. Whenever these candidates emerge, it shows that the two-party system is an illusion and that all those in power seek to profit and rule.

  • Source.

    Yeah, the Brits need to reinstall their government and flush out the royals, the lords, and the other elite turds.

  • Starmer is Blair 2.0

    Starmer is nowhere near Blair 2.0. Blair at least had charisma, a political plan and, at least before Iraq, genuinely had mass grassroots support. Starmer has none of these; he's an apolitical middle manager who has been pushed to the top of the party by a right-wing clique in Labour as a way to purge the left.

  • Source.

    If only they could have that response when the TERFs come knocking. When normal people want something good, they're like "lol no, get fucked, losers", but when JK Rowling comes along they're like "Of course, mistress, anything you want. Do you want a viewing box at the gas chambers?"

  • Source.

    The Government easily has a big enough majority to ignore this; I said as much when everyone was saying it was time for Labour to have a go at the GE. The same government is making it OK for 16- and 17-year-olds to vote, but they'll have to be age-checked on t'internet...

  • "Benefit"

    This word you keep using. I do not think it means what you think it means.

    The government will enact the will of the people, whether the people like it or not.

  • Yeah, the Brits need to reinstall their government and flush out the royals, the lords, and the other elite turds.

    I'm not sure the royals caused this. I guess the main issue is that some democracies become too entrenched, and groups of elites take over the role of nobility. Term limits don't help, since to be in a position to become someone, you have to join those who already rule. Capitalism also doesn't help, and even accelerates this process. Abolishing FPTP and instituting ranked choice would, I think, be the first step toward improving democracies, by breaking up these elite groups.

  • Y'all really need a real Labour party. I mean, so do we Yanks, but y'all do too.

    Corbyn is starting one.

  • This is most likely because of SaSu (a suicide forum). They've been fighting with the UK for a long time.

    What I don't get is why it's OK to view that at 18 but not at 17 years and 364 days. Surely just ban the site for everyone.

  • Disgusting. Violating everyone's rights and privacy "for the children" (who will just use VPNs anyway).

    Not if they end up banning VPNs, which is already something being discussed. If that happens, I might genuinely leave the country.

  • Source.

    Dead Kennedys - "Kinky Sex Makes the World Go 'Round"

    Even better are the moans from the PM

  • Yeah, the Brits need to reinstall their government and flush out the royals, the lords, and the other elite turds.

    Honestly, half the time the hereditary peers have been the only protection against our government passing totally shit legislation, as the rest of the upper house is packed with useless political appointees. That's not to say I wouldn't replace the upper house with some sort of body of citizens, similar to jury duty or some such, to filter out the crazy.
