UK Government responded to the "Repeal the Online Safety Act" Petition.

Technology
  • Yeah, the Brits need to re-install their government and flush out the royals, the lords, and the other elite turds

    Honestly, half the time the hereditary peers have been the only protection against our government passing totally shit legislation, as the rest of the upper house is packed with useless political appointees. That's not to say I wouldn’t replace the upper house with some sort of body of citizens, similar to jury duty or some such, to filter out the crazy.

  • Not if they end up banning VPNs, which is already something being discussed. If that happens, I might genuinely leave the country.

    Russian here; good fucking luck banning VPNs

    First, they evolve very rapidly and are able to evade even the most sensitive detection methods Russia and China are using

    Second, people in power never want to apply the same restrictions to themselves, so, ironically, they themselves are often VPN users and as such they undermine themselves

  • Source.

    ::: spoiler Long Response

    I would like to thank all those who signed the petition. It is right that the regulatory regime for in scope online services takes a proportionate approach, balancing the protection of users from online harm with the ability for low-risk services to operate effectively and provide benefits to users.

    The Government has no plans to repeal the Online Safety Act, and is working closely with Ofcom to implement the Act as quickly and effectively as possible to enable UK users to benefit from its protections.

    Proportionality is a core principle of the Act and is in-built into its duties. As regulator for the online safety regime, Ofcom must consider the size and risk level of different types and kinds of services when recommending steps providers can take to comply with requirements. Duties in the Communications Act 2003 require Ofcom to act with proportionality and target action only where it is needed.

    Some duties apply to all user-to-user and search services in scope of the Act. This includes risk assessments, including determining if children are likely to access the service and, if so, assessing the risks of harm to children. While many services carry low risks of harm, the risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting small services from the Act would mean that services like these forums would not be subject to the Act’s enforcement powers. Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Once providers have carried out their duties to conduct risk assessments, they must protect the users of their service from the identified risks of harm. Ofcom’s illegal content Codes of Practice set out recommended measures to help providers comply with these obligations, measures that are tailored in relation to both size and risk. If a provider’s risk assessment accurately determines that the risks faced by users are low across all harms, Ofcom’s Codes specify that they only need some basic measures, including:

    • easy-to-find, understandable terms and conditions;
    • a complaints tool that allows users to report illegal material when they see it, backed up by a process to deal with those complaints;
    • the ability to review content and take it down if it is illegal (or breaches their terms of service);
    • a specific individual responsible for compliance, who Ofcom can contact if needed.

    Where a children's access assessment indicates a platform is likely to be accessed by children, a subsequent risk assessment must be conducted to identify measures for mitigating risks. Like the Codes of Practice on illegal content, Ofcom’s recently issued child safety Codes also tailor recommendations based on risk level. For example, highly effective age assurance is recommended for services likely accessed by children that do not already prohibit and remove harmful content such as pornography and suicide promotion. Providers of services likely to be accessed by UK children were required to complete their assessment, which Ofcom may request, by 24 July.

    On 8 July, Ofcom’s CEO wrote to the Secretary of State for Science, Innovation and Technology noting Ofcom’s responsibility for regulating a wide range of highly diverse services, including those run by businesses, but also charities, community and voluntary groups, individuals, and many services that have not been regulated before.

    The letter notes that the Act’s aim is not to penalise small, low-risk services trying to comply in good faith. Ofcom – and the Government – recognise that many small services are dynamic small businesses supporting innovation and offer significant value to their communities. Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.

    Ofcom has developed an extensive programme of work designed to support a smoother journey to compliance, particularly for smaller firms. This has been underpinned by interviews, workshops and research with a diverse range of online services to ensure the tools meet the needs of different types of services. Ofcom’s letter notes its ‘guide for services’ guidance and tools hub, and its participation in events run by other organisations and networks including those for people running small services, as well as its commitment to review and improve materials and tools to help support services to create a safer life online.

    The Government will continue to work with Ofcom towards the full implementation of the Online Safety Act 2023, including monitoring proportionate implementation.

    Department for Science, Innovation and Technology
    :::

    Protections? Is that what we're calling state-sanctioned censorship these days?

  • Source.

    When the regime ignores petitions by the public for the redress of grievances, you petition harder.

    Demonstrations, Public Disobedience, Mischief, Sabotage, Terrorism.

    Censorship always expands and encroaches on things important to the public. Obscenity and indecency protections eventually turn into queer erasure. Security concerns are always followed by carve-outs of civil rights.

    Hit hard early.

  • If only they could have that response when the TERFs come knocking. When normal people want something good, they're like "lol no get fucked losers", but when JK Rowling comes along they're like "Of course, mistress, anything you want. Do you want a viewing box at the gas chambers?"

    You’d think if they are going to be censoring things, they’d start with the full-on Holocaust denial that Rowling has normalized.

  • When the regime ignores petitions by the public for the redress of grievances, you petition harder.

    Demonstrations, Public Disobedience, Mischief, Sabotage, Terrorism.

    Censorship always expands and encroaches on things important to the public. Obscenity and indecency protections eventually turn into queer erasure. Security concerns are always followed by carve-outs of civil rights.

    Hit hard early.

    And when peaceful protest is ignored there's even a step after that!

  • Y'all really need a real labor party. I mean, so do us yanks, but y'all do too.

    I think you can include all G20 states there.

  • Source.

    "You plebs should know your place"

  • Source.

    I'm surprised their response wasn't, "you're all on the side of predators; it's as simple as that".

  • Corbyn is starting one.

    I hope he can overtake Reform in popularity.

  • When the regime ignores petitions by the public for the redress of grievances, you petition harder.

    Demonstrations, Public Disobedience, Mischief, Sabotage, Terrorism.

    Censorship always expands and encroaches on things important to the public. Obscenity and indecency protections eventually turn into queer erasure. Security concerns are always followed by carve-outs of civil rights.

    Hit hard early.

    Isn't Britain the homeland of punk?

  • Source.

    So peaceful words aren't working? Well alright, sounds like it's time for plan B.

  • What I don't get is why it's ok to view that at 18 but not at 17 years and 364 days. Surely just ban the site for everyone.

    This was never about age. It's about having to identify yourself to use the internet.

  • And when peaceful protest is ignored there's even a step after that!

    Writing even more sternly worded letters?

  • Russian here; good fucking luck banning VPNs

    First, they evolve very rapidly and are able to evade even the most sensitive detection methods Russia and China are using

    Second, people in power never want to apply the same restrictions to themselves, so, ironically, they themselves are often VPN users and as such they undermine themselves

    Republican gay sex

  • Source.

    Fuck them.

  • Source.

    The TL;DR of it is "we're not repealing it, get fucked."

    The technology minister said on Sky News that if you were to repeal the act, you'd be on the side of Jimmy Savile.

    That's where we are in all this: support it or you're a nonce.

  • Isn't Britain the homeland of punk?

    Why do you think punk developed?

  • Well we all saw that coming.

    The parental and elderly voting bloc is very hard to ignore, and those groups tend to be less privacy conscious (as well as pro-anything "protect the children").

    The only way it's getting repealed is if enough labour voters raise a fuss. Given Reform's messaging (i.e. repeal it) and how worried Labour are over Reform's polling, that is likely the only lever that'll work. However, that's a long game - one that will take years to play out.

    I think the "voting bloc" is just a convenient excuse; they want to control you.