UK Government responded to the "Repeal the Online Safety Act" Petition.

Technology
  • Source.

    ::: spoiler Long Response

    I would like to thank all those who signed the petition. It is right that the regulatory regime for in scope online services takes a proportionate approach, balancing the protection of users from online harm with the ability for low-risk services to operate effectively and provide benefits to users.

    The Government has no plans to repeal the Online Safety Act, and is working closely with Ofcom to implement the Act as quickly and effectively as possible to enable UK users to benefit from its protections.

    Proportionality is a core principle of the Act and is in-built into its duties. As regulator for the online safety regime, Ofcom must consider the size and risk level of different types and kinds of services when recommending steps providers can take to comply with requirements. Duties in the Communications Act 2003 require Ofcom to act with proportionality and target action only where it is needed.

    Some duties apply to all user-to-user and search services in scope of the Act. This includes risk assessments, including determining if children are likely to access the service and, if so, assessing the risks of harm to children. While many services carry low risks of harm, the risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting small services from the Act would mean that services like these forums would not be subject to the Act’s enforcement powers. Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Once providers have carried out their duties to conduct risk assessments, they must protect the users of their service from the identified risks of harm. Ofcom’s illegal content Codes of Practice set out recommended measures to help providers comply with these obligations, measures that are tailored in relation to both size and risk. If a provider’s risk assessment accurately determines that the risks faced by users are low across all harms, Ofcom’s Codes specify that they only need some basic measures, including:

    • easy-to-find, understandable terms and conditions;
    • a complaints tool that allows users to report illegal material when they see it, backed up by a process to deal with those complaints;
    • the ability to review content and take it down if it is illegal (or breaches their terms of service);
    • a specific individual responsible for compliance, who Ofcom can contact if needed.

    Where a children's access assessment indicates a platform is likely to be accessed by children, a subsequent risk assessment must be conducted to identify measures for mitigating risks. Like the Codes of Practice on illegal content, Ofcom’s recently issued child safety Codes also tailor recommendations based on risk level. For example, highly effective age assurance is recommended for services likely accessed by children that do not already prohibit and remove harmful content such as pornography and suicide promotion. Providers of services likely to be accessed by UK children were required to complete their assessment, which Ofcom may request, by 24 July.

    On 8 July, Ofcom’s CEO wrote to the Secretary of State for Science, Innovation and Technology noting Ofcom’s responsibility for regulating a wide range of highly diverse services, including those run by businesses, but also charities, community and voluntary groups, individuals, and many services that have not been regulated before.

    The letter notes that the Act’s aim is not to penalise small, low-risk services trying to comply in good faith. Ofcom – and the Government – recognise that many small services are dynamic small businesses supporting innovation and offer significant value to their communities. Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.

    Ofcom has developed an extensive programme of work designed to support a smoother journey to compliance, particularly for smaller firms. This has been underpinned by interviews, workshops and research with a diverse range of online services to ensure the tools meet the needs of different types of services. Ofcom’s letter notes its ‘guide for services’ guidance and tools hub, and its participation in events run by other organisations and networks including those for people running small services, as well as its commitment to review and improve materials and tools to help support services to create a safer life online.

    The Government will continue to work with Ofcom towards the full implementation of the Online Safety Act 2023, including monitoring proportionate implementation.

    Department for Science, Innovation and Technology
    :::

    "Benefit"

    This word you keep using. I do not think it means what you think it means

  • "Oh no, what if our kids jack off, that would be horrible". If it's that bad, then take responsibility, try to convince them not to. Don't let the government parent for you, because you won't do it yourself. Fucking ridiculous. A similar bill was introduced in May in the US.

    "It's such a big issue that I'm going to do absolutely nothing as a parent to stop it from happening!"

    • These people, probably
  • As an American, I'm worried that Starmer's legacy is going to be giving you PM Nigel Farage.

    Starmer is Blair 2.0

  • "It's such a big issue that I'm going to do absolutely nothing as a parent to stop it from happening!"

    • These people, probably

    A tale as old as time. Banning media you don't like is a lot easier than parenting your children.

  • Source.

    For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm.

    So they've identified a problem with this type of content, and the answer is to put it behind an age wall. So is it A-OK for anyone over 18 to be encouraged to self-harm or commit suicide, according to the government?

  • Source.

    Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Wait until the government finds out they're gonna have to age-restrict playing outside. What a genuinely bone-headed, stupid take.

  • Starmer is Blair 2.0

    Y'all really need a real labor party. I mean, so do we Yanks, but y'all do too.

  • Source.

    Another "lol no" response to a petition. Didn't see that coming.

  • Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

    Wait until the government finds out they're gonna have to age-restrict playing outside. What a genuinely bone-headed, stupid take.

    They'll need so many more cameras with built-in AI face and gait recognition, and footage uploading to China. Otherwise, how will they automatically fine any adult who goes within 5 meters of a child without the written approval of both parents, signed by a notary?

  • For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm.

    So they've identified a problem with this type of content, and the answer is to put it behind an age wall. So is it A-OK for anyone over 18 to be encouraged to self-harm or commit suicide, according to the government?

    This is most likely because of SaSu (a suicide forum). They've been fighting with the UK for a long time.

  • As an American, I'm worried that Starmer's legacy is going to be giving you PM Nigel Farage.

    Like Biden's legacy is fumbling the 2nd term so the US got Trump 2.0?

  • Y'all really need a real labor party. I mean, so do we Yanks, but y'all do too.

    Just as you can see happened with Mamdani, any genuine left candidate in the UK (Corbyn) gets smeared and fought even by their own side. Whenever these candidates come out, it shows that the two-party system is an illusion, and that all those in power seek to profit and rule.

  • Source.

    Yeah, the Brits need to reinstall their government and flush out the royals, the lords, and the other elite turds.

  • Starmer is Blair 2.0

    Starmer is nowhere near Blair 2.0. Blair at least had charisma, a political plan and, at least before Iraq, genuinely had mass grass-roots support. Starmer has none of these; he's an apolitical middle manager who has been pushed to the top of the party by a right-wing clique in Labour as a way to purge the left.

  • Source.

    If only they could have that response when the TERFs come knocking. When normal people want something good, they're like "lol no, get fucked, losers", but when JK Rowling comes along they're like "Of course, mistress, anything you want. Do you want a viewing box at the gas chambers?"

  • Source.

    The govt easily has enough of a majority to ignore this; I said as much when everyone was saying it was time for Labour to have a go at the GE. The same govt is making it OK for 16- and 17-year-olds to vote, but they have to be age-checked on the tinternet...

  • "Benefit"

    This word you keep using. I do not think it means what you think it means

    The government will enact the will of the people, whether the people like it or not.

  • Yeah, the Brits need to reinstall their government and flush out the royals, the lords, and the other elite turds.

    I'm not sure the royals caused this. I guess the main issue is that some democracies become too entrenched, and groups of elites take over the role of nobility. Term limits don't help, since to be in a position to become someone, you have to join those who already rule. Capitalism doesn't help either, and even accelerates this process. Abolishing FPTP and instituting ranked choice would, I think, be the first step in improving democracies, by breaking up these elite groups.

  • Y'all really need a real labor party. I mean, so do we Yanks, but y'all do too.

    Corbyn is starting one.

  • This is most likely because of SaSu (a suicide forum). They've been fighting with the UK for a long time.

    What I don't get is why it's OK to view that at 18 but not at 17 years and 364 days. Surely just ban the site for everyone.
