“The [UK] Online Safety Act risks silencing a generation.”

It's “a fundamental encroachment on the right to freedom of expression to impart and receive information. Older teenagers engaging with political debates or current events may be cut off from vital sources of information because platforms will err on the side of removal to avoid hefty penalties.”

🗣️ ORG's @JamesBaker

https://www.express.co.uk/news/politics/2089661/online-safety-law-watchdog

#OnlineSafetyAct #onlinesafety #OSA #freespeech #ukpolitics #ukpol

@openrightsgroup @JamesBaker

Great point.

There should be public outrage at some of these nonsensical age-gating decisions, and the companies making them must explain their reasoning.

Only then can we separate true legal issues (which can be improved) from bad faith corporate behaviour.

For example, I'm sure Reddit has dozens of Gaza subreddits. Why was only one chosen for age gating?

#ImproveOSADontRepeal

@TCatInReality @openrightsgroup I think this is because if they don't ban particular subs, they have to somehow analyse and make a content classification judgement about every post. At scale that can only be done with automated algorithms or AI methods, which can lead to even more censorship, because algorithms frequently flag content wrongly. X has taken that approach and it's also causing problems.
@TCatInReality @openrightsgroup Realistically, you can't say a massive online forum is going to be low risk or unlikely to be accessed by children. Although the act isn't prescriptive about how a platform manages risk, it leaves them with few practical options: all of Ofcom's examples of risk management suggest things like age-gating, automated proactive technologies, etc.
@JamesBaker @openrightsgroup

Correct me if I'm wrong, but the act has two routes for legal (but age sensitive) content:

1) effective moderation to remove age-sensitive content, or
2) age-gating, and allowing the content

So providers need to pick a lane based on the type of content, their intended audience and their ability to enforce.

Have I got that right?

@TCatInReality @openrightsgroup You're right that content moderation is one measure in the code (PCU C1), but there are many more requirements than that, and they will extend further after the latest update to these codes, which is under consultation. So when presented with all those requirements, many platforms are instead opting to simply age-gate away all under-18s.
@TCatInReality @openrightsgroup Having a quick appeals process and independent third-party adjudication for disputes around blocked and censored content would be one improvement. Our report has plenty more suggestions to fix it: https://www.openrightsgroup.org/publications/how-to-fix-the-online-safety-act-a-rights-first-approach/

Algorithms will hyperactively dredge feeds for what's 'illegal' and 'harmful'.

Both terms are wishy-washy in the UK Online Safety Act.

So platforms will over-moderate rather than get hit with penalties for a finger-in-the-air judgement. Censorship is baked into the equation, and that's why the OSA threatens free expression.

Tell your MP to FIX IT ⬇️

https://action.openrightsgroup.org/tell-your-mp-online-safety-act-isn%E2%80%99t-working

#OnlineSafetyAct #onlinesafety #OSA #freespeech #ukpolitics #ukpol #freeexpression #privacy #agegate #ageverification