
Online Safety Act 2023 & TST

GooseOnTheLoose

Perhaps there could be a wider discussion about the general implications that the Online Safety Act will bring for online communities, but I specifically wanted to focus the discussion on the potential impact it could have to our dear TST.

In March 2025 the Illegal Harms Codes of Practice come into force, and websites which host user-generated content and have UK activities will have to comply.

It is increasingly apparent that the Online Safety Act was written not only by people who presume that the web is the internet, but that massive sites like Facebook, X and YouTube are the internet. Consequently there's very little nuance or provision given in the act for graded compliance for community discussion boards like TST.

Over the past few weeks we've seen similar discussion boards, and community spaces, announce that they're shutting up shop as they fear they won't be able to afford the compliance costs. I worry about the future of TST and how we can help.

TST Discord will be relatively unaffected. As the discussion is hosted on Discord, it is likely that Discord will be responsible for legal and compliance for all of its servers. My concern is for this very forum, as it's self hosted and independently owned.

I'm sure the moderators are having their own discussions about the impact the Online Safety Act may have on TST, but I think it's equally important for passionate community members to have a place to discuss this and offer solutions too. It could be that a few of us are willing to club together and offer a regular donation to help with compliance costs.

A lot is unknown at the moment. Whilst the Act clearly applies to discussion forums like TST, we're unsure how Ofcom are going to police it. The system is open to abuse from bad actors, and it's all very messy. Unfortunately, though, it's no longer a hypothetical in the distance; March is very much around the corner.

For anyone that wants to read further into the Online Safety Act, and how it could impact TST, please look at Ofcom's first policy statement:
 
That will affect the TST Discord server as well, as Discord is covered by the Act.
 
Out of interest @GooseOnTheLoose, where is it you feel additional costs will come from?

Correct me if I’m wrong, but from reading that Ofcom link, it sounds as though the main difference is that service providers will need to fill out a risk assessment saying how they’re preventing things like minors being exposed to adult material, hate speech, CSA material etc. I apologise if I sound ignorant, but where would extra costs come from here? Is it merely the time to fill out these risk assessments and maybe do some slightly more heavy-handed moderation, or would sites like this one require an additional role such as “compliance officer” or similar to be employed?

It does also sound as though there’s some degree of graduation according to that link, with differentiation between “large service providers” and “medium/small service providers” (I assume TST would fall into the latter category) specified.
As Goose explained, the difference between TST Discord and the main site is that Discord would be accountable for compliance on the TST Discord, as they are the service providers, whereas the site team would be accountable for compliance on the main site as it is independently managed.
 
Ah, I somehow completely missed that. Thanks for clarifying.
 
For some further food for thought, here’s a thread on the XenForo forums (TST’s forum provider): https://xenforo.com/community/threads/uk-online-safety-regulations-and-impact-on-forums.227661/

A key point to note is that automated content scanning, one of the key changes required by the Act, is not required if the site has fewer than 700,000 active monthly UK users (which I believe is the case for TST).
I don't think DMs can be moderated on TST though, and small sites with fewer than 700,000 active users must still have content moderation, which I assume includes private DMs.
 
Ofcom estimates the cost of implementing measures for smaller user-to-user services to be in the range of £3,000 to £7,000. I would imagine that this would include legal and administrative fees, to ensure that new policies are compliant with the new legislation.

The Act also suggests that the passive moderation which TST uses, where users report issues which are then acted upon, isn't sufficient. Active moderation will need to take place, and I would imagine that anyone undertaking this responsibility would probably quite like to be financially remunerated for it. A compliance officer, or at least a nominated person, might be required.
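For the technically minded, the difference between the two models can be sketched roughly like this. This is a hypothetical illustration only, not XenForo's actual moderation API; the function names and the flagged-terms list are made up:

```python
# Hypothetical sketch of passive (report-driven) vs active (proactive)
# moderation. None of these names come from XenForo or the Act itself.

FLAGGED_TERMS = {"example-banned-phrase", "another-flagged-term"}

def needs_action(post_text):
    """Naive check: does the post contain any flagged term?"""
    text = post_text.lower()
    return any(term in text for term in FLAGGED_TERMS)

def passive_review(reported_posts):
    """Today's model: moderators only look at posts users have reported."""
    return [p for p in reported_posts if needs_action(p)]

def active_scan(all_new_posts):
    """What the Act appears to expect: every new post is checked,
    whether or not anyone reported it."""
    return [p for p in all_new_posts if needs_action(p)]
```

The point is the input set: passive moderation only ever touches the small reported subset, while active moderation touches every post made, which is where the extra time and cost would come from.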

Age Verification could pose increased costs, with third-party verification tools likely to be required. Whilst the main focus of the OSA's Age Verification requirements is pornographic content, content which is legal but considered harmful also falls into this. Discussions and content promoting suicide, eating disorders, self harm and hate speech will require age verification. Considering that sub-forums like The Tavern are 18+ "anything goes", it's likely that some of the discussions here, active or archived, will be affected and subject to age verification.
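As a rough illustration of what an age gate on sections like The Tavern might look like: this is purely hypothetical, the forum list and the idea of a verified date of birth on file are my assumptions, and a real deployment would rely on a third-party verification provider rather than anything self-declared.

```python
from datetime import date

# Hypothetical list of 18+ sections; not TST's actual configuration.
RESTRICTED_FORUMS = {"The Tavern"}

def age_on(dob, today):
    """Whole years between a date of birth and a given date."""
    return today.year - dob.year - (
        (today.month, today.day) < (dob.month, dob.day)
    )

def can_view(forum_name, verified_dob, today=None):
    """Allow access to restricted sections only when a verified date
    of birth on record shows the user is 18 or over."""
    if forum_name not in RESTRICTED_FORUMS:
        return True
    if verified_dob is None:  # no verification on record
        return False
    return age_on(verified_dob, today or date.today()) >= 18
```

Even a toy gate like this shows the knock-on cost: every restricted section needs a verified attribute against each account, which is exactly the kind of data a small volunteer-run forum currently has no reason to hold.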

Considering that the type of content typically discussed on TST, namely themed attractions and rides, is particularly of interest to children, families and younger people, it's possible that it would face deeper scrutiny than other discussion forums.

Compliance also isn't just for current or future content; it applies to content from years gone by. It's possible that moderators will have to go through old discussions and flag or remove problematic content, or content which isn't compliant with the new legislation. On a slightly lighter note, luckily for them @Zeock has read the entire thing and might serve as an initial reference point.

Without wishing to open past wounds or arguments, there has been previous discussion and threads about grooming allegedly taking place on TST, or as a result of meeting on TST. The Act makes it a legal responsibility to report allegations of child sexual exploitation and abuse, and to have robust procedures for moderating, discouraging and reporting such content.

The fines/consequences for any potential breaches, or found non-compliance, are also worth considering. The moderators and "owners" of TST suddenly become personally liable and responsible for the content published, which is a fundamental change in how the internet has previously worked. It's a possibility that this increased risk, pressure and responsibility isn't considered worth it, which would be a massive shame.
 
Blimey… I fully get the need for protecting children online and such, and the current law is clearly insufficient at doing so, but this all sounds awfully draconian, far more so than I was previously aware of. I never knew the Online Safety Bill imposed such wide-ranging and stringent requirements.

If TST has to change as a result of this legislation, most of the internet will have to make huge changes as a result of this…
 
It's wide-ranging, far-reaching sledgehammer legislation, which many tech companies, platforms, thought leaders and civil liberties groups have been expressing concerns about for a few years.

If this were legislation introduced by the US or the EU, I'd be more inclined to agree with your assessment of it changing the internet entirely. Since this is UK legislation only, and we're a relatively small market of 70 million people, what we'll likely see is a partitioning of the internet. Features won't be available to UK users, alternative versions will be served to UK users, or sites/services will refuse to operate here entirely.
 
I'm worried about the moderation becoming more hands-on. It would result in far more work for the moderators and could also lead to an increase in authoritarianism on this site. In contrast, I like that here at the moment, the moderation is relaxed to a certain degree, where mainly essential moderation (such as deleting spam and off-topic posts) seems to be prioritised.
 