Online Safety Act 2023 & TST

GooseOnTheLoose

Perhaps there could be a wider discussion about the general implications the Online Safety Act will have for online communities, but I specifically wanted to focus the discussion on the potential impact it could have on our dear TST.

In March 2025, the Illegal Harms Codes of Practice come into force, and websites which host user-generated content and have UK activities will have to comply.

It is increasingly apparent that the Online Safety Act was written not only by people who presume that the web is the internet, but by people who presume that massive sites like Facebook, X and YouTube are the internet. Consequently, there's very little nuance or provision in the Act for graded compliance for community discussion boards like TST.

Over the past few weeks we've seen similar discussion boards and community spaces announce that they're shutting up shop, as they fear they won't be able to afford the compliance costs. I worry about the future of TST and how we can help.

The TST Discord will be relatively unaffected: as the discussion is hosted on Discord, it is likely that Discord will be responsible for legal and compliance matters across all of its servers. My concern is for this very forum, as it's self-hosted and independently owned.

I'm sure the moderators are having their own discussions about the impact the Online Safety Act may have on TST, but I think it's equally important for passionate community members to have a place to discuss this and offer solutions too. It could be that a few of us are willing to club together and offer a regular donation to help with compliance costs.

A lot is unknown at the moment: whilst the Act clearly applies to discussion forums like TST, we're unsure how Ofcom is going to police it. The system is open to abuse from bad actors, and it's all very messy. Unfortunately, though, it's no longer a hypothetical in the distance; March is very much around the corner.

For anyone who wants to read further into the Online Safety Act, and how it could impact TST, please look at Ofcom's first policy statement:
 
That will also affect the TST Discord server as well, as Discord is covered by the Act.
 
Out of interest @GooseOnTheLoose, where do you feel the additional costs will come from?

Correct me if I’m wrong, but from reading that Ofcom link, it sounds as though the main difference is that service providers will need to fill out a risk assessment explaining how they’re preventing things like minors being exposed to adult material, hate speech, CSA material etc. I apologise if I sound ignorant, but where would extra costs come from here? Is it merely the time to fill out these risk assessments and maybe do some slightly more heavy-handed moderation, or would sites like this one need to employ someone in an additional role such as “compliance officer” or similar?

It does also sound as though there’s some degree of graduation according to that link, with a differentiation specified between “large service providers” and “medium/small service providers” (I assume TST would fall into the latter category).
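To make the risk assessment question above more concrete, here is a minimal sketch, in Python, of how a small forum might record such an assessment as structured data. This assumes nothing about Ofcom's actual template: the harm categories are the ones named above, and the record layout and field names are hypothetical.

```python
# Illustrative only: a hypothetical record structure for an
# illegal-harms risk assessment; not Ofcom's actual template.
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    harm: str          # e.g. "CSA material" or "hate speech"
    likelihood: str    # "low" / "medium" / "high"
    mitigation: str    # what the service already does about it

@dataclass
class RiskAssessment:
    service_name: str
    service_tier: str  # "small/medium" or "large", per Ofcom's split
    entries: list[RiskEntry] = field(default_factory=list)

assessment = RiskAssessment(
    service_name="TST forums",
    service_tier="small/medium",
    entries=[
        RiskEntry("minors exposed to adult material", "low",
                  "18+ areas restricted; reported posts reviewed"),
        RiskEntry("hate speech", "low",
                  "forum rules prohibit it; moderators remove it"),
    ],
)

for entry in assessment.entries:
    print(f"{entry.harm}: {entry.likelihood} -> {entry.mitigation}")
```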
As Goose explained, the difference between TST Discord and the main site is that Discord would be accountable for compliance on the TST Discord, as they are the service providers, whereas the site team would be accountable for compliance on the main site as it is independently managed.
 
Ah, I somehow completely missed that. Thanks for clarifying.
 
For some further food for thought, here’s a thread on the XenForo forums (TST’s forum provider): https://xenforo.com/community/threads/uk-online-safety-regulations-and-impact-on-forums.227661/

A key point to note is that automated content scanning, one of the key changes enforced by the Act, is not required if the site has fewer than 700,000 monthly active UK users (which I believe is the case for TST).
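The threshold rule above is simple enough to express directly. Here's a minimal sketch, assuming only the 700,000 monthly active UK users figure from that XenForo thread; the function and the sample numbers are hypothetical.

```python
# Illustrative only: the automated-scanning threshold mentioned above.
# The 700,000 figure comes from the discussion; nothing here reflects
# legal advice or TST's actual user counts.
UK_MAU_THRESHOLD = 700_000

def automated_scanning_required(monthly_active_uk_users: int) -> bool:
    """True if a service's UK monthly active users meet the threshold."""
    return monthly_active_uk_users >= UK_MAU_THRESHOLD

print(automated_scanning_required(5_000))      # False: a small forum
print(automated_scanning_required(1_000_000))  # True: a major platform
```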
I don't think DMs can be moderated on TS though, and even small sites with fewer than 700,000 active users must have content moderation, which I assume includes private DMs.
 
On the question of where additional costs would come from: Ofcom estimates the cost of implementing the measures for smaller user-to-user services to be in the range of £3,000 to £7,000. I would imagine this would include legal and administrative fees to ensure that new policies are compliant with the new legislation.

The Act also suggests that the passive moderation which TST uses, where users report issues which are then acted upon, isn't sufficient. Active moderation will need to take place, and I would imagine that anyone undertaking this responsibility would probably quite like to be financially remunerated for it. A compliance officer, or at least a nominated person, might be required.

Age verification could pose increased costs, with third-party verification tools likely to be required. Whilst the main focus of the OSA's age verification requirements is pornographic content, content which is legal but considered harmful also falls within scope. Discussions and content promoting suicide, eating disorders, self-harm and hate speech will require age verification. Considering that sub-forums like The Tavern are 18+ "anything goes", it's likely that some of the discussions here, active or previous, will be affected and subject to age verification (see the sketch below this post).

Considering that the type of content typically discussed on TST, namely themed attractions and rides, is of particular interest to children, families and younger people, it's possible that it would face deeper scrutiny than other discussion forums.

Compliance also isn't just for current or future content; it covers content from years gone by. It's possible that moderators will have to go through old discussions and flag or remove problematic content, or content which isn't compliant with the new legislation. On a slightly lighter note, luckily for them @Zeock has read the entire thing and might serve as an initial reference point.

Without wishing to open past wounds or arguments, there has been previous discussion, and threads, about grooming allegedly taking place on TST, or as a result of meeting on TST. The Act makes it a legal responsibility to report allegations of child sexual exploitation and abuse, and to have robust procedures for moderating, discouraging and reporting such content.

The fines and consequences for any potential breaches, or findings of non-compliance, are also worth considering. The moderators and "owners" of TST suddenly become personally liable and responsible for the content published, which is a fundamental change in how the internet has previously worked. It's possible that this increased risk, pressure and responsibility won't be considered worth it, which would be a massive shame.
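To illustrate the age verification point in the post above, here is a minimal sketch of gating an 18+ section behind a verified flag. This is not XenForo's actual API: the data model is hypothetical, The Tavern is the section named in the post, and the idea that a third-party check sets the flag is an assumption.

```python
# Illustrative only: gating an 18+ section behind an age-verified flag.
# Not XenForo's API; the data model here is hypothetical.
from dataclasses import dataclass

RESTRICTED_SECTIONS = {"The Tavern"}  # the 18+ area named above

@dataclass
class Member:
    username: str
    age_verified: bool = False  # set after a third-party check succeeds

def can_view(member: Member, section: str) -> bool:
    """Only age-verified members may view restricted sections."""
    if section in RESTRICTED_SECTIONS:
        return member.age_verified
    return True

visitor = Member("visitor")
print(can_view(visitor, "The Tavern"))  # False until verified
visitor.age_verified = True
print(can_view(visitor, "The Tavern"))  # True once verified
```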
 
Blimey… I fully get the need for protecting children online and such, and the current law is clearly insufficient at doing so, but this all sounds awfully draconian, far more so than I was previously aware of. I never knew the Online Safety Act imposed such wide-ranging and stringent requirements.

If TST has to change as a result of this legislation, most of the internet will have to make huge changes too…
 
It's wide-ranging, far-reaching sledgehammer legislation, which many tech companies, platforms, thought leaders and civil liberties groups have been expressing concerns about for a few years.

If this were legislation introduced by the US or the EU, I'd be more inclined to agree with your assessment of it changing the internet entirely. Since this is UK-only legislation, and we're a relatively small market of 70 million people, what we'll likely see is a partitioning of the internet: features won't be available to UK users, alternative versions will be served to UK users, or sites/services will refuse to operate here entirely.
 
I'm worried about the moderation becoming more hands-on. It would mean far more work for the moderators and could also lead to an increase in authoritarianism on this site. In contrast, I like that the moderation here is currently fairly relaxed, with mainly essential moderation (such as deleting spam and off-topic posts) seemingly prioritised.
 
I would argue we have active moderation: the TS team frequent this forum (most of the time there is at least one team member online), so presumably they would be able to spot and remove any harmful content. I would also imagine TowersStreet has a compliance officer for GDPR (although I am not sure whether XenForo handles that for the forum side as well).
Age verification is an interesting one. I am unsure of XenForo's ability to gate sections, but if possible I would imagine a fix would be to make that area require age verification. If not, I am quite unsure what would be considered harmful: a post about people's favourite drinks could be, since drinking is dangerous, and a thread on how to stop smoking is, after all, discussing harmful acts.

I agree that it sounds draconian. I actually think it will make problems on the internet worse, because who would hand over ID to suspicious websites? It could result in people seeking out more backroom websites which potentially don't moderate their content for very illegal material.
 
On the active moderation point: the team have full-time jobs, though, so they can't be online all the time. The changes might force them to add more moderators, so there's more chance of at least a couple of team members being online at any time.
 
Is there a requirement for how long a post can stay up, though? Realistically, even with the biggest moderation team on a reasonably sized forum, posts could be up for an hour or so before a moderator sees them. The key question will be what counts as a reasonable amount of time according to the law: even a large moderation team on a small forum could take 10-20 minutes to remove a post, as they may have to read through the entire conversation to get context.
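To make the "reasonable amount of time" question measurable, here is a minimal sketch computing the average delay between a report being filed and a moderator acting, under the passive, report-driven model described earlier in the thread. The queue layout and sample timestamps are entirely hypothetical, not TST's software.

```python
# Illustrative only: average report-to-action delay for a passive
# (report-driven) moderation queue. All data here is made up.
from datetime import datetime, timedelta

# (reported_at, actioned_at) pairs: hypothetical sample reports
reports = [
    (datetime(2025, 1, 10, 9, 0),   datetime(2025, 1, 10, 9, 12)),
    (datetime(2025, 1, 10, 22, 30), datetime(2025, 1, 11, 7, 45)),
]

def average_response(rows):
    """Mean delay between a report being filed and a moderator acting."""
    total = sum((done - filed for filed, done in rows), timedelta())
    return total / len(rows)

print(average_response(reports))  # 4:43:30 for the sample above
```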
 
Given that this is a generally left-leaning site politically, I’ll admit to being surprised that people on here (seemingly) oppose the Online Safety Act. The left-wing parties mostly seem all for it, and if anything, they think it doesn’t go far enough in its authoritarianism.

I was concerned about coming across as too against it on here, as I didn’t want to be misconstrued as “right wing” and conflated with the likes of Elon Musk, who always champions “anti-woke” causes and “freedom of speech”. But I’ll admit to being somewhat concerned about the wider ramifications of the Online Safety Act, even though I 100% acknowledge the need to keep children safe online. There’s a very fine line between moderation and inhibiting online freedoms and the right to privacy, in my view, and while I’m not certain that this Act will necessarily cross that line, I think it pushes us increasingly close to it.
 
I'm sure the mods have this in hand. Outside of this, can you think of a time something genuinely harmful was posted and not dealt with? I've been around these parts and the old site for 13.5 years, and whilst the location and team have varied, they've always done a stellar job of keeping the forums well looked after and safe to the best of their abilities.

Ofcom can't even guarantee radio channel availability when you've pre-booked it, let alone go around every single site and enforce. It'll be relying on reporting.
 
This is an excellent point. I think the team strike exactly the right balance on here currently: libertarian enough not to come across as overbearing, but always stepping in promptly if something is against the rules.
Ofcom relying on reporting may be true, but I doubt the site team would want to be actively flouting the law. The moderators would get in a lot of trouble if, god forbid, someone did report the site for whatever reason and they did not have the legally mandated safeguards in place.
 
Not sure anyone from the team has commented yet.

I have to be honest: to support a community like ours with extra costs, I’d happily make a donation. But surely, to keep the ethos of such a community, it must be just that: a voluntary donation?

@Craig ?
 
One thing folk have to remember is that regulators (despite what some say) are never out to get you. If anything was flagged, they would contact the site provider and ask them to remedy it; it would only really be if you ignored that that they would then enforce.

Ofcom won't have the capacity to police every website anyway; they will be focusing on the high-risk sites.
 
I think it goes without saying that the way this legislation is being introduced is a mess, to say the least. There’s a lot of guidance yet to be published, despite it being such a short time away.

What I will say for the moment, is that requests for donations are unlikely. I’ve touched on this in the past, but as soon as donations come into play there’s an expectation of a service to be provided. With that there may also be an expectation/demand of how that service is provided. Of course feedback is appreciated and often acted upon, but that should be on an equal basis regardless of whether one has donated or not. Yes, there will be plenty who are fine with that, but there will be others who come along and demand things in their own way or want priority over others. I’ve never been a fan of that.

We run this place for the benefit of the wider community. But, as others have pointed out, we all have full-time jobs and real life constantly getting in the way too. We don’t make money running the place, and we have never wanted to either. Money and/or paid roles bring in a huge amount of administration work and take away from the fun of what TowersStreet is for us, not just as team members but as fellow enthusiasts. It’s a place to discuss our hobbies and get away from real life.

While it’d be great to have a wish list of how people would like TowersStreet to be off the back of these regulations, considering the timescales this is something we really need to take time to digest and discuss as a team. What I can say for the moment is that we’ll aim to keep things the same as much as we can, and we’ll communicate any changes as early as possible :)
 
Ofcom have stated that they're not responsible for policing, regulating or moderating individual posts or infringing content. Their responsibility, as the regulator, is to ensure that every platform which hosts user-generated content, big or small, operating in the UK complies with the legal responsibilities and duties set out within the Act.
 