
Blackpool Pleasure Beach: 2026 Discussion

The dead giveaway is that if you follow the snake's body, there's a phantom "loop" of the body that isn't attached to the head or tail. That would never have happened if an artist had created the original design. The sculpted piece was made from an AI design.
Nothing wrong with how it looks despite it being AI; if it was used to aid the design, then so be it.
 
It's just lazy - not the worst use of AI I've ever seen, but it really shows how low the standard is falling.

Theme parks are an inherently creative industry; if we lose those talents, there's no going back.
It's not lazy to be more efficient and less time-consuming. AI is a tool that makes tasks easier.
 
I don't disagree, but at the end of the day it's here to stay and will be used, so we'll just have to put up with it, I suppose. Nothing is going to stop them using it.
You say "put up with it", but that only holds until parks are exclusively using this lazy, damaging, horrid approach instead of employing creatives to work for them. This is the path this gen AI nonsense leads to.
 
AI is stolen art; it generates images/content through stealing from creatives. It also steals jobs, and is terrible for the environment. Plus the human cost of removing our ability to think and create for ourselves - possibly the worst thing about it.

No theme park, or business, should be using AI for their marketing. It's incredibly unethical and has, in this case, led to a graphic that doesn't make sense making it onto a final product.
 
AI is a tool, nothing more, nothing less, and should be used as such, like a computer. It's simply the next stage of technology.
 
I completely understand the strength of feeling here and I sympathise with the fear that human creativity is being eroded. However, I think we need to be careful about drawing such hard lines in the sand regarding what is "unethical" or "stolen", as the reality is far more nuanced.

As humans (and geese), we're all "trained" on copyrighted material from the moment we are born. Every artist, writer and designer is the sum of the books they have read, the films they have watched and the art they have consumed. When a human creative takes inspiration from existing works to create something new, we call it "influence" or "homage". When an algorithm does it mathematically, we call it "theft".

Alton Towers took the melody of "The Teddy Bears' Picnic" (a copyrighted work until relatively recently), twisted the tempo, changed the key and used it as the foundation for Th13teen... but they also took the lyrics by Jimmy Kennedy. As he died in 1984, they are still very much under copyright in the UK until 2054. Yet, the ride's entire marketing campaign and thematic identity relied on the phrase "If you go down to the woods today...". Was that stealing? Or was it a creative reinterpretation? We generally accept it as the latter. AI is doing a similar process of pattern matching and reassembly, just at a speed and scale that makes us uncomfortable.

We've been using Artificial Intelligence in the creative industries for decades; we just called it something else.
  • Photography: Autofocus is AI. Auto-white balance is AI. Modern smartphone computational photography is entirely AI-driven.
  • Music: Autotune is AI. Quantising (fixing the timing of a drummer who dragged a beat) is automated assistance. We don't say a producer has "no soul" because they didn't manually splice the tape to fix the timing.
  • Design: Photoshop's "Content Aware Fill" and the "Magic Wand" tool have been using machine learning to predict pixels for years.
We accept these tools because they remove the grunt work and allow the creative to focus on the vision. Generative AI should be the same.

I also find the environmental argument difficult to swallow in this specific context. We're enthusiasts for an industry that is inherently destructive and energy intensive. We cheer when parks pour thousands of tonnes of concrete and erect massive steel structures for our amusement. We celebrate when Efteling runs a rollercoaster empty all night during freezing temperatures just to keep the wheels warm so it can open on time.

We justify that by saying "well, Efteling uses green energy", ignoring the fact that the renewable energy used to cycle an empty toy could have been fed back into the grid to heat homes.

Drawing a moral line at the energy consumption of a data centre generating an image, whilst standing in a park that operates Valhalla, feels a little contradictory. We are talking about a ride that consumes ~35,000 cubic feet of gas per hour to generate fire effects, whilst simultaneously circulating 100,000 gallons of water per minute and running industrial refrigeration units to drop temperatures to -20°C. It is an environmental catastrophe disguised as a dark ride. If we can justify that for our amusement, we can probably justify a GPU running for 30 seconds to make a logo.
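For anyone who wants to sanity-check that comparison, here's a rough back-of-envelope sketch. The gas figure is the one quoted above; the energy density (roughly 1,000 BTU, or ~0.293 kWh, per cubic foot of natural gas) and the GPU draw (~700 W flat out) are my own assumed constants, so treat the ratio as an order-of-magnitude guess rather than a measurement.

```python
KWH_PER_CUBIC_FOOT_GAS = 0.293   # assumption: typical natural gas energy density
GPU_WATTS = 700                  # assumption: a datacentre GPU running flat out

# Valhalla's fire effects alone, at the ~35,000 cubic feet per hour quoted above
valhalla_kwh_per_hour = 35_000 * KWH_PER_CUBIC_FOOT_GAS

# One AI image: 30 seconds of GPU time, converted to kWh
gpu_kwh_per_image = GPU_WATTS / 1000 * (30 / 3600)

print(f"Valhalla gas burn: ~{valhalla_kwh_per_hour:,.0f} kWh per hour")
print(f"One AI image:      ~{gpu_kwh_per_image:.4f} kWh")
print(f"One hour of fire effects ~= {valhalla_kwh_per_hour / gpu_kwh_per_image:,.0f} images")
```

Even if my constants are off by a factor of ten either way, the gap is still six figures wide.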

Just a minor point on the claim that AI use is "incredibly unethical". Is it unethical for a business to use a tool to reduce costs or improve efficiency? Was it unethical when accountants started using Excel instead of ledger books? Was it unethical when animators moved from hand-drawn cels to CGI? Technology always shifts the workflow. The ethics lie in how it is used, not the tool itself.

The issue with the Aviktas emblem isn't necessarily that AI was used; it is that it was used lazily and poorly.

If a concept artist used AI to generate 50 variations of a snake logo, picked the best one and then manually refined it to ensure the anatomy made sense, no one would ever know and we would likely be praising the design. The failure here, specifically the "phantom loop" that @tayspru identified, is a failure of human quality control.

A human sculptor clearly followed a flawed blueprint generated by a machine, without engaging their own critical faculties to say "hang on, where does this snake's body actually go?". That's where the "soul" is lost; not in the use of the tool, but in the abdication of responsibility for the output.

AI is here to stay. We can either shout at the clouds, or we can demand that it's used as a tool to enhance human creativity, rather than a cheap shortcut to bypass it. Blackpool Pleasure Beach seems to have chosen the latter, and that's the real shame.
That's quite the tautology bordering on the profound. Trying to use Generative AI without a computer would certainly be a challenge, unless you possess an abacus and several thousand years of free time to perform the matrix multiplications manually.
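Tongue in cheek, but to put a number on the abacus pessimism: one layer of a neural network is essentially a matrix multiplication, a handful of multiply-adds per output value. A toy 3x3 case (illustrative only, nothing like a real model's dimensions), done the long way:

```python
# One layer of a neural network is y = W @ x (plus bias and a non-linearity).
# A toy 3x3 example, worked out multiply-add by multiply-add:
W = [[0.2, -0.1,  0.5],
     [0.7,  0.0, -0.3],
     [0.1,  0.4,  0.2]]
x = [1.0, 2.0, 3.0]
y = [sum(w * xi for w, xi in zip(row, x)) for row in W]
print(y)
# Nine multiplications for this toy; large generative models chain layers
# totalling billions of weights, hence the several-thousand-year estimate.
```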

Dismissing it as "nothing more, nothing less" than a tool is dangerously reductive though. A hammer is a tool and so is a thermonuclear device. The distinction lies in the application, the scale of impact and the regulation.

Every major industrial revolution, from the steam engine to the internet, has required society to take stock, adapt and legislate. We failed the Luddites (who, contrary to popular belief, didn't hate technology... they just hated the way factory owners used it to bypass skilled labour and drive down wages without safety nets). We're at risk of making the same mistake again if we simply shrug and say "it's just the next stage of technology" without questioning how it is applied.

We need responsible use policies, robust copyright frameworks and ethical guidelines. It doesn't have to be a binary war between luddites and tech bros. We don't need the footballification of this debate where you pick a side and scream at the opposition.

It's entirely possible to marvel at the technology and simultaneously demand that it's implemented in a way which respects the human creative process, rather than just generating six-fingered snakes for a theme park sign because nobody could be bothered to pay a sculptor to check the anatomy.
 
Because it literally is theft.

A writer or artist taking inspiration from other works is a normal part of the creative process - the whole idea with art (written, visual or audio) is that it has come from somebody's brain and has been crafted by their own inspirations, experiences and life. Using your own inspirations and creativity to craft something original of your own - that's the whole appeal.

A computer generating art or text is not a piece of art - it has not been created by a person. It has not been shaped by any real, human experiences.

I agree regarding the environmental impact of theme parks, rides, Valhalla etc. - parks should be looking at greener methods of sourcing energy, e.g. solar, similar to Paultons Park (who now power the entirety of Tornado Springs with solar).

I don't see why we should ignore the environmental impact of AI because other things are bad for the environment too - if an overall reduction in damage to the planet is the goal, then that can't be achieved if every individual damaging cause is ignored "because this other thing is worse."
 
I think it's important to separate the emotional reaction to the technology from the practical reality of how it functions.

"A computer generating art is not created by a person" ignores the fundamental role of the human operator. An AI model doesn't wake up in the morning and decide to paint a picture of a snake. It sits dormant until a human being provides the creative spark, the direction and the intent.

As someone who has spent the last few weeks "vibe coding" a passion project, I can assure you that getting a usable result is rarely a case of typing "make me a cool app" and hitting enter. It requires specific, iterative and skilled prompting to guide the tool towards a vision that exists in the goose's mind. It's a back and forth collaboration. Is a photograph "not art" because a machine captured the light and processed the physics, rather than a human hand painting it? The photographer chooses the subject, the framing, and the lighting. The prompter chooses the descriptors, the style and the context.

The "theft" aspect is a legal and philosophical minefield. If I read every book by Dickens and then write a novel in a Dickensian style, have I stolen from him? Or have I learned from him? The AI "learns" patterns, structures and relationships between pixels or words. It doesn't cut and paste; it synthesises. We call it "inspiration" when a biological neural network (a brain) does it, but "theft" when a digital neural network does it.
I completely and wholeheartedly agree that we shouldn't ignore the impact of data centres, just because theme parks are also wasteful. We should demand better from both.

There is nuance here too, however. Not all AI models are born equal. I personally avoid models from companies like X or OpenAI, who seem to have a cavalier attitude towards their carbon footprint. There are providers who are striving for net zero operations. We should be insisting on ethical AI consumption just as we insist on ethical coffee beans.

An interesting angle would also be taking a look at the displaced carbon cost. If generating an image via AI prevents a physical photoshoot (which would involve driving crew and equipment to a location, catering, manufacturing physical props and the associated logistics), is the AI generation actually the greener option? Digital energy consumption vs physical resource consumption is a complex equation.

Personally, as a bird who doesn't drive, I like to think my lack of vehicular emissions buys me enough carbon credit to use agentic AI without melting the ice caps.

I believe that we need balance, regulation, and ethical sourcing, not a blanket rejection of the tool itself.
 