Despite fierce opposition from the cultural industries, the UK government seems to be following the US lead and bowing down to the broligarchy (again) rather than protecting the copyright of artists and creatives.
This is a perilous and critical juncture for a cultural sector that has been described as “the UK crown jewels” and that contributed £125bn, or 5.7%, to the UK economy in 2022.
Stateside, Donald Trump has already fired the director of the US Copyright Office following a report produced by her department which suggested, “Not everyone agrees that further increases in data and test performance will necessarily lead to continued real-world improvements in utility”.
That statement works directly against the interests of the broligarchy and their insatiable appetite to take everything for themselves, for free, to train their AI models, including all the outputs from our creative industries. Let’s call this what it really is – it’s not “training,” it’s theft.
While the government recently published its AI Opportunities Action Plan – including the rather chilling comment that AI will be “mainlined into the veins of this enterprising nation” – trade body techUK has commented that the details are mixed, “particularly around compute timelines”. It goes on to state there are “notable gaps including semiconductor supply planning which remains critical for large-scale AI”.
To put some comparative figures around that, and to see where the UK really stands in competitive terms against the US, Karen Hao, in her recent book Empire of AI: Inside the reckless race for total domination, lays out the costs of competition: “Microsoft alone spent more than $55bn in the fiscal year 2024, nearly a quarter of its reported revenue, to build what SemiAnalysis described as ‘the largest infrastructure buildout that humanity has ever seen.’ Google, meanwhile, said in its third 2024 quarterly earnings call that it planned to crank up its datacentre expenditure to reach around $50bn for the fiscal year. Meta said it would likely round out the fiscal year with up to $40bn in datacentre and infrastructure expansion, which it estimated would rise the following year.”
And it won’t end there in terms of costs, according to Dario Amodei, CEO of Anthropic, as told to the New York Times in relation to future financial requirements: “And so, today’s models cost in the order of $100m to train – plus or minus factor two or three. The models that are in training now and that will come out at various times later this year or early next year are closer in cost to $1bn. So that’s already happening. And then I think in 2025 and 2026, we’ll get more towards $5bn or $10bn.”
Amodei goes on to speak about governments and their ability to meaningfully play in this market: “I don’t know of too many governments doing it directly, though some, like the Saudis, are creating big funds to invest in the space. When we’re talking about the models are going to cost near to $1bn, then you imagine a year or two out from that, if you see the same increase, that would be $10-ish billion. Then is it going to be $100bn? I mean, very quickly, the financial artillery you need to create one of these is going to wall out anyone but the biggest players.”
Deep pockets
This is insanity. There is absolutely no way the UK can compete with such deep pockets, despite all the rhetoric. Nor can UK universities hope to compete in the AI race, given the exodus of AI researchers to industry, which in the US, according to Hao, “increased eightfold from 2004 to 2020”. The share of AI PhD graduates heading to corporations jumped from 21% to 70%, according to a 2023 study in Science by MIT researchers. That’s almost a complete hollowing out of academia, shifting power again to Big Tech.
You can clearly hear the narrative of Big Tech in Keir Starmer’s statement in the AI Opportunities Action Plan: “The AI industry needs a government that is on their side, one that won’t sit back and let opportunities slip through its fingers. And in a world of fierce competition, we cannot stand by. We must move fast and take action to win the global race”.
It’s eerily reminiscent of the Facebook mantra of “move fast and break things”.
But it is equally true that the creative and cultural industries need a government that is on their side – although in this instance what the cultural sector needs is a government prepared to protect and preserve its opportunities in the face of what is likely to be a catastrophic impact on the future careers and livelihoods of actors, musicians, artists, film-makers and content creators.
This is recognised in the recently published BFI report AI, copyright and productivity in the creative industries, which concluded: “Without robust policy intervention, generative AI will worsen many of the structural economic challenges that the British creative industries already face. We contend that the way forward is through purposeful, responsible and informed regulation that protects our creative industries and encourages responsible AI uptake”.
Current attempts through the Data (Use and Access) Bill to protect the copyright of creatives are struggling, having suffered a fourth House of Lords defeat. At the time of writing, the bill is in parliamentary ping-pong, bouncing back and forth between the Lords and the Commons, with the government clearly favouring letting tech companies steal the work of the creative sector for fear of losing some mythical AI race it can never win anyway.
Artists in the US are similarly without government champions, while the tech bros are holding sway. Karen Hao gives an example of a lobbying event in the US contrasting the access and influence of Sam Altman, CEO of OpenAI, and a group of artists: “Altman was attending an exclusive dinner with 60 House members at the Capitol, feasting on an expertly prepared buffet with roast chicken. At the same time, the artists were hosting an interactive cocktail hour and trying to attract as many staffers with the best their budget could buy – wine and Chick-fil-A. It was a small but darkly comedic illustration of who commanded power and influence in the AI policy conversation and who didn’t”.
On the brink
The chief executive of UK Music, Tom Kiehl, told the BBC that the government is “on the brink” of offering up the country’s music industry “as a sacrificial lamb in its efforts to cosy up to American-based tech giants”.
That sentiment is echoed by Channel 4 CEO Alex Mahon: “The creative industries account for 6% of the UK’s GVA [gross value added] and is growing 1.5 times faster than other sectors,” she said. “If we continue in a world where large language models can scrape and use that data without paying for it properly, we are in a dangerous position for the industry.”
The government needs to act like a government and flex its muscles on behalf of the most vulnerable and those most under threat, not those who stand to gain the most from their theft. If ministers sell the family silver that is the UK’s great cultural industries to Silicon Valley, they cannot claim they don’t know what they are doing, because we have seen all this theft before – we know where it ends.
It raises the question: do they have any idea who they are dealing with? Do they even have a basic grasp of the economics of AI? The government is essentially colonising its own cultural industries.
According to the government’s assessment, “The current uncertainty around intellectual property is hindering innovation and undermining our broader ambitions for AI, as well as the growth of our creative industries.” This, of course, is a total tech wheeze. There is absolutely zero uncertainty.
The law is clear. UK copyright law does not allow text and data mining for commercial purposes without a licence. Is it too much to expect the government to enforce its own legislation, not break it in the vain hope of currying a little favour in Silicon Valley?