For Christmas I received a fascinating present from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (great title) bears my name and my photo on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me provided by my friend Janet.
It's an intriguing read, and very funny in parts. But it also meanders quite a lot, landing somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in gathering information about me.
Several sentences begin "as a leading technology reporter..." - cringe - which might have been scraped from an online bio.
There's also a strange, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are lots of companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted its president, Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.
There is currently nothing to stop anybody creating one in any person's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "entirely to bring humour and joy".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold on.
He hopes to broaden his range, producing different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in some parts, sound just like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar content based upon it.
"We must be clear, when we are talking about information here, we actually indicate human creators' life works," states Ed Newton Rex, founder of Fairly Trained, which projects for AI firms to respect creators' rights.
"This is books, this is posts, this is images. It's artworks. It's records ... The entire point of AI training is to learn how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator from attempting to nominate it for a Grammy award. And although the artists were fake, it was still hugely popular.
"I do not believe making use of generative AI for creative purposes ought to be banned, but I do think that generative AI for these functions that is trained on individuals's work without approval ought to be banned," Mr Newton Rex adds. "AI can be extremely effective however let's build it fairly and relatively."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "insanity".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and altering copyright law and messing up the incomes of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative markets are wealth developers, 2.4 million tasks and a lot of pleasure," says the Baroness, who is also a consultant to the Institute for Ethics in AI at Oxford University.
"The government is weakening one of its finest carrying out markets on the unclear guarantee of development."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US, the future of federal rules to control AI is now up in the air following President Trump's return to the presidency.
In 2023 President Biden signed an executive order that aimed to boost the safety of AI, with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they were released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that for now, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of inaccuracies and hallucinations, and it can be quite difficult to read in parts because it's so verbose.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.