More than half (51%) of published novelists in the UK believe artificial intelligence (AI) is likely to end up entirely replacing their work, according to a new survey of 258 authors and 74 industry insiders from the University of Cambridge.
Additionally, almost 60% of authors said their work had been used to train AI large language models (LLMs) without permission or payment, and more than a third (39%) believed their income had already taken a hit from generative AI. Some 85% of the novelists said they expected their future income to be driven down by the technology.
The newly published research comes from Cambridge’s Minderoo Centre for Technology and Democracy (MCTD). Researcher Dr Clementine Collett surveyed 258 published novelists earlier this year, as well as 74 industry insiders – from commissioning editors to literary agents – to gauge how AI is viewed and used in the world of British fiction. Collett also conducted focus groups and interviews around the country, and co-ordinated a forum in Cambridge with novelists and publishers.
Genre authors are considered the most vulnerable to displacement by AI, according to the report, with two-thirds (66%) of all those surveyed listing romance authors as “extremely threatened”, followed closely by writers of thrillers (61%) and crime (60%).
Conversely, 80% of respondents said AI offers benefits to parts of society and a third of novelists (33%) use AI in their writing process, mainly for “non-creative” tasks such as research.
The department said: “Literary creatives feel that copyright laws have not been respected or enforced since the emergence of generative AI... Many warn of a potential loss of originality in fiction, as well as a fraying of trust between writers and readers if AI use is not disclosed. Some novelists worry that suspicions of AI use could damage their reputation.”
Some respondents predicted a dystopic two-tier market in which the human-written novel becomes a “luxury item” while mass-produced AI-generated fiction is cheap or free.
On reading the final report, author Tracy Chevalier said: “If it is cheaper to produce novels using AI (no advance or royalties to pay to authors, quicker production, retainment of copyright), publishers will almost inevitably choose to publish them. And if they are priced cheaper than ‘human made’ books, readers are likely to buy them, the way we buy machine-made jumpers rather than the more expensive hand-knitted ones.”
Collett said: “Many novelists felt uncertain there will be an appetite for complex, long-form writing in years to come.”
The 84-page report explores examples of generative AI tools currently in use: Sudowrite and Novelcrafter are used to brainstorm and edit novels, while Qyx AI Book Creator and Squibler can be used to draft full-length books. Platforms such as Spines use AI to assist with publishing processes, from cover designs to distribution.
Collett added: “The brutal irony is that the generative AI tools affecting novelists are likely trained on millions of pirated novels scraped from shadow libraries without the consent or remuneration of authors.”
Many novelists reported lost income due to AI. Some felt the market was increasingly flooded with AI-generated books, with which they were forced to compete. Others said they found books under their name on Amazon that they hadn’t written. Some novelists also spoke of online reviews with telltale signs of AI, such as jumbled names and characters, that gave their books bad ratings and jeopardised future sales.
Collett said: “Most authors do not earn enough from novels alone and rely on income streams such as freelance copywriting or translation, which are rapidly drying up due to generative AI.”
Almost all authors (97%) were “extremely negative” about AI writing whole novels, or even short sections (87% extremely negative). The aspect novelists felt least negative about was using AI to source general facts or information (30% extremely negative), with around 20% of novelists saying they used AI for this purpose.
About 8% of novelists said they used AI to edit text written without AI. However, many described editing as a deeply creative process and said they would never want AI involved: more than two-fifths (43%) of novelists felt “extremely negative” about using AI for editing text.
The research also found widespread backlash against a “rights reservation” copyright model as proposed by the UK government last year, which would let AI firms mine text unless authors explicitly opted out. Some 83% of all respondents said this would be negative for the publishing industry, and 93% of novelists said they would “probably” or “definitely” opt out of their work being used to train AI models if an opt-out model were implemented.
The vast majority (86%) of all literary creatives preferred an “opt-in” principle, in which rights-holders grant permission before AI scrapes any work and are paid accordingly. The most popular option was for AI licensing to be handled collectively by an industry body – a writers’ union or society – with almost half of novelists (48%) selecting this approach.
Some novelists worried that AI would disrupt the “magic” of the creative process. “Novelists, publishers and agents alike said the core purpose of the novel is to explore and convey human complexity,” said Collett. “Many spoke about increased use of AI putting this at risk, as AI cannot understand what it means to be human.”
The MCTD said: “Authors fear AI may weaken the deep human connection between writers and readers at a time when reading is already at historically low levels, particularly among the next generation: only a third of UK children say they enjoy reading in their free time.
“Many writers want to see more AI-free creative writing on the school curriculum, and government-backed initiatives aimed at finding new voices from under-represented groups to counter risks of ‘homogeneity’ in fiction brought about by generative AI.”
The department also described how the research “reveals a sector-wide belief that AI could lead to ever blander, more formulaic fiction that exacerbates stereotypes, as the models regurgitate from centuries of previous text”, although this could in turn prompt a boom in experimental fiction as writers work to prove they are human and push their artistry further than AI.
Kevin Duffy, founder of Bluemoose Books, contributed to a forum for the report. He said: “We are an AI-free publisher, and we will have a stamp on the cover [Faber announced its use of the ’human stamp’ in June]. And then [it is] up to the public to decide whether they want to buy that book or not. But let’s tell the public what AI is doing.”
Collett said: “Novelists are clearly calling for policy and regulation that forces AI companies to be transparent about training data, as this would help with the enforcement of copyright law. Copyright law must continue to be reviewed and might need reform to further protect creatives. It is only fair that writers are asked permission and paid for use of their work.”
The report, The Impact of Generative AI on the Novel, is available on the Minderoo Centre for Technology and Democracy website.