Literary agents say they have been forced to change their rules for book submissions because of the rampant use of artificial intelligence.
Several agents have updated their submission guidance, warning would-be authors that cover letters and manuscripts created using AI will be rejected.
Agents, who represent writers and attempt to secure them publishing deals, told the trade publication The Bookseller that they were receiving increasing numbers of submissions created with generative AI.
Antony Topping, managing director of Greene & Heaton, said that over the past year the company had seen a “change in the nature of many of our submissions”.
The agency’s website tells would-be writers that “even what might seem, to you, to be harmless and helpful AI editing tools can really flatten your writing”. It adds that submissions “must be the original work of the author: submissions originated, written or edited using AI will not be accepted. This includes the use of AI in your cover letter, synopsis and proposal, as well as the manuscript.”
Nicky Lander, an executive at the Bright Agency, said that for illustrators who were “sometimes not as self-assured with the written word”, AI could be a useful tool.
She said that this was not applicable to authors, however, adding: “AI can suppress an author’s voice, which goes against the grain of publishing the written word in its truest form.”
The Eve White Literary Agency also recently updated its guidance for would-be authors, stating that it was “proud” that all submissions to it were read by “experienced professionals without the use of AI”.
It adds: “We do not accept covering letters or writing produced using generative AI.”
The rise in the number of writers using AI to try to secure an agent coincides with widespread concern over the “theft” of copyrighted material by technology companies to feed their programs.
Last year Anthropic paid £1.1 billion to settle a class action lawsuit filed by writers who claimed the AI company had stolen their work to train its large language models.
Other tech companies, including OpenAI, Meta and Microsoft, are facing similar lawsuits over alleged copyright violations.
Organisations such as the Publishers Association have — in common with the vast majority of the creative industries — been lobbying the government to try to ensure that their copyright-protected work is not used to train AI models “without permission or remuneration”.
Another literary agency, Janklow & Nesbit, said in a statement to The Bookseller that there had been “a noticeable uptick in the number of submissions received, many of which show discernible signs of AI use, whether in the material itself or the query letter”.
It said that as the technology improved, it would be “increasingly difficult to determine whether AI is being used, and this is a big challenge for our industry going forward”.