Survey Confirms What Everyone Suspected: Nearly Half of Publishing Pros Use AI
Let's ask the uncomfortable question: how surprising is it, really, that nearly half of publishing professionals use artificial intelligence? According to a joint survey by BISG and BookNet Canada released this week, 47% of North American book-industry workers now incorporate an AI tool into their workflow, often with the same quiet guilt as reading a spoiler before finishing the novel.
The interesting thing is not the percentage. The interesting thing is that most respondents use AI despite serious copyright concerns. They know there is a problem. They use it anyway. This is not hypocrisy; it is practice. Or, to invoke Umberto Eco earlier than planned: it is being "integrated" rather than "apocalyptic." Though the integrated crowd of 2026 must live with the discomfort that their technological paradise was built, in part, on millions of books digitized without permission. For the full genealogy of that debate, Eco's Apocalittici e integrati (Apocalyptic and Integrated), written in 1964 about television and mass culture, remains more relevant than it should be.
The survey, notably, does not ask what AI is being used for. And that is where the fascinating gap lies. Is it to fix an email at nine in the evening when there is no energy left? To generate a first draft of a synopsis that the human editor will rewrite anyway? To summarize a nine-hundred-page manuscript before deciding whether it is worth reading? AI does not have a single face in publishing; it has as many faces as there are people opening it in incognito mode, not quite sure how to justify it.
There is a vast gap between the public debate — apocalyptic in headlines, resigned in book fair hallways — and what actually happens in offices. In public debates, AI threatens art, destroys creation, reduces culture to a text-prediction function. In offices, someone uses it to write the back-cover copy for a book that arrived two days ago with yesterday's deadline.
And copyright? The survey suggests concerns exist but do not stop use. The major Anthropic litigation — that $1.5 billion settlement still awaiting a fairness hearing — has not changed editors' daily behavior. While the legal system takes years to process what technology did in months, publishing keeps running on a mixture of pragmatism and diffuse anxiety.
Nobody has a perfect moral position on this. Neither do the people who answered the survey. That, actually, is the news: not that AI is here, but that ambivalence is the norm. And ambivalence is not ignorance. Sometimes it is the only honest answer to something that changed too fast. Are you using it yet?