When artificial intelligence flags George Orwell’s 1984 as inappropriate for students, educational technology has clearly lost the plot. A Church of England secondary school in Greater Manchester deployed AI to justify removing nearly 200 books from its library, including dystopian masterpieces and memoirs that previous generations considered essential reading.
The digital purge reads like viral satire. The AI flagged the following books, with justifications to match:
- 1984 for “themes of torture, violence, sexual coercion”—apparently missing the irony of censoring a book about censorship
- Stephenie Meyer’s Twilight for “mature romantic themes, sexual tension, violence involving vampires and werewolves”
- Michelle Obama’s Becoming for “racism and political themes”
- Nicholas Sparks’ The Notebook for “romantic drama about enduring love and memory loss”
The real tragedy isn’t the books—it’s what happened to the librarian who refused to comply. “Gobsmacked” by the directive to remove titles “not written for children” or posing “safeguarding risks,” she stood her ground. The school responded by launching a safeguarding investigation against her for “introducing inappropriate books”—despite previous approvals from management.
The stress forced her to resign, and because the complaint was upheld, she is now barred from future work in schools.
The destruction of her career has appalled professional organizations. Caroline Roche, chair of the School Libraries Group, called the situation “over the top” and career-ruining: “The fact it’s gone through safeguarding means [she] will never be able to work in a school again.” Index on Censorship investigators described witnessing “an unprecedented attack on the freedom to read and intellectual freedom.”
The school admits AI generated the removal justifications, though it won’t specify which tool made the calls. Children’s access to classic literature now depends on algorithms that can’t distinguish age-appropriate themes from genuine harm. This isn’t just about one school—it’s about surrendering human judgment to machines that lack context, nuance, or any understanding of educational value.
When AI starts deciding what your teenagers can read, we’ve automated away intellectual freedom itself.