
This week, a piece of writing on LinkedIn caught my eye and, like pulling on a thread, took me on an excursion.
Like you, I’ll bet, I’m aware of AI summaries but I’ve not really devoted much time to them.
Then this one post made me look again.
“Communicators used to write for journalists. Now we also have to write for machines that summarise us.”
As a former journo who has spent time as a press officer, this made me stop. It made me look into what AI summaries are and how much I should actually be bothered.
What is an AI summary?
Of course, I asked Google.
And Google gave me an AI summary of what an AI summary is.
“Google AI Overviews are AI-generated summaries at the top of search results that provide quick answers, combining information from multiple web sources using generative AI.”
So far, so good.
But all this did remind me of some BBC research on the wild inaccuracies that can be found in AI summaries.
“Largest study of its kind shows AI assistants misrepresent news content 45 per cent of the time – regardless of language or territory.”
So, if the BBC says almost half of AI summaries had at least one problem, what does that mean for public sector comms?
Pondering, I asked ChatGPT Deep Research to come up with a reading list.
Deep Research is the functionality that allows deeper digging, with links to sources. A snap answer comes in seconds; Deep Research takes maybe five or six minutes, but when I’ve used it, it’s been worth it.
Incidentally, while I used ChatGPT, Copilot also offers Deep Research functionality on many of its models.
How widespread are AI summaries?
After the six-minute wait for Deep Research, I spent the next few days reading about the role AI summaries are playing.
In summer 2025, AI summaries appeared in 15 per cent of all UK Google search results, research shows. Pew research from the US shows the figure rises to more than 50 per cent for search terms of ten words or more.
People click through to a webpage less when they see an AI summary. Data from Pew shows a 40 per cent reduction in clicks, down to around nine per cent of all search users.
News site the Daily Mail reports a 90 per cent drop in traffic, with the blame laid on AI summaries. People would rather glance at the summary than head to a news page with its attendant advertising.
Even the UK Government’s own tests of its GovChat AI tool show only a 90 per cent accuracy rate. Impressive, but that’s still one in ten wrong.
Inaccurate information is an issue.
So what are we to do about writing for AI summaries?
Google themselves insist there is no special set of tricks that can help get your content into an AI summary.
I’m a communicator, not an SEO expert. But as a communicator I’m always keen to reach an audience.
We can follow Google’s own advice around search. That includes:
“Create helpful, reliable, people-first content. Use words that people would use to look for your content, and place those words in prominent locations on the page, such as the title and main heading of a page, and other descriptive locations such as alt text and link text.”
But can we summarise ourselves for AI summaries?
I went back to the original LinkedIn post, which suggests this prompt so you can see what a summary would look like:
“Assume this content will be summarised by AI assistants for busy executives and decision makers. Rewrite it so the core message, positioning and credibility signals remain intact even when condensed to three sentences.”
The idea is that you get a preview of how an AI summary might treat your content.
So I did.
I used the last blog post I wrote on the type of content that works best in Facebook groups.
Using Copilot and the prompt, the summary came back:
Here’s a rewritten version that keeps the core message, positioning and credibility signals intact even when later compressed to three sentences:
Community‑group audiences respond best to practical, hyper‑local content that solves real problems, reflects everyday life, and feels genuinely rooted in the place it serves. New research from Public Sector Comms Headspace and Altrincham HQ shows that posts offering useful information, celebrating local people, and fostering belonging consistently outperform generic announcements or corporate messaging. The findings give comms teams clear, evidence‑based guidance on what to prioritise if they want to build trust, engagement and meaningful community connection.
Broadly, this works. Corporate messages don’t work as well. But quoting Public Sector Comms Headspace and Altrincham HQ? The first is a Facebook group I’m admin of. The second is a PR agency just outside Manchester that I’d not heard of. Neither is quoted in the blog.
Hello there, AI hallucinations.
But this is one experiment so maybe I shouldn’t be too surprised.
What’s the guidance on AI summaries?
Original research and answers to niche questions will be less affected by AI summaries, according to SEO Journal.
Answers to simple questions will be up against AI, the piece says. That’s things like ‘how to change a bike chain’. The answer to a more detailed and local problem is less likely to be summarised. That means specific public sector content may not be affected.
But wait, a plot twist
Researching more, I found that AI summaries can cause problems not just in outbound communications but across the organisation as a whole. Errors may creep in wherever AI tools are used. More than 100 councils have social workers using note-taking tools. The NHS has approved 19 note-taking tools despite concerns.
So inaccurate AI summaries may rebound on the organisation as a media query.
So what?
There is some good guidance on using AI in the public sector, such as the UK Government AI Playbook. But I could find nothing advising on how to handle inaccurate AI summaries.
Maybe it’s to the BBC research that we need to look. When the BBC looked at inaccurate news summaries, it found that audiences blamed both the tool and the platform that was misquoted.
“Audiences look for signs that responsibility is being taken in practice: provenance that travels with the summary, working links back to the reporting, timestamps and update notes, and timely corrections where the summary is actually encountered. They also expect the fix to be reflected wherever the summary appears, not just in one place.”
But if text cannot always be relied upon, would video also work to communicate the right information?
Or an infographic with alt text?
Or telling the story direct on the platform?
Or good media relations?
Or swift rebuttal to inaccurate information?
Picture credit: Newspaper box on Tavern Street by Hamish Griffin, CC BY-SA 2.0.
AI disclaimer: I used ChatGPT Deep Research to explore the subject and provide links for further reading. I used Copilot to test the prompt to summarise a blog post and Google search to give me a definition of AI summaries.
For more, I deliver training to help you make sense of the changing landscape.
ESSENTIAL AI FOR PUBLIC SECTOR COMMS
