AI HELP: I think a couch to 5K guide to writing AI policies has just been published and I love it

One of the great things when something is new is that you get innovation in the most unexpected places.

For example, the Arts Council have published a new resource that can be adapted by any organisation looking to develop an AI policy.

While you’re reading this your brain may be saying incredulously: ‘The Arts Council? How can they help me exactly? Don’t they pay for opera?’

That’s the joy of what they’ve produced. Sure, they’ve made a ‘Responsible AI at Arts Council England’ resource that shows what their AI policy is and what their workings out were. Tidy.

But what really catches my eye is the Responsible AI Practical Toolkit they’ve published alongside it. This seven step workbook acts as a breadcrumb trail from hesitant first step through to having something really robust and well thought through on your desk. Huge credit to Kate Hughes for spotting this and sharing it.

Runners have Couch to 5K and I think the AI equivalent has been produced. I think you’ll like it.

What the AI Practical Toolkit does

There are seven stages to the process and each one has a handy PDF for you to download and follow.

They are:

First step. Starting the AI Conversation

I love that they have identified that pressure to come up with a policy can lead you to cut and paste someone else’s. While there is no doubt that the UK Government AI Playbook should be revered, framed and placed above the mantelpiece, your organisation may need to factor in some other things.

What’s really lovely is a set of questions to ask to start the ball rolling.

Are you already using AI tools?

If so, what tools are you using?

What excites you?

What concerns you?

The secret sauce to this may well be that it looks to bring bright people along. When social media was new, these used to be called the militant optimists. Opening the door for discussion to these people can only be good. Just as good is asking the views of people who are inherently concerned about AI. 

Some national organisations have produced guidance that feels more top down than anything and I wonder if that’s the best way forward. The wisdom of the crowd can still sometimes be a positive force.

Second step. Internal stakeholder mapping 

And here, a form that asks several more questions, including these:

Who has concerns?

Who may block the project?

Who has decision making authority?

Asking who can block and who can make decisions seems so obvious but it’s striking how rarely this is done. A group of people agreeing in a meeting is a taxi-full of positivity when the organisation is Regimental or Division sized. Spot the issues now and you can do something about them.

Third step. Developing an AI policy 

There is a saying that perfection is the enemy of progress.

In this part of the Arts Council approach, the advice is agreeably relaxed.

When developing an AI Policy, it is important to identify its scope. Your first policy does not need to cover all potential future uses of these technologies, nor does it need to be a definitive text. 

The pressure is off.

Fourth step. Risk List 

Now, this I also love. Maybe it should be called the ‘AI tools list’ rather than the ‘Risk list’, but this is where they suggest a traffic-light approach: red, don’t use; amber, proceed with caution; green, it’s been checked and you’re fine to use it.

What this does well is create a workable list that people in the organisation can turn to. One recent directory counts 14,000 AI tools marketed at marketing people. That’s in keeping with AI riding the hype cycle and it can’t be sustainable in the medium term.

Fifth step. Responsible AI checklist

Now the green light has been given, here’s a checklist on how to use tools in the wild. That’s a really good approach.

Sixth step. AI Pilot Project Management Template

And we’re away…

This now keeps things on track and keeps tabs on who is in charge and what they are doing.

Seventh step. Delivering a Successful AI Pilot Project

This set of steps captures what is working and what isn’t so the rest of the organisation can learn.

Make sure managers in the business area are on board. AI pilot projects are likely to encounter blockers and may attract differing opinions from colleagues. Having senior-level support from the relevant business area is vital to keeping things moving.

Why this is good

It’s quite clear that AI will have far ranging uses and benefits. But in 2025, there is nervousness and the public sector can’t perform like it is a Californian start-up. Nor should it. It is answerable to the people it serves and they should be brought along. 

Organisations need to share their workings out with the public. So many pieces of AI guidance talk about the need for transparency. If your part of the public sector is being transparent and having open conversations, it is choosing the time and place where AI will be discussed. Failure to do that is slowly and surely building a path to the phone ringing and you having an hour to debunk sometimes wild accusations.

I’ve had a great many conversations with people whose organisations have not tackled many of these steps. To my mind, it doesn’t make sense to release a tool like Copilot without either a policy or a discussion about one.

While the document doesn’t directly stress the public transparency that the public sector needs, the act of gathering information is a valuable approach. And while this Arts Council document is for their sector, the bright person can lean across the garden fence and take some lessons.

I deliver training to help you make sense of the changing landscape: ESSENTIAL AI FOR PUBLIC SECTOR COMMS.

Creative commons credit: Roger Cornfoot ‘Jogging Along the Beach.’

Join the Conversation

1 Comment

  1. Dear Dan,

    Just completed an AI project with the Arts Council. Able to report that from the viewpoint of the AI practitioner they practice as well as preach. If you are offering training I can suggest a few tips from the viewpoint of the developer that might be relevant.
