AI is shaping how products and services are delivered – and how people find, understand, and act on information. Like many other content design teams, we’ve been thinking about and experimenting with AI. We’ve been talking about the role content designers play in areas like effective prompt design, and how we can use technologies like large language models (LLMs) to help us in our roles – generating first drafts, editing content consistently, and speeding up desk research and content audits.
But as well as looking at what AI can do for us, we’ve also been looking at what we can do with, or for, AI as part of our work on digital products and services.
You still need a content designer for AI-supported services
AI-supported services – such as chatbots or AI agents – are often seen as more ‘technical’ projects that need developer input more than they need design or content expertise, with an assumption that the large language model can ‘do the words’.
But the content within these services still needs to be accessible and accurate.
Not only can a content designer write accessible content in plain English for an AI-supported service, they can also design the logic, structure and governance that makes content safe and usable, both for users and AI.
There are lots of different contexts and scenarios in which content designers might work on services that use AI.
I’ve chosen three examples to focus on - each at a different level of designing a service.
Working at the interface level - designing the parts of services powered by AI
In cases such as chatbots or AI-supported search functions, AI isn’t just in the background – it makes up the whole user interface.
In these cases, content designers can:
- write and test prompts that reflect real user needs
- create ‘metaprompts’ – prompts that guide a large language model to generate more precise or personalised responses (see the sketch after this list)
- design ‘error messages’ for when the AI can’t help or shouldn’t answer
- structure conversation flows that are usable, accessible, and safe
- work closely with policy and technical teams to align the rules behind responses
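To make that more concrete, here’s a minimal sketch in Python of how a metaprompt and a designed fallback message might sit alongside a model call in a chatbot. The wording, topic list and function names are illustrative assumptions, not a real service’s implementation.

```python
# A minimal sketch (not production code) of how content-designed text can sit
# alongside a model call: a 'metaprompt' that sets tone and boundaries, and a
# designed fallback message for when the AI can't or shouldn't answer.
# The wording, topics and function names are illustrative assumptions.

METAPROMPT = (
    "You are a help assistant for a public service website. "
    "Answer in plain English, at a reading age of around 9. "
    "Only use the guidance passages provided to you. "
    "If the guidance does not cover the question, say you cannot help "
    "and point the user to human support."
)

# The designed 'error message', written and owned by content designers
CANNOT_HELP_MESSAGE = (
    "Sorry, I can't help with that. "
    "You can speak to an adviser by phone or webchat."
)

# Topics the service has decided the AI should not answer
DO_NOT_ANSWER_TOPICS = {"medical emergency", "legal advice", "complaint"}


def build_messages(user_question: str, guidance_passages: list[str]) -> list[dict]:
    """Assemble what the model would receive: metaprompt, guidance, question."""
    context = "\n\n".join(guidance_passages)
    return [
        {"role": "system", "content": METAPROMPT},
        {"role": "user", "content": f"Guidance:\n{context}\n\nQuestion: {user_question}"},
    ]


def respond(detected_topic: str, model_answer: str | None) -> str:
    """Return either the model's answer or the designed fallback message."""
    if detected_topic in DO_NOT_ANSWER_TOPICS or not model_answer:
        return CANNOT_HELP_MESSAGE
    return model_answer


if __name__ == "__main__":
    messages = build_messages(
        "How do I renew my licence?",
        ["You can renew your licence online up to 3 months before it expires."],
    )
    print(messages[1]["content"])
    print(respond("complaint", None))  # the designed fallback, not a model guess
```

The point of the sketch is that the system prompt and the fallback are content, not code – they need the same care, testing and ownership as any other words in the service.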
Working at the service level - managing the risks of AI-generated content
When AI tools give users incorrect, confusing, or potentially harmful information, such as hallucinated answers or biased assumptions, the root cause is often the underlying content they’re working from.
When working at the service level, it's important to consider how AI – including any type of automation – could impact user journeys and experience. This includes designing for escalation routes, any potential points of failure, and instances where a user might not progress as planned towards the outcome they need.
Content designers can help:
- identify high-risk topics that need human review
- flag ambiguity or complexity that AI might not be able to handle
- design escalation routes for when AI can’t answer, including appropriate pathways to human support
- test AI outputs and trace them back to source content (see the sketch after this list)
- design and test content safety services that review prompts and generated content for anything incorrect or harmful
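As a rough illustration, the sketch below shows the kind of checks a team might run on a generated answer before it reaches a user: does the question touch a high-risk topic, and can the answer be traced back to approved source content? The topics, content IDs and rules are assumptions made for the example, not a real policy.

```python
# A rough sketch of the checks a team might run on an AI output before it
# reaches a user: does the question touch a high-risk topic, and can the
# answer be traced back to approved source content? Topics, content IDs and
# rules here are illustrative assumptions, not a real policy.

HIGH_RISK_TOPICS = {"benefits sanction", "immigration status", "safeguarding"}

# Source content the AI is allowed to draw from, keyed by content ID
SOURCE_CONTENT = {
    "guide-101": "You can renew your licence online up to 3 months before it expires.",
    "guide-202": "Appeals must be submitted within 28 days of the decision.",
}


def needs_human_review(question: str) -> bool:
    """Flag questions on topics the service has decided need a person."""
    return any(topic in question.lower() for topic in HIGH_RISK_TOPICS)


def traceable_to_source(cited_ids: list[str]) -> bool:
    """Check that every content ID the answer cites exists in the managed source."""
    return bool(cited_ids) and all(cid in SOURCE_CONTENT for cid in cited_ids)


def review_output(question: str, cited_ids: list[str]) -> str:
    """Decide whether to show, block or escalate a generated answer."""
    if needs_human_review(question):
        return "escalate: route the user to human support"
    if not traceable_to_source(cited_ids):
        return "block: answer cannot be traced to approved content"
    return "allow: show the answer to the user"


if __name__ == "__main__":
    print(review_output("When can I renew my licence?", ["guide-101"]))
    print(review_output("Will my immigration status be affected?", []))
```

Deciding what counts as high risk, and what the escalation message says, is content design work as much as it is engineering.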
Working at an organisational level - preparing organisational content for AI
Where organisations are starting to experiment with and embed AI, their content and information architecture (IA) probably weren’t designed with automation or machine reasoning in mind.
Here, content designers can lay the groundwork to make sure that when AI is part of a service, it works from content that is usable, well structured and maintained – content that can be used effectively by technologies like LLMs while still being designed to best meet the needs of users.
This might look like:
- auditing content for gaps, duplication, or unclear or ambiguous information
- rewriting or structuring guidance so it can be effectively retrieved by AI (see the sketch after this list)
- identifying any content that shouldn’t be retrieved by AI, and designing responses that explain why a human needs to help instead
- auditing information architecture
- designing information governance processes to prevent or manage outdated or duplicate content, for example across policy documents or intranet sites
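As a simple illustration, here’s a sketch of how guidance might be structured and labelled so an AI-supported service can retrieve it – or deliberately not retrieve it – and so governance rules like review dates can be enforced. The field names, IDs and example content are assumptions made for the sketch, not a published standard.

```python
# A sketch of how guidance might be structured and labelled so it can be
# retrieved (or deliberately withheld) by an AI-supported service, and kept
# under governance. Field names and example content are illustrative
# assumptions, not a published standard.

from dataclasses import dataclass
from datetime import date


@dataclass
class GuidanceChunk:
    content_id: str
    title: str
    body: str             # short, self-contained, plain English
    owner: str            # the team responsible for keeping it accurate
    review_by: date       # governance: when it must next be checked
    ai_retrievable: bool  # some content should only ever come from a person


CHUNKS = [
    GuidanceChunk(
        content_id="guide-101",
        title="Renewing your licence",
        body="You can renew your licence online up to 3 months before it expires.",
        owner="Licensing content team",
        review_by=date(2026, 4, 1),
        ai_retrievable=True,
    ),
    GuidanceChunk(
        content_id="guide-300",
        title="Reporting a safeguarding concern",
        body="Contact the safeguarding team directly. A person, not the chatbot, handles this.",
        owner="Safeguarding team",
        review_by=date(2025, 12, 1),
        ai_retrievable=False,
    ),
]


def retrievable(chunks: list[GuidanceChunk], today: date) -> list[GuidanceChunk]:
    """Only pass content to the AI if it's marked retrievable and not overdue for review."""
    return [c for c in chunks if c.ai_retrievable and c.review_by >= today]


if __name__ == "__main__":
    for chunk in retrievable(CHUNKS, date(2025, 6, 1)):
        print(chunk.content_id, "-", chunk.title)
```

Most of the value here is in the labels themselves – ownership, review dates and whether content should ever reach the AI are editorial and governance decisions before they are technical ones.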
What this means for designing AI-supported services
Delivering services that involve AI is a design challenge like any other, requiring care in how technology is integrated into how organisations work and how people access the information and support that they need. It demands the same user focus that designers bring to all the work we do, with the added responsibility of making sure the AI communicates clearly and accurately.
As AI becomes integrated into more services, content designers must help shape not just what these tools say, but how they say it, when they say it, and why. That means structuring content to be usable by both users and machines, designing for edge cases and errors, and embedding content governance.
The teams who design the best AI-supported services will be the ones who understand that good content design creates good service infrastructure to best meet user needs.