Discernment by Design

Damian Cranney

Dec 2025

5 mins

Responsible innovation in an accelerated world

Last week at the Make It Right conference in Belfast, I shared a slightly unfashionable idea.

In a moment where everyone is scrambling to implement (or define) a coherent AI strategy, the leaders I work with are feeling a different pressure underneath:

“I’m not actually sure what we should be transforming into. I just know we can’t stand still.”

The narrative is simple:

AI is here.

The world is accelerating.

If you don’t move fast, you’ll be left behind.

But in public services – and particularly in health – I see a different risk:

Not that we move too slowly.

But that we move faster than our judgement.

What leaders need most right now isn’t another AI pilot.

In my opinion, it’s discernment.


What I mean by ‘discernment’

By discernment, I mean the ability to pause (even briefly) to ask:

  • What are we really changing here?

  • Who is this actually for?

  • What might we be speeding up by mistake?

This might sound abstract, so let me ground it in two ideas.

First, Horst Rittel’s concept of ‘wicked problems’.

Rittel pointed out that not all problems are created equal. Some are relatively tidy: you can define them clearly, test options, measure what works and iterate.

But others – the ones we live with in health, social care, education – are often “wicked”:

  • There’s no single agreed definition of ‘the problem’.

  • There’s no single ‘right’ answer.

  • Every intervention has consequences for real people.

Second, Larry Tesler’s Law of Conservation of Complexity.

Tesler, the computer scientist who gave us cut-and-paste, argued that every system has a certain amount of irreducible complexity. You can’t eliminate it; you can only decide who has to deal with it.

A simple example: your iPhone camera.

When you take a photo today, it feels super simple:

  • Tap the icon

  • Press the shutter

  • Maybe drag a slider

  • Share it with thousands of people in seconds

Underneath the illusion of simplicity, your device is:

  • Capturing multiple frames, sometimes from multiple lenses

  • Running them through an imaging pipeline – Apple calls theirs the Photonic Engine

  • Using AI models to decide what to sharpen, what to blur, what a “good” face looks like

The complexity of photography hasn’t gone away. It’s just been pushed down into the system.

In this case, that’s mostly a good trade-off. You and I don’t want to fiddle with ISO and white balance when we’re inspired by nature on a morning walk.

But in public services, burying complexity has different consequences. When you hide trade-offs deep in a model, you stop seeing:

  • Who gets seen first

  • Who waits longer

  • Whose needs don’t fit the system at all

That’s where discernment matters.

Not just “Can we do this with AI?” but “Where will the complexity live, and who will end up dealing with it?”


More than a ‘nice’ way to work, co-design is about mitigating risk

A lot of our work at Big Motive is rooted in co-design – in a nutshell, involving the people who use and deliver services in the design process.

Co-design isn’t a new fad. It grew out of ‘participatory design’, which emerged in Scandinavia in the 1960s and 70s when trade unions pushed to give workers a real voice in how new technologies were designed and deployed at work.  

The goal was simple and radical:

Don’t just design for people.

Design with them – especially when the system will reshape their daily lives.

In complex systems, I see co-design less as ‘inclusion’ and more as risk management.

Because here’s the uncomfortable truth:

If people don’t recognise their world in your solution, they will quietly not use it.

They’ll work around it.

They’ll recreate the old system in spreadsheets and side conversations.

And your big transformation becomes expensive theatre.

Which brings me to children's services in the Irish midlands.


What discernment looks like when designing access to children’s services

Earlier this year, we were invited to work with teams in HSE CHO8 (the Health Service Executive’s Community Healthcare Organisation for the midlands region of Ireland) on access to children’s services.

If you talked to families and clinicians at the time, the picture was stark:

  • Referral pathways were unclear.

  • Families were bounced between Primary Care, Disability and Mental Health services.

  • Waiting times were often 23 weeks or more.

  • Each service felt like a separate world.

The complexity of the system was sitting squarely on the shoulders of:

  • Parents, telling the same story again and again.

  • Clinicians, drowning in admin and feeling powerless.

  • Admin teams, juggling incompatible processes.

This wasn’t a “what colour should the app be?” problem.

It was a wicked problem: policy, geography, workforce constraints, legacy systems – all entangled.

We could have gone straight to technology. We could have proposed an AI triage tool, a slick portal, a dashboard.

Instead, our first act of discernment was much simpler:

“What actually happens today when a child in this region needs help?”

That question changed everything.


370 people, one shared picture of reality

Together with local clinicians, managers and the HSE Spark Innovation team, we started by listening.

We mapped journeys from the point of view of families and professionals.

We captured the “official” pathways – and the informal workarounds.

We heard the “don’t write this down, but here’s what really happens” stories.

Crucially, we widened the circle.

Over time, more than 370 stakeholders were involved:

  • Parents and carers

  • Clinicians from Primary Care, Disabilities and Mental Health

  • Admin staff

  • Managers and referrers

That scale of involvement wasn’t about virtue-signalling.

It was about building a faithful picture of reality.

Because if clinicians and families don’t recognise their world in your solution, they won’t trust it. And if they don’t trust it, adoption will quietly fail.

As we listened, the brief shifted.

It evolved from:

“We need to reduce waiting times”

to something more precise:

“How might we create a Single Point of Access that makes sense to families and professionals – and doesn’t just move the bottleneck somewhere else?”


What actually changed

At the heart of the new model is a Single Point of Access (SPA).

Instead of referrals disappearing into three separate systems, there is now one multidisciplinary front door.

Behind that front door, we co-designed:

  • A central review team, staffed from across services, who see the same information and make decisions together.

  • Shared care pathways so everyone knows who does what, in what order, based on the child’s needs.

  • A more consistent way of collecting information from families so decisions are based on richer, clearer data.

There’s no glossy consumer app.

Most of the work is about roles, conversations, governance and forms.

In other words: it’s about deciding, deliberately, where the complexity should live.

And the results?

  • Referral times have dropped from around 23 weeks to 3 weeks.

  • Duplicate referrals have been eliminated.

  • Clinicians are spending up to 60% less time on admin, and more on actual care.

  • The model is now being rolled out across the Midlands Integrated Healthcare Area and is informing emerging national policy on children’s services.

All of that happened before AI entered the picture.


So where does AI fit?

AI is genuinely powerful. It can:

  • Summarise complex information

  • Spot patterns in large datasets

  • Draft documents and options at speed

But it cannot decide:

  • Which problems are worth solving

  • Which trade-offs are acceptable

  • Who should carry which risks

In CHO8, if we had dropped AI into the old system, we might have:

  • Automated a broken referral process

  • Given biased or incomplete data a misleading aura of ‘intelligence’

  • Built dashboards that made leaders feel in control without changing reality for families

After the SPA was in place, something else became possible.

With clearer pathways, a central review team and better data, you can have a sensible conversation about how AI might:

  • Support triage decisions

  • Reduce remaining admin burden

  • Surface risks earlier

The technology didn’t change.

What changed was the quality of the underlying system and the clarity of intent.

That, to me, is the essence of designing with discernment.

AI doesn’t give us discernment.

It reveals whether we used it in the first place.


Three questions for leaders of change

If you’re in a change, transformation or innovation role – especially in health or public services – I’d offer three simple questions we used in CHO8:

  1. Are we clear whose lived experience we’re trying to change?

  2. Have the people who live and work in this system helped design the change?

  3. Are we prepared to stay with this long enough to change the system, not just the slide deck?

If the answer to those questions is yes, then AI can be a powerful ally.

If the answer is no, then no amount of clever technology will save us from ourselves.


An invitation

We’re not short of intelligence – human or artificial.

What’s scarce is discernment: the willingness to slow down just enough to see where the complexity really lives, and to choose carefully who has to carry it.

That’s what we tried to practise with children’s services in CHO8.

It’s what I spoke about at Make It Right.

And it’s the work I believe leaders in health and public services are being called to do now.

Make the future real.

Big or small, every idea starts with a conversation.