Last week, two things happened on opposite sides of the world that should matter to anyone who makes things for a living.

In the US, the Supreme Court quietly declined to hear a case called Thaler v. Perlmutter. Stephen Thaler, a computer scientist, had been trying for years to copyright an image his AI generated. No human involvement. Pure machine output. He listed the AI as the author. The Copyright Office said no. A district court said no. An appeals court said no. And now the Supreme Court has said: we’re not even going to discuss it.

Four days later, the UK government announced it was shelving its planned changes to AI copyright law. They’d been trying to make it easier for AI companies to train on copyrighted work. The creative industries pushed back hard enough that ministers went back to the drawing board. No legislation expected until next year at the earliest.

Here’s what caught me about both of these: the law is finally catching up to something creatives have felt in their bones for a while now.

The human part is the part that matters.

Not because the machine output is bad. It’s often pretty good. A study published in January tested AI against over 100,000 humans on creativity measures. The AI outperformed the average person. That’s not nothing. But the top 10 percent of creative humans? Still out ahead. And the gap widened the higher up the scale you went.

I think about this in my own work all the time. I run a creative agency. We make films, brand work, documentaries. And yes, we use AI tools constantly. I have an AI system that helps me manage my inbox, prep for meetings, draft briefs, research. I use generative tools for concepting and pre-visualisation. I’m not anti-technology. I built half these systems myself.

But here’s what I keep coming back to: the tool doesn’t know what it’s making. It doesn’t know why.

When I’m sitting in an edit suite at 11pm, deciding whether to hold on a subject’s face for two more seconds or cut away, that decision isn’t computational. It’s instinct shaped by thousands of hours of watching, failing, rewatching, failing differently. It’s knowing that the audience needs to sit with the discomfort for a beat longer. It’s feeling the rhythm of someone’s story in your chest.

No model does that. Not yet. Maybe not ever. But definitely not yet.

On a podcast we recorded recently, I used a line that I keep thinking about: “AI is just another chisel. Your craft isn’t the chisel.” I stole the Bezalel story from Exodus for that talk. First person in the Bible described as being filled with the Spirit, and it wasn’t for preaching or prophecy. It was for artistry. Metalwork, stone cutting, woodcarving. The hands mattered. The skill mattered. The filling was for the craft.

That feels relevant right now.

Because the temptation for a lot of creatives is to either reject the tools entirely or surrender to them completely. Neither works. The first is nostalgia cosplaying as principle. The second is efficiency cosplaying as creativity.

The real position, the harder one, is somewhere in the middle. Use the tools. Direct them. But know what only you can bring. Know where your fingerprint goes. Know what the machine is for and what it isn’t for.

The courts seem to agree, at least for now. If no human was meaningfully involved, it’s not protectable work. It’s output, not authorship.

I find that oddly encouraging. Not because I want to gatekeep creativity. But because it means the system still recognises something that’s easy to forget when you’re surrounded by generated content: the human involvement isn’t a nice-to-have. It’s the whole point.

The chisel doesn’t get credit for the sculpture. It never did.

And for those of us who still care about making things with intention, with craft, with something at stake? That’s good news.

What’s the thing only you can bring to your work? The part no tool can replicate? I’d love to hear it. Just hit reply.