Anna had the life she’d built for herself.
At 23 she decided she was going to be an entrepreneur. Not just any entrepreneur — the kind that other women looked up to. She built a coaching business in the Netherlands targeting ambitious female founders, selling high-end one-on-one programs, ebooks, group trajectories. She believed in what she was selling. She taught her clients that anything was possible if you wanted it badly enough. That you could manifest the life you wanted. That ambition was enough.
She was good at it. For over a decade, it worked.
Then, somewhere around 2024, the high-ticket programs stopped selling. Not all at once. Gradually. A client who would have signed up for a three-month trajectory instead said she needed to think about it. Then said no. Then another. Then another. Anna started noticing a pattern. Her potential clients weren’t going to a competitor. They weren’t hiring a different coach.
They were opening ChatGPT.
She’s not wrong. And she’s not bitter about it either. What strikes me about Anna’s story is how clearly she saw what was happening before most people in her industry were willing to admit it. She didn’t wait to be displaced. She watched the floor disappear beneath her business and decided to step off before she fell.
But what she stepped into surprised even her.
The identity underneath the business
At 36, Anna found herself quietly reassessing more than just her career.
The identity she’d built — the ambitious female entrepreneur, the manifestation coach, the woman who had figured it out — was unravelling. Not because of AI alone, but because life had simply not gone the way she’d planned. The relationship she’d expected. The version of herself she’d been teaching others to become. Looking back, she told me, she’d spent her thirties selling a philosophy she was starting to seriously doubt.
As a single woman in her mid-thirties, she did what many do. She got a dog.
Specifically, she got a Belgian Malinois. For those unfamiliar: a Belgian Malinois is essentially a German shepherd that someone has given an espresso and a mission. They are working dogs bred for police and military service. They are not, as Anna discovered immediately, a sensible choice for a first-time owner.
“It’s a herder on speed,” she says, laughing. “An absolute disaster as a first dog. But also — I loved training her. That’s where the idea came from.”
Work that AI cannot touch
Anna is now retraining as a dog trainer.
Not online. In person. With real dogs, real owners, real muddy fields, and animals that do not respond to positive affirmations or a well-structured PDF.
She is not worried about AI taking this from her.
There’s something quietly radical about that choice. After over a decade of building a business that existed entirely online — that could theoretically be run from anywhere, that scaled through content and programmes and digital products — she is deliberately choosing work that requires a body. Work that requires presence. Work that AI cannot replicate because it requires being there.
What the data confirms
I’ve been thinking about Anna’s story alongside the data from the Anthropic research paper that this site is built on. The researchers found that the jobs least exposed to AI displacement are precisely the ones that require physical presence, human judgment in real environments, and the kind of trust that only builds face to face. Dog trainers don’t appear in the most at-risk lists. They probably never will.
But what strikes me most isn’t the career change. It’s the identity shift underneath it.
Anna spent thirteen years selling a particular version of success to other women. Ambition. Manifestation. The belief that if you wanted something enough and worked hard enough, it would come. She was good at selling it partly because she believed it.
She doesn’t fully believe it anymore. And instead of pretending she does, she is doing something harder and more interesting: building a new life from what’s actually in front of her. A difficult dog. A new skill. Work that exists in the physical world.
She asked me whether I thought she’d made the right decision.
I told her I thought she’d made the honest one. Which is rarer.
Check your own risk
Is your field at risk?
Enter your occupation and see your AI automation risk score, based on Oxford Martin School and Anthropic Economic Index data covering 758 occupations.
Check my job risk →