We were supposed to catch up with him after Budi’s wedding in March 2025. We caught up almost a year later. I am ridiculously horrendous at these things, I know.
It was a weekday afternoon in a dimly lit restaurant. Budi and I sat next to each other, across from a cinema owner whose family runs theaters across Java. He exudes an air of calmness all around him. So calm that I myself got nervous. But then I told myself, “get yourself together, Toby. You got this”.
And I was genuinely eager to learn about his experience in the rapidly changing entertainment industry. I have been seeing more and more people (including myself) consuming AI-produced entertainment, and I figured someone running cinemas across Java — competing against the biggest player in the market — would have a sharp perspective on what’s happening.
That is, until he said something absolutely shocking.
His eyes lit up, almost like someone about to announce the birth of a first son.
“Guys! I am using the terminal on my computer for the first time to build an application for my business.”
You can imagine this being said in the most calm yet energetic manner.
I know for a fact that this person is not technical. Furthermore, he does not at all have to concern himself with how to use AI to build applications. Yet here he was, showing us a web application he had built, hosted on GitHub. He built an interface that allows his company to, ahem, trade secret ;)
He demo-ed it to us with the kind of pride you see in someone who has discovered something genuinely new about what they’re capable of.
He even recommended a podcast to Budi and me: Naval Ravikant’s Motorcycle of the Mind. His message was simple. Use these tools. Use them aggressively. Don’t wait.
So his enthusiasm naturally left me wondering. He was someone with every reason to be defensive about what AI means for his industry. But he’s excited. This cinema owner sees AI as a revolution, while many others see it as the mother of all job and industry destroyers.
Why is this? How can two intelligent people look at the same tech and arrive at completely opposite emotional responses?
So I reflected deeper and remembered something I had been reading.
The narratives that we’ve been consuming
But before making sense of the optimism around AI, let’s make sense of the fear first. The book I have found most useful here is Robert Shiller’s Narrative Economics.
Shiller needs no introduction; the Yale man is a Nobel Prize winner in economics. Let me flex a bit because I cannot help it: I also went to Yale. For a semester exchange. But hey, stop judging. Pauli Murray College, woop woop. Ok, I am done.
The fascinating thing about the book is that it doesn’t sound like the economics we learned in high school. Scarcity, unlimited wants. Demand and supply. GDP, inflation. It doesn’t even look like the economics we observe on social media lately, which is mostly game theory.
The book looks like the spread of diseases. Yes, epidemiology.
The central argument that Shiller makes is that
economic narratives - the stories people tell each other about money, jobs, markets, the future - spread exactly like diseases.
They have contagion and recovery rates, they mutate, they go dormant for decades, and they flare up again when conditions are ripe. Most importantly, they influence economic behaviour in ways that conventional economic data struggle to explain.
The part that is most striking to me is what Shiller describes as “perennial narratives”: stories that keep coming back across centuries. Every time one comes back, the specific stories told are different, but the underlying narrative is the same. That narrative: machines will replace human labor.
Shiller traces it all the way back to the 1800s. The Luddites smashing power looms in 1811. The Swing Riots against threshing machines in the 1830s. The depression of the 1870s, when labor-saving inventions were top of mind as a cause of mass unemployment. The 1930s, when “technological unemployment” started becoming a household term. Want to guess what the terrifying new machine was back then? The dial telephone.
What I have noticed is that the narrative has the same structure. Every time.
A new technology appears,
people fear it will destroy livelihoods on a massive scale.
The story spreads not necessarily because it is true, but because it taps into the very core of our emotions: anxiety. Every time, it mutates just ever so slightly so that we can say this time is completely brand new. This time is different. This time humans are going extinct for real.
Well, I believe AI is the latest mutation.
But what makes this particular mutation unusual, and this, I believe, is relevant to how we are all feeling about AI, is something Shiller calls “constellation” effects. Narratives don’t operate alone; they cluster. Right now the “AI will take your job” narrative is running alongside the AI bubble narrative and the AGI narrative. They stack on top of each other, and each one makes the others more contagious.
If you want to see what a narrative constellation looks like in practice, look at what happened with Citrini Research a few weeks ago. They published a thought experiment on Substack imagining what the economy might look like in 2028 if AI displacement accelerates. I feel that what made it so contagious was not any single claim.
It was a brilliant piece that wove together:
the labor replacement narrative,
the bubble narrative,
the financial contagion narrative,
and the knowledge worker being automated narrative
into one vivid, emotionally coherent story.
As a result, markets moved quite wildly. Software stocks went into free fall. Bloomberg even linked the selloffs to the report. Let me repeat that one more time: a thought experiment on Substack shook Wall Street. Not because it contained new data, but because it bundled fears that were already circulating into a single story that hit harder than any of them could have alone.
Well folks, I suppose that is the constellation effect at work.
So if you are feeling anxious about AI without knowing clearly why, this might be it. It is not one story that affects us; it is a cluster of narratives hitting us from every direction. Social media, the dinner table, cocktail parties, family gatherings.
Now, what’s the point of all this? It does not mean that the fear is wrong. The fear inside us is very real. The point, rather, is to see the structure of the fear more clearly. Once we see the structure, we are no longer just inside the narrative; we are observing it.
Like a person watching a movie at the cinema, noticing not just the story in front of them but how beautifully the actors play a scene. How well set up the lighting is. How crafty the scriptwriter is in playing with our emotions.
Note that I am not at all sharing this to trivialise the gravity of what is currently happening. I believe that disruption is very real and is underway. I see it in my own work every day and I imagine you see it in yours. But there is a difference between experiencing disruption and being swept up in a narrative about disruption.
Shiller helped me see that difference, and I hope he helps you see it too. Because it has become easier for me to ask a more useful question. Instead of asking
“should I be scared or excited”
we can nudge ourselves to ask
“what’s in front of me now, and what can I do with it?”
The cinema owner has been doing just that, genuinely curious about how he can use new technology to his advantage. He probably does not need to read Shiller. He seems to be asking the right questions already.
I have been learning a lot about what happens after you ask that question too. About what it actually looks like to work with AI in my line of work.
But first, that would involve Claude and an 80-year-old man (almost).
To be continued


