
In the history of human civilization, there have been several distinct ages: the Agricultural Age, the Industrial Age, and the Information Age, which we are living in now.
Within each age, there are different eras, each marked by a drastic drop in the cost of a fundamental “atomic unit.” These cost collapses triggered enormous increases in demand and reshaped society by changing human behaviour at scale.
From the late 1970s to the 1990s, the invention of the personal computer drastically reduced the cost of computing [1]. A typical CPU in the early 1980s cost over a hundred dollars and ran at just a few MHz. By the mid-1990s, processors were roughly 20x faster at a comparable price, unlocking entirely new possibilities like spreadsheets and graphical user interfaces (GUIs).
Then, from the mid-1990s to the 2010s, came the next wave: the Internet. It brought a dramatic drop in the cost of connectivity [2]. Bandwidth, once prohibitively expensive, fell by several orders of magnitude: from over $1,200 per Mbps per month in the 1990s to less than a penny per gigabyte today. This enabled browsers, smartphones, social networks, e-commerce, and much of the modern digital economy.
From the mid-2010s to today, we’ve entered the era of AI. This wave has rapidly reduced the cost of intelligence [3]. Just two years ago, generating a million tokens with a large language model cost over $100. Today, it’s under $1. This collapse has enabled applications like facial recognition in photo apps, (mostly) self-driving cars, and, most notably, ChatGPT.
These three eras share more than just timing. They follow a strikingly similar pattern:
First, each era is defined by a core capability: computing, connectivity, and intelligence, respectively.
Second, each unfolds in two waves:
- The initial wave brings a seemingly obvious application (though often obvious only in hindsight), such as spreadsheets, browsers, or facial recognition.
- Then, typically a decade or so later, a magical invention emerges: one that radically expands access and shifts behaviour at scale. Think of the GUI (which freed us from the command line), the iPhone (leapfrogging flip phones), and now ChatGPT.
Why does this pattern matter?
Because the second-wave inventions are the ones that lower the barrier to entry, democratize access, and reshape large-scale behaviour. The first wave opens the door; the second wave throws it wide open. It’s the amplifier that delivers exponential adoption.
We’ve seen this movie before. Twice already, over the past 50 years.
The cost of computing dropped, and it transformed business, productivity, and software.
Then the cost of connectivity dropped, and it revolutionized how people communicate, consume, and buy.
Now the cost of intelligence is collapsing, and the effects are unfolding even faster.
Each wave builds on the last. The Internet era evolved faster than the PC era because it leveraged the PC era’s computing infrastructure. AI is moving faster still because it sits atop both computing and the Internet. The acceleration is not happening in isolation; it’s compounding.
If it feels like the pace of change is increasing, it’s because it is.
Just look at the numbers:
- Windows took over 2 years to reach 1 million users.
- Facebook got there in 10 months.
- ChatGPT did it in 5 days.
These aren’t just vanity metrics; they reflect the power of each era’s cost collapse to accelerate mainstream adoption, as the quick back-of-the-envelope sketch below shows.
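To make the acceleration concrete, here is a minimal Python sketch that converts the milestones above into days and computes rough speedup ratios. The figures are the ones quoted in this post; “over 2 years” and “10 months” are treated as approximate calendar durations, so the ratios are rough by construction.

```python
# Time to reach 1 million users, using the figures quoted above.
milestones_days = {
    "Windows": 2 * 365,   # "over 2 years" -> ~730 days
    "Facebook": 10 * 30,  # "10 months"    -> ~300 days
    "ChatGPT": 5,         # 5 days
}

baseline = milestones_days["Windows"]  # slowest of the three
for product, days in milestones_days.items():
    print(f"{product:>8}: {days:4d} days (~{baseline / days:.0f}x faster than Windows)")
```

Even with generous rounding, ChatGPT hit the milestone roughly 150x faster than Windows did.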
That’s why it’s no surprise (in fact, it’s crystal clear) that the current AI platform shift is larger than any previous technological shift. It will create enormous new economic value, shift wealth away from many incumbents, and open up extraordinary investment opportunities.
That’s why the succinct version of our thesis is:
We invest in the next frontier of computing and its applications, which reshape large-scale behaviour, are driven by the collapsing cost of intelligence, and are defensible through tech and data moats.
(Full version here).
The race is already on. We can’t wait to invest in the next great thing in this new era of intelligence.
Super exciting times ahead indeed.
P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!
Footnotes
[1] Cost of Computing
In 1981, the Intel 8088 CPU (used in the first IBM PC) had a clock speed of 4.77 MHz and cost ~$125. By 1995, the Intel Pentium ran at 100+ MHz and cost around $250: roughly 20x the clock speed for about twice the price. Today’s chips are thousands of times faster and, on a per-operation basis, orders of magnitude cheaper.
[2] Cost of Connectivity
In 1998, bandwidth cost over $1,200 per Mbps per month. By 2015, that figure had dropped below $1. As of 2024, cloud bandwidth pricing can be less than $0.01 per GB: a drop of nearly 100,000x over 25 years (the units differ, but the scale of the collapse is unambiguous).
[3] Cost of Intelligence
In 2022, generating 1 million tokens via OpenAI’s GPT-3.5 could cost $100+. In 2024, it costs under $1 using GPT-4o or Claude 3.5, with faster performance and higher accuracy — a 100x+ reduction in under two years.
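If you want to sanity-check the arithmetic, here is a minimal Python sketch that turns the like-for-like figures in these footnotes into rough cost-collapse multiples. The 2024 $/GB bandwidth figure is left out of the ratio because its units differ from the 1998 baseline; everything below is an order-of-magnitude illustration, not a benchmark.

```python
# Rough cost-collapse multiples per era, computed from the footnote figures.
eras = {
    # era: (metric, old cost, new cost)
    "Computing":    ("$ per MHz, 1981 vs 1995",        125 / 4.77, 250 / 100),
    "Connectivity": ("$ per Mbps/month, 1998 vs 2015", 1200,       1),
    "Intelligence": ("$ per 1M tokens, 2022 vs 2024",  100,        1),
}

for era, (metric, old, new) in eras.items():
    print(f"{era:>12} ({metric}): ~{old / new:,.0f}x cheaper")
```

On these numbers, each era delivers at least a 10x collapse in its core unit cost, with connectivity closer to 1,000x; that collapse is exactly the pattern this essay is built on.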
This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.