The Next Data Centre: Your Phone

The history of computing has been a constant shift of the centre of gravity.

When mainframe computers were invented in the middle of the last century, they were housed in air-conditioned, room-sized metal boxes that occupied thousands of square feet. People accessed these computers through dumb terminals, which were essentially black-and-white screens and keyboards hooked to the mainframe through long cables. They were called dumb terminals because all the smarts lived on the mainframe.

These computers worked in silos. Computer networks were very primitive. Data was mainly transferred through (physical!) punch cards and tapes.

The business model was selling hardware. During that era, giants like IBM and Wang emerged, and many subsequently submerged.

Hardware was the champion.

Mainframe computers in the 50s. Image source: Wikipedia

The PC era, which started in the 80s and accelerated in the 90s, ended the reign of the mainframe. As computers became much faster while prices dropped by orders of magnitude, access to computing became democratized, and a computer appeared on every desktop. We wanted these computers to talk to each other, and punch cards clearly no longer scaled now that there were millions of machines. As a result, LANs (local area networks) were popularized by companies like Novell, enabling the client/server architecture. Unlike the previous era, the “brains” were decentralized, with clients doing much of the heavy lifting. Servers still played a role, but mostly for centralized storage.

Although IBM invented the PC, the business model shifted, creating the duopoly of Intel (and, by association, companies like Compaq) and Microsoft, with the latter capturing even more value than the former. The software era had begun.

Software became the champion. Hardware was dethroned to the runner-up.

Then, in the late 90s to the 2010s, the (broadband) web, mobile, and cloud computing came along. Connectivity became much less of an issue. Clients, especially your phones, continued to improve at a fast pace, but the capability of servers increased even faster. The “brains” shifted back to the server as that’s where the data is centralized. For the most part, clients were now responsible for user experience, important but merely a means to an end (of collecting data) rather than an end in themselves.

Initially, it appeared that the software-hardware duopoly would continue as companies like Netscape and Cisco were red hot, only to be dethroned by companies like Yahoo and AOL and later Google and Meta. Software and hardware were still crucial, but they became the enablers as the business model once again shifted.

Data became the newly crowned champion.

Fast forward to now: the latest—and arguably the greatest of all time—platform shift, powered by generative AI, is upon us. The ground beneath us is shifting again. On a per-user basis, generative AI demands orders of magnitude more energy. At a time when data centres already consume more energy than many countries, their consumption is set to double again within two years, to roughly the equivalent of Japan's entire electricity use. The lean startup era is gone. AI startups need to raise much more capital upfront than previous generations of startups because of the enormous cost of compute.

Expecting the servers in data centres to do all the heavy lifting is not sustainable in the long term, for many reasons. The “brains” have once again started to shift back to the clients at the edge, and it is already happening. For instance, Tesla's self-driving decisions do not make the round trip to its servers; otherwise, the latency would make those split-second decisions a second too late. Another example: most people may not realize it, but Apple is already an edge computing company, as its chips have had on-device AI capabilities for years. Imagine how much more developers could do on your iPhone—at no cost to them—instead of paying a cloud provider to run AI. That would be the Napster moment for AI companies!
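The latency point can be made concrete with a rough back-of-envelope sketch. Every number below is an illustrative assumption (distances, inference times, network overhead), not a measurement of any real system:

```python
# Back-of-envelope latency budget: cloud round trip vs. on-device inference.
# All figures are illustrative assumptions, not measurements.

SPEED_OF_LIGHT_FIBRE_KM_S = 200_000  # light travels at roughly 2/3 c in fibre


def cloud_round_trip_ms(distance_km, server_inference_ms, network_overhead_ms=20):
    """Time to send sensor data to a distant data centre and get a decision back."""
    propagation_ms = 2 * distance_km / SPEED_OF_LIGHT_FIBRE_KM_S * 1000
    return propagation_ms + network_overhead_ms + server_inference_ms


def on_device_ms(local_inference_ms):
    """Time to make the same decision locally on the edge device."""
    return local_inference_ms


# A vehicle 1,500 km from the nearest data centre, needing a decision well
# under ~100 ms: propagation alone adds 15 ms before any queuing or retries.
cloud = cloud_round_trip_ms(distance_km=1500, server_inference_ms=30)
edge = on_device_ms(local_inference_ms=25)

print(f"cloud round trip: {cloud:.0f} ms")
print(f"on-device:        {edge:.0f} ms")
```

Even with these generous assumptions (no congestion, no packet loss, no retries), the physics of the round trip eats a large share of a split-second decision budget, which is why safety-critical inference stays at the edge.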

Inevitably, now that almost every device can run some AI and is connected, things will be more decentralized.

In past eras, computing architectures evolved due to the constraints of—or the liberation of—computing capabilities, connectivity, or power consumption. The landscape has once again shifted. Like past platform shifts, there will be a new world order. The playing field will be levelled. Rules will be rewritten. Business models will be reinvented. Most excitingly, new giants will be created.

Every. Single. Time.

Seeing the future is our superpower. That's why, a while ago at Two Small Fish Ventures, we revised our thesis. Now it is all about investing in the next frontier of computing and its applications, with edge computing an important part of it. Our recent investments have been all-in on this thesis. If you are a founder of an early-stage, rule-rewriting company taking advantage of this massive platform shift, don't hesitate to reach out. We love backing category creators in massive market opportunities.

We are all engineers, product builders and company creators. We know how things work. Let’s build the next champion together!

Update: This blog post was published just before Apple announced Apple Intelligence. I knew nothing about Apple Intelligence at that time. It was purely a coincidence. However, it did validate what I said.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.
