Portfolio Highlight: Zinite. Speed and Energy, Two Birds, One Stone

For most of semiconductor history, progress was a simple loop. Shrink transistors. Fit more into the same area. Get faster compute as a byproduct.

That loop had a name: Moore’s Law. It traces back to Intel co-founder Gordon Moore, who observed in 1965 that the number of transistors on a chip, and hence its capabilities, tended to double at a steady cadence (roughly every two years in his later formulation). The industry turned that observation into a roadmap. It was never guaranteed to run forever. Now shrinking is harder: we are running into physical and economic limits, and the cost of pushing the frontier keeps rising.

So if the curve is going to keep bending upward, the industry needs new scaling vectors beyond making everything smaller in two dimensions.

This is why Two Small Fish invested in Zinite in 2021 at the company’s inception. The thesis was simple then, and it is still simple now: scale in the third dimension, using patented, proprietary technology to enable true 3D chips.

Zinite deliberately stayed in stealth early on, focused on building the core technology and protecting it properly before saying too much. Five years after we invested, we can finally talk about it more openly.

The company is led by its CEO, Dr. Gem Shoute. Fun fact: her breakthrough was compelling enough that her professors, Dr. Doug Barlage and Dr. Ken Cadien, industry veterans who helped create fundamental IP used in all chips made since 2008, joined her as co-founders.

The Distance Tax

In a recent blog post, I used a factory analogy to explain why speed, latency, and energy are often bottlenecked by movement, not necessarily arithmetic. 

In short, systems don’t lose speed because they can’t do math. GPUs are already very good at that. They lose speed because they can’t feed the math with data fast enough.

In many systems, moving data costs far more than doing the arithmetic. When movement is expensive, speed and energy efficiency get worse together.

AI inference exacerbates the problem because its workloads put a premium on memory behaviour. In many cases, the limiting factor is not arithmetic; it is how efficiently the system can move data. Bringing memory closer to logic matters because it directly reduces that movement.
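To make the distance tax concrete, here is a minimal back-of-the-envelope sketch in Python. The hardware figures and the workload (a generic large-model decode step) are illustrative assumptions, not the specs of any particular chip or of Zinite’s technology:

```python
# Back-of-the-envelope roofline check for one decode step of a large language
# model, which is dominated by a big matrix-vector multiply. All hardware
# numbers are illustrative assumptions, not specs of any real chip.

PEAK_FLOPS = 300e12      # assumed peak compute: 300 TFLOP/s
MEM_BANDWIDTH = 2e12     # assumed memory bandwidth: 2 TB/s

def decode_step_times(weight_bytes: float, flops: float) -> tuple[float, float]:
    """Return (compute-limited time, memory-limited time) in seconds."""
    return flops / PEAK_FLOPS, weight_bytes / MEM_BANDWIDTH

# A 7B-parameter model in 16-bit weights: each generated token must stream
# roughly 14 GB of weights past the compute units.
weight_bytes = 7e9 * 2
flops = 2 * 7e9          # ~2 floating-point ops per parameter per token

compute_time, memory_time = decode_step_times(weight_bytes, flops)
print(f"compute-limited: {compute_time * 1e3:.3f} ms per token")  # ~0.05 ms
print(f"memory-limited:  {memory_time * 1e3:.3f} ms per token")   # ~7 ms
```

Under these assumptions the step is memory-limited by roughly two orders of magnitude: the math is nearly free, and almost all of the time goes to moving bytes. That is the regime in which shortening the distance data travels pays off directly.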

Sensing fits in the same frame as logic and memory. Sensors generate raw data at high volume. If the system’s first step is to ship raw data far away before anything useful happens, it pays in bandwidth, latency, and power. The more intelligence that can happen closer to where data is produced, the less the system wastes just transporting information.

So the distance tax is one big problem showing up in three places at once. Logic. Memory. Sensing.

Why 3D Matters for Speed and Energy

When people hear 3D chips, they think density. More transistors per area. That matters. The bigger lever is proximity. Current 3D approaches that deliver more performance per area rely on advanced packaging, which is hindered by cost and still subject to the distance tax.

If memory can live closer to logic, the system avoids transfers that dominate both performance and power. If compute and memory can sit closer to sensing, the system avoids hauling raw streams around before doing anything intelligent.

Every avoided transfer is a double win. Speed improves because stalls go down and effective bandwidth goes up. Energy improves because fewer joules are burned moving bits instead of doing work.
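To put rough numbers on the energy half of that double win, here is a sketch in the same spirit as the one above. The per-operation energy costs are order-of-magnitude assumptions chosen for illustration (broadly in line with published estimates), not measurements of any particular chip or process:

```python
# Rough energy budget for the same decode step: arithmetic versus data movement.
# All per-operation energies are order-of-magnitude assumptions for illustration.

PJ = 1e-12  # one picojoule, in joules

ENERGY_PER_FLOP = 1 * PJ             # assumed ~1 pJ per floating-point op
ENERGY_PER_BYTE_OFF_CHIP = 100 * PJ  # assumed ~100 pJ per byte fetched off-chip
ENERGY_PER_BYTE_NEARBY = 1 * PJ      # assumed ~1 pJ per byte from nearby memory

weight_bytes = 7e9 * 2   # 7B parameters in 16-bit weights
flops = 2 * 7e9          # ~2 floating-point ops per parameter per token

arithmetic = flops * ENERGY_PER_FLOP
off_chip = weight_bytes * ENERGY_PER_BYTE_OFF_CHIP
nearby = weight_bytes * ENERGY_PER_BYTE_NEARBY

print(f"arithmetic:            {arithmetic:.3f} J per token")   # ~0.014 J
print(f"weights from off-chip: {off_chip:.3f} J per token")     # ~1.4 J
print(f"weights from nearby:   {nearby:.3f} J per token")       # ~0.014 J
# Under these assumptions, hauling the weights from far away burns ~100x more
# energy than the math; keeping them close eliminates most of that cost.
```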

That is the two birds, one stone result.

Five years after we invested, Zinite is far from just a concept. The company is doing exceptionally well, and it represents the kind of platform that can extend performance gains into the post-Moore era: not by asking physics for more shrink, but by making data travel less and reducing the distance tax.

A Day at Ontario Tech University

I spent a full day at Ontario Tech University in Oshawa a few weeks ago. It was my first time on campus, despite it being just over a 40-minute drive from Toronto, where I live. I arrived curious and left with a clearer picture of what they’re building.

Ontario Tech is still a relatively young university, just over two decades old. What’s less well known—and something I didn’t fully appreciate before the visit—is how quickly it has grown in that time, now serving around 14,000 students, and how deliberately it has established itself as a research university rather than simply a teaching-focused institution.

That research orientation shows up not just in output, but in where the university has chosen to build depth—areas that sit close to real systems and real constraints.

This came through clearly in conversations with Prof. Peter Lewis, Canada Research Chair in Trustworthy Artificial Intelligence, whose work focuses on trustworthy and ethical AI. The university has launched Canada’s first School of Ethical AI, alongside the Mindful AI Research Institute, and the work here is grounded in how AI systems behave once deployed—how humans interact with them, and how unintended consequences are identified and managed.

Energy is another area where Ontario Tech has built serious capability. The university is home to Canada’s only accredited undergraduate Nuclear Engineering program, which is ranked third in North America and designated as an IAEA Collaborating Centre. In discussions with Prof. Hossam Gaber, the emphasis was on smart energy systems, where software, sensing, and control systems are developed alongside the physical energy infrastructure they operate within.

I also spent time with Prof. Haoxiang Lang, whose work in robotics, automotive systems, and advanced mobility sits at the intersection of computation and the physical world.

That work is closely tied to the Automotive Centre of Excellence, which includes a climatic wind tunnel described as one of the largest and most sophisticated of its kind in the world. The facility enables full-scale testing under extreme environmental conditions—from arctic cold to desert heat—and supports research that needs to be validated under real operating constraints.

I can’t possibly mention all the conversations I had over the course of the day—it was a full schedule—but I also spent time with Dean Hossam Kishawy and Dr. Osman Hamid, discussing how research, entrepreneurship, and industry engagement fit together at Ontario Tech.

The day also included time at Brilliant Catalyst, the university’s innovation hub, speaking with students and founders about entrepreneurship. I had the opportunity to give a keynote on entrepreneurship, and the visit ended with the pitch competition, where I handed the cheque to the winning team—a small moment that underscored how early many technical journeys begin.

Ontario Tech may be young, but it is already operating with the structure and discipline of a mature research institution, while retaining the adaptability of a newer one.

Thank you to Sunny Chen and the Ontario Tech team for the time, access, and thoughtful conversations throughout the day.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The AI Bubble That Is Not: When Everyone Is All In

At the beginning of this year, I wrote an op-ed for The Globe about what many were already calling the AI bubble. Nearly a year later, almost all of what I said remains true. The piece was always meant to be a largely evergreen, long term view rather than a knee jerk reaction.

The only difference today is that the forces I described back then have only intensified.

We are in a market where Big Tech, venture capital, private equity, and the public markets are all pouring unprecedented capital into AI. But to understand what is actually happening, and how to invest intelligently, we need to separate noise from fundamentals. Here are the five key points:

  1. Why Big Tech Is Going All In While Taking Minimal Risk.
  2. The Demand Side Is Real and Growing.
  3. Not All AI Investments Are Created Equal.
  4. Picking Winners Matters.
  5. Remember, Dot Com Was a Bubble. The Internet Was Not.

1. Why Big Tech Is Going All In While Taking Minimal Risk

The motivations of the large technology companies driving this wave are very different from those of startups and other investors.

For Big Tech, AI is existential. If they underinvest, they risk becoming the next Blockbuster. If they overinvest, they can afford the losses. In practice, they are buying trillions of dollars’ worth of call options, and very few players in the world can afford to do that.

The asymmetry is obvious. If I were one of their CEOs, I would do the same.

But being able to absorb risk does not mean they want to absorb all of it. This is why they are using creative financing structures to shift risk off their balance sheets while remaining all in. At the same time, they strengthen their ecosystems by keeping developers, enterprises, and consumers firmly inside their platforms.

This is not classical corporate investing. Their objective is not just profitability. It is long term dominance.

For everyone outside Big Tech, meaning most of us, understanding these incentives is essential. It helps you place your bets intelligently without becoming roadkill when Big Tech transfers risk into the ecosystem.

2. The Demand Side Is Real and Growing

AI usage is not slowing. It is accelerating.

The numbers do not lie. Almost every metric, including model inference, GPU utilization, developer adoption, enterprise pilot activity, and startup formation, is rising. You can validate this across numerous public datasets. Directionally, people are using AI more, not less. And unlike previous hype cycles, this wave has real usage, real dollars, and real infrastructure behind it.

Yes, there is froth. But there are also fundamentals.

3. Not All AI Investments Are Created Equal

A common mistake is treating AI investing as a single category.

It is not.

Investing in a commoditized AI business in the public markets is very different from investing in a frontier technology startup with a decade long horizon. The former may come with thin margins, weak moats, and hidden exposure to Big Tech’s risk shifting. The latter is where transformational returns come from, if you know how to evaluate whether a company is truly world class, differentiated, and defensible.

Lumping all AI investments together is as nonsensical as treating all public stocks as the same.

4. Picking Winners Matters

In public markets, you can buy the S&P 500 and call it a day. But that index is not random. Someone selected those 500 winners for you.

In venture, picking winners matters even more. It is a power law business. Spray and pray does not work. Most startups will not survive, and only the strongest will break out, especially in an environment as competitive as today.

Thanks to AI, we are in the middle of a massive platform shift. Venture scale outcomes depend on understanding technology deeply enough to see a decade ahead and identify breakout successes before others do. Long term vision beats short term noise. Daily or quarterly fluctuations are simply noise to be ignored.

5. Dot Com Was a Bubble. The Internet Was Not.

The dot com era had dramatic overvaluation and a painful crash, but the underlying technology still reshaped the world. The problem was not the internet. It was timing, lack of infrastructure, and indiscriminate investing in ideas that were either too early or simply bad.

Looking back, the early internet lacked essential components such as high speed access, mobile connectivity, smartphones, and internet payments. Although some elements of the AI stack may still be evolving, many of the major building blocks, including commercialization, are already in place. AI does not suffer from the same foundational gaps the early internet did.

Calling this a bubble as a blanket statement misses the nuance. AI itself is not a bubble. With a decade long view, it is already reshaping almost every industry at an unprecedented pace. Corrections, consolidations, and failures are normal. The underlying technological shift is as real as the internet was in the 1990s.

There is speculation. There are frothy areas. And yet, there are many areas that are underfunded. That is where the opportunities are.

History shows that great venture funds invest through cycles. They invest in areas that will be transformative in the next decade, not the next quarter.

For us, the five areas we focus on, Vertical AI Platforms, Physical AI, AI Infrastructure, Advanced Computing Hardware, and Smart Energy, are the critical elements of AI. Beyond being our expertise, there is another important reason why these categories matter: Bubble or not, they will thrive.

We are not investing in hype, nor in capital intensive businesses where capital is the only moat, nor in companies where technology defensibility is low. As long as we stay disciplined and visionary, and continue to back founders building a decade ahead, we will do well, bubble or not.

After all, there may be multiple macro cycles across a decade. Embrace the bubble.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: Axiomatic

Last year we invested in Axiomatic AI. Their mission is to bring verifiable and trustworthy AI into science and engineering, enabling innovation in areas where rigour and reliability are essential. At the core of this is Mission 10×30: achieving a tenfold improvement in scientific and engineering productivity by 2030.

The company was founded by top researchers and professors from MIT, the University of Toronto, and ICFO in Barcelona, bringing deep expertise in physics, computer science, and engineering.

Since our investment, the team has been heads down executing. Now they’ve shared their first public release: Axiomatic Operators.

What They’ve Released

Axiomatic Operators are MCP servers that run directly in your IDE, connecting with systems like Claude Code and Cursor. The suite includes:

  • AxEquationExplorer
  • AxModelFitter
  • AxPhotonicsPreview
  • AxDocumentParser
  • AxPlotToData
  • AxDocumentAnnotator

Why is this important?

Large Language Models (LLMs) excel at language (as their name suggests) but struggle with logic. That’s why AI can write poetry but often has trouble with math — LLMs mainly rely on pattern matching rather than reasoning.

This is where Axiomatic steps in. Their approach combines advances in reinforcement learning, LLMs, and world models to create AI that is not just fluent but also capable of reasoning with the rigour required in science and engineering.

What’s Next

This first release marks an important step in turning their mission into practical, usable tools. In the coming weeks, the team will share more technical material — including white papers, demo videos, GitHub repositories, and case studies — while continuing to work closely with early access partners.

Find out more on GitHub, including demos, case studies, and everything else you need to make your work days less annoying and more productive: Axiomatic AI GitHub

We’re excited to see their progress. If you’re in science or engineering, we encourage you to give the Axiomatic Operators suite a try: Axiomatic AI.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Jevons Paradox: Why Efficiency Fuels Transformation

In 1865, William Stanley Jevons, an English economist, observed a curious phenomenon: as steam engines in Britain became more efficient, coal use didn’t fall — it rose. Efficiency lowered the cost of using coal, which made it more attractive, and demand surged.

That insight became known as Jevons Paradox. To put it simply:

  • Technological change increases efficiency or productivity.
  • Efficiency gains lead to lower consumer prices for goods or services.
  • The reduced price creates a substantial increase in quantity demanded (because demand is highly elastic).

Instead of shrinking resource use, efficiency often accelerates it — and with it, broader societal change.

Coal, Then Light

The paradox first appeared in coal: better engines, more coal consumed. Electricity followed a similar path. Consider lighting in Britain:

| Period | True price of lighting (per million lumen-hours, £2000) | Change vs. start | Per-capita consumption (thousand lumen-hours) | Change vs. start | Total consumption (billion lumen-hours) | Change vs. start |
| --- | --- | --- | --- | --- | --- | --- |
| 1800 | £8,000 | | 1.1 | | 18 | |
| 1900 | £250 | ↓ ~30× | 255 | ↑ ~230× | 10,500 | ↑ ~500× |
| 2000 | £2.5 | ↓ ~3,000× (vs. 1800) / ↓ ~100× (vs. 1900) | 13,000 | ↑ ~13,000× (vs. 1800) / ↑ ~50× (vs. 1900) | 775,000 | ↑ ~40,000× (vs. 1800) / ↑ ~74× (vs. 1900) |

Over two centuries, the price of light fell 3,000×, while per-capita use rose 13,000× and total consumption rose 40,000×. A textbook case of Jevons Paradox — efficiency driving demand to entirely new levels.
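As a quick sanity check on how elastic lighting demand must have been, here is a one-line calculation in Python using the table’s rounded figures. It treats the two-century change as if price were the only driver (ignoring income and population growth), so the result is only an illustrative implied elasticity:

```python
import math

# Implied price elasticity of lighting demand from the rounded table figures.
# A back-of-the-envelope illustration, not a rigorous econometric estimate.

price_drop = 3_000         # price of light fell roughly 3,000x from 1800 to 2000
consumption_rise = 40_000  # total consumption rose roughly 40,000x over the same span

# With constant-elasticity demand, Q is proportional to P^(-e):
elasticity = math.log(consumption_rise) / math.log(price_drop)
print(f"implied elasticity of demand: {elasticity:.2f}")  # ~1.3

# Any value above 1 means total use of the resource grows as it gets cheaper:
# the Jevons effect, captured in one line of arithmetic.
```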

Computing: From Millions to Pennies

This pattern carried into computing:

| Year | Cost per Gigaflop | Notes |
| --- | --- | --- |
| 1984 | $18.7 million (~$46M today) | Early supercomputing era |
| 2000 | $640 (~$956 today) | Mainstream affordability |
| 2017 | $0.03 | Virtually free compute |

That’s a 99.99%+ decline. What once required national budgets is now in your pocket.

Storage mirrored the same story: by 2018, 8 TB of hard drive storage cost under $200 — about $0.019 per GB, compared to thousands per GB in the mid-20th century.

Connectivity: Falling Costs, Rising Traffic

Connectivity followed suit:

| Year | Typical Speed & Cost per Mbps (U.S.) | Global Internet Traffic |
| --- | --- | --- |
| 2000 | Dial-up / early DSL (<1 Mbps); ~$1,200 | ~84 PB/month |
| 2010 | ~5 Mbps broadband; ~$25 | ~20,000 PB/month |
| 2023 | 100–940 Mbps common; ↓ ~60% since 2015 (real terms) | >150,000 PB/month |

(PB = petabytes)

As costs collapsed, demand exploded. Streaming, cloud services, social apps, mobile collaboration, IoT — all became possible because bandwidth was no longer scarce.

Intelligence: The New Frontier

Now the same dynamic is unfolding with intelligence:

| Year | Cost per Million Tokens | Notes |
| --- | --- | --- |
| 2021 | ~$60 | Early GPT-3 era |
| 2023 | ~$0.40–$0.60 | GPT-3.5 scale models |
| 2024 | < $0.10 | GPT-4o and peers |

That’s a two-order-of-magnitude drop in just a few years. Unsurprisingly, demand is surging — AI copilots in workflows, large-scale analytics in enterprises, and everyday generative tools for individuals.

As we highlighted in our TSF Thesis 3.0, cheap intelligence doesn’t just optimize existing tasks. It reshapes behaviour at scale.

Why It Matters

The recurring pattern is clear:

  • Coal efficiency fueled the Industrial Revolution.
  • Affordable lighting built electrified cities.
  • Cheap compute and storage enabled the digital economy.
  • Low-cost bandwidth drove streaming and cloud collaboration.
  • Now cheap intelligence is reshaping how we live, work, and innovate.

As we highlighted in Thesis 3.0:

“Reflecting on the internet era… as ‘the cost of connectivity’ steadily declined, productivity and demand surged—creating a virtuous cycle of opportunities. The AI era shows remarkable parallels. AI is the first technology capable of learning, reasoning, creativity… Like connectivity in the internet era, ‘the cost of intelligence’ is now rapidly declining, while the value derived continues to surge, driving even greater demand.”

The lesson is simple: efficiency doesn’t just save costs — it reorders economies and societies. And that’s exactly what is happening now.

If you are building a deep tech early-stage startup in the next frontier of computing, we would love to hear from you. This is a generational opportunity: both traditional businesses and entirely new sectors are being reshaped, and white-collar jobs and businesses, in particular, will not be the same.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Announcing Our Investment in FUTURi Power: The Last Dumb Box in Our Home Gets a Brain

For nearly 70 years, the home electrical panel has looked the same. Meanwhile, the home itself is transforming: solar on the roof, batteries in the garage, heat pumps, EVs in the driveway, and smart appliances and devices everywhere.

And yet, the panel? Still the same. It is the last dumb box left, and FUTURi is fixing that with deep tech.

FUTURi’s Energy Processor

FUTURi Power, founded by Dr. Martin Ordonez (UBC Professor, Kaiser Chair at UBC, and recipient of the King Charles III Coronation Medal for leadership in clean energy innovation), reimagines the panel as the Energy Processor, a programmable energy computer that finally gives the home’s electrical system a brain. It is designed as a like-for-like, future-proof replacement for the traditional panel: it intelligently measures and coordinates loads, avoids peaks, and manages energy use at the edge.

Why This Matters

Homes are no longer passive energy consumers. They are dynamic nodes in the grid. By making the panel intelligent, FUTURi enables:

  • For homeowners: a 100% electric home without costly service upgrades, and a smarter, more resilient, and more efficient energy ecosystem.
  • For utilities: flattened demand peaks, integrated demand response (DR) programs and distributed energy resources (DERs), and deferred capital expenditures.
  • For builders and communities: intelligent electrification that accelerates the deployment of built infrastructure without overloading the grid.

This is why FUTURi and utilities are already collaborating on projects to evaluate how Energy Processors can strengthen the grid and benefit customers.

Our Perspective

As Dr. Martin Ordonez, Founder and CEO of FUTURi Power, puts it: “Panels used to be passive. The Energy Processor is active, safe, and software-defined. It gives homes and grids a common language.”

At TSF, Smart Energy is one of our five focus areas. Our thesis is simple: the cost of intelligence is collapsing, and the biggest opportunities lie where software and hardware come together to reshape behaviour.

FUTURi is exactly that blueprint for intelligent electrification: deep-tech power electronics plus intelligent control. That combination turns a 70-year-old box into the brain of the modern home. Dr. Ordonez and his team are globally recognized experts in electrification who are translating decades of pioneering research into transformative commercial solutions.

And this is just the beginning. There is so much more the company can do to make electricity truly intelligent. FUTURi has a bright future ahead (pun fully intended).

Five Areas Shaping the Next Frontier

The cost of intelligence is dropping at an unprecedented rate. Just as the drop in the cost of computing unlocked the PC era and the drop in the cost of connectivity enabled the internet era, falling costs today are driving explosive demand for AI adoption. That demand creates opportunity on the supply side too, in the infrastructure, energy, and technologies needed to support and scale this shift.

In our Thesis 3.0, we highlighted how this AI-driven platform shift will reshape behaviour at massive scale. But identifying the how also means knowing where to look.

Every era of technology has a set of areas where breakthroughs cluster, where infrastructure, capital, and talent converge to create the conditions for outsized returns. For the age of intelligent systems, we see five such areas, each distinct but deeply interconnected.

1. Vertical AI Platforms

After large language models, the next wave of value creation will come from Vertical AI Platforms that combine proprietary data, hard-to-replicate models, and orchestration layers designed for complex and large-scale needs.

Built on unique datasets, workflows, and algorithms that are difficult to imitate, these platforms create proprietary intelligence layers that are increasingly agentic. They can actively make decisions, initiate actions, and shape workflows. This makes them both defensible and transformative, even when part of the foundation rests on commodity models.

This shift from passive tools to active participants marks a profound change in how entire sectors operate.

2. Physical AI

The past two decades of digital transformation mostly played out behind screens. The next era brings AI into the physical world.

Physical AI spans autonomous devices, robotics, and AI-powered equipment that can perceive, act, and adapt in real environments. From warehouse automation to industrial robotics to autonomous mobility, this is where algorithms leave the lab and step into society.

We are still early in this curve. Just as industrial machinery transformed factories in the nineteenth century, Physical AI will reshape industries that rely on labour-intensive, precision-demanding, or hazardous work.

The companies that succeed will combine world-class AI models with robust hardware integration and build the trust that humans place in systems operating alongside them every day.

3. AI Infrastructure

Every transformative technology wave has required new infrastructure that is robust, reliable, and efficient. For AI, this means going beyond raw compute to ensure systems that are secure, safe, and trustworthy at scale.

We need security, safety, efficiency, and trustworthiness as first-class priorities. That means building the tools, frameworks, and protocols that make AI more energy efficient, explainable, and interoperable.

The infrastructure layer determines not only who can build AI, but who can trust it. And trust is ultimately what drives adoption.

4. Advanced Computing Hardware

Every computing revolution has been powered by a revolution in hardware. Just as the transistor enabled mainframes and the microprocessor ushered in personal computing, the next era will be defined by breakthroughs in semiconductors and specialized architectures.

From custom chips to new communication fabrics, hardware is what makes new classes of AI and computation possible, both in the cloud and on the edge. But it is not only about raw compute power. The winners will also tackle energy efficiency, latency, and connectivity, areas that become bottlenecks as models scale.

As Moore’s Law hits its limit, we are entering an age of architectural innovation with neuromorphic computing, photonics, quantum computing, and other advances. Much like the steam engine once unlocked new industries, these architectures will redefine what is computationally possible. This is deep tech meeting industrial adoption, and those who can scale it will capture immense value.

5. Smart Energy

Every technological leap has demanded a new energy paradigm. The electrification era was powered by the grid. Today, AI and computing are demanding unprecedented amounts of energy, and the grid as it exists cannot sustain this future.

This is why smart energy is not peripheral, but central. From new energy sources to intelligent distribution networks, the way we generate, store, and allocate energy is being reimagined. The idea of programmable energy, where supply and demand adapt dynamically using AI, will become as fundamental to the AI era as packet switching was to the internet.

Here, deep engineering meets societal need. Without resilient and efficient energy, AI progress stalls. With it, the future scales.

Shaping What Comes Next

The drop in the cost of intelligence is driving demand at a scale we have never seen before. That demand creates opportunity on the supply side too, in the platforms, hardware, energy, physical systems, and infrastructure that make this future possible.

The five areas — Vertical AI Platforms, Physical AI, AI Infrastructure, Advanced Computing Hardware, and Smart Energy — represent the biggest opportunities of this era. They are not isolated. They form an interconnected landscape where advances in one accelerate breakthroughs in the others.

We are domain experts in these five areas. The TSF team brings technical, product and commercialization expertise that helps founders build and scale in precisely these spaces. We are uniquely qualified to do so.

At Two Small Fish, this is the canvas for the next generation of 100x companies. We are excited to partner with the founders building in these areas globally, those who not only see the future, but are already shaping it.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Masterclass Series: The Triathlon Rule of Deep Tech Startups

A swimming world champion, a cycling champion, and a marathon champion each tried their hand at a triathlon.

None of them even came close to the podium. All were easily defeated.

Why?

Because the swimming champion could not bike, nor could he run fast.

The cycling champion did not swim well.

The marathon runner was painfully slow in the water.

The winner?

It was someone who had been humbled by the swimming champion in the pool for years, finishing second in the world championships multiple times. He was an exceptional swimmer, yes. However, he could also bike fast and run hard. Not the best in any single discipline, but strong across all three. And that is what won him the race.

The takeaway:

To win in triathlon, you need to be competitive in all three disciplines.

The winner is often world class in one of them, but they must be very good, if not great, at the other two.

This is the same mistake many first time deep tech founders make.

They believe that superior technology alone is enough to win.

It is not.

While technology is crucial, and in fact it is table stakes and the foundation of innovation, it must be transformed into a usable product. If it does not solve a real problem in a way people can adopt and benefit from, its brilliance is wasted.

And even if you have built world class technology and a beautifully crafted product, you are still not done. Without effective commercialization, which includes distribution, pricing, sales, positioning, and partnerships, you will not reach the users or customers who need what you have built.

I wrote more about this in The Three Phases of Building a Great Tech Company: Technology, Product, and Commercialization. Each phase demands different skills. Each must be taken seriously.

Neglecting any one of them is like trying to win a triathlon without training for the bike or the run.

Just like a triathlete must train in all three disciplines, a founder must excel across all three pillars:

  • Great and defensible technology
  • An excellent product
  • Execution on commercialization

You need all three.

That is how you win the world championship.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Computing. Then Connectivity. Then Intelligence. For Half a Century, Cost Collapses Drove Massive Adoption.

In the history of human civilization, there have been several distinct ages: the Agricultural Age, the Industrial Age, and the Information Age, which we are living in now.

Within each age, there are different eras, each marked by a drastic drop in the cost of a fundamental “atomic unit.” These cost collapses triggered enormous increases in demand and reshaped society by changing human behaviour at scale.

From the late 1970s to the 1990s, the invention of the personal computer drastically reduced the cost of computing [1]. A typical CPU in the early 1980s cost hundreds of dollars and ran at just a few MHz. By the 1990s, processors were orders of magnitude faster for roughly the same price, unlocking entirely new possibilities like spreadsheets and graphical user interfaces (GUIs).

Then, from the mid-1990s to the 2010s, came the next wave: the Internet. It brought a dramatic drop in the cost of connectivity [2]. Bandwidth, once prohibitively expensive, fell by several orders of magnitude — from over $1,200 per Mbps per month in the ’90s to less than a penny today. This enabled browsers, smartphones, social networks, e-commerce, and much of the modern digital economy.

From the mid-2010s to today, we’ve entered the era of AI. This wave has rapidly reduced the cost of intelligence [3]. Just two years ago, generating a million tokens using large language models cost over $100. Today, it’s under $1. This massive drop has enabled applications like facial recognition in photo apps, (mostly) self-driving cars, and — most notably — ChatGPT.

These three eras share more than just timing. They follow a strikingly similar pattern:

First, each era is defined by a core capability: computing, connectivity, and intelligence, respectively.

Second, each unfolds in two waves:

  • The initial wave brings a seemingly obvious application (though often only apparent in hindsight), such as spreadsheets, browsers, or facial recognition.
  • Then, typically a decade or so later, a magical invention emerges — one that radically expands access and shifts behaviour at scale. Think GUI (so we no longer needed to use a command line), the iPhone (leapfrogging flip phones), and now, ChatGPT.

Why does this pattern matter?

Because the second-wave inventions are the ones that lower the barrier to entry, democratize access, and reshape large-scale behaviour. The first wave opens the door; the second wave throws it wide open. It’s the amplifier that delivers exponential adoption.

We’ve seen this movie before. Twice already, over the past 50 years.

The cost of computing dropped, and it transformed business, productivity, and software.

Then the cost of connectivity dropped, and it revolutionized how people communicate, consume, and buy.

Now the cost of intelligence is collapsing, and the effects are unfolding even faster.

Each wave builds on the last. The Internet era was evolving faster than the PC era because the former leveraged the latter’s computing infrastructure. AI is moving even faster because it sits atop both computing and the Internet. Acceleration is not happening in isolation. It’s compounding.

If it feels like the pace of change is increasing, it’s because it is.

Just look at the numbers:

  • Windows took over 2 years to reach 1 million users.
  • Facebook got there in 10 months.
  • ChatGPT did it in 5 days.

These aren’t just vanity metrics — they reflect the power of each era’s cost collapse to accelerate mainstream adoption.

That’s why it’s no surprise — in fact, it’s crystal clear — that the current AI platform shift is more massive than any previous technological shift. It will create massive new economic value, shift wealth away from many incumbents, and open up extraordinary investment opportunities.

That’s why the succinct version of our thesis is:

We invest in the next frontier of computing and its applications, reshaping large-scale behaviour, driven by the collapsing cost of intelligence and defensible through tech and data moats.

(Full version here).

The race is already on. We can’t wait to invest in the next great thing in this new era of intelligence.

Super exciting times ahead indeed.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!


Footnotes

[1] Cost of Computing

In 1981, the Intel 8088 CPU (used in the first IBM PC) had a clock speed of 4.77 MHz and cost ~$125. By 1995, the Intel Pentium processor ran at 100+ MHz and cost around $250 — a ~20x speed gain at similar cost. Today’s chips are thousands of times faster, and on a per-operation basis, exponentially cheaper.

[2] Cost of Connectivity

In 1998, bandwidth cost over $1,200 per Mbps/month. By 2015, that figure dropped below $1. As of 2024, cloud bandwidth pricing can be less than $0.01 per GB — a near 100,000x drop over 25 years.

[3] Cost of Intelligence

In 2022, generating 1 million tokens via OpenAI’s GPT-3.5 could cost $100+. In 2024, it costs under $1 using GPT-4o or Claude 3.5, with faster performance and higher accuracy — a 100x+ reduction in under two years.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Wattpad Was My Regular Season. TSF Is My Playoff Hockey

When entrepreneurs exit their companies, it is supposed to be a victory lap. But in reality, many find themselves in an unexpected emotional vacuum. More often than you might think, I hear variations of the same quiet confession:

“It should have been the best time of my life. But I felt lost after the exit. I lost my purpose.”

After running Wattpad for 15 years, I understand this all too well. It is like training for and running a marathon for over a decade, only to stop cold the day after the finish line. No more rhythm. No more momentum. No next mile.

Do I Miss Operating

Unsurprisingly, people often ask me:

“Do you like being a VC?”

“Do you miss operating?”

My honest answer is yes and yes (but I get my fix without being a CEO — see below).

Being a founder and CEO was deeply challenging and also immensely rewarding. It is a role that demands a decade-long commitment to building one and only one thing. And while I loved my time as CEO, I did not feel the need to do it again. Once in a lifetime was enough. I have started three companies. A fourth would have felt repetitive.

What I missed most was not the title or the responsibility. It was the people. The team. The day-to-day collaboration with nearly 300 passionate employees when I stepped down. That sense of shared mission — of solving hard problems together — was what truly filled my cup.

Back in the Trenches in a Different Role

Now at Two Small Fish Ventures as an operating partner, I work with founders across our portfolio. I am no longer the operator inside the company, but I get to be their sounding board — helping them tackle some of the biggest challenges they face.

Let’s be honest: they call me especially when they believe I am the only one who can help them. Their words, not mine. And there have been plenty of those occasions.

That gives me the same hit of adrenaline I used to get from operating. At my core, I love solving hard problems. That part of me did not go away after my exit. I just found a new arena for it — and it is a perfect replacement.

A Playground for a Science Nerd

What people may not realize is that the deep tech VC job is drastically different from a “normal” VC job. As a deep tech VC, I am constantly stretched and go deep — technically, intellectually, and creatively. It forces me to stay sharp, push my boundaries, and reconnect with my roots as a curious, wide-eyed science nerd.

There is something magical about working with founders at the bleeding edge of innovation. I get to dive into breakthrough technologies, understand how they work, and figure out how to turn them into usable and scalable products. It feels like being a kid in a candy store — except the candy is semiconductors, control systems, power electronics, quantum, and other domains in the next frontier of computing.

How could I not love that?

Ironically, I had less time to indulge this curiosity when I was a CEO. Now I can geek out and help shape the future at the same time. It is a net positive to me.

You Do Not Have to Love It All

Of course, every job — including CEO and VC — has its less glamorous parts. Whether you are a founder or a VC, there will always be administrative tasks and responsibilities you would rather skip.

But I have learned not to resent them. As I often say:

“You do not need to love every task. You just need to be curious enough to find the interesting angles in anything.”

Those tasks are the cost of admission to being a deep tech VC. A small price to pay to do the work I love — supporting incredible entrepreneurs as they bring transformative ideas to life, and finding joy in doing so. And knowing what I know now, I do not think I would enjoy being a “normal” VC. I cannot speak for others, but for me, this is the only kind of venture work that truly energizes and fulfills me.

A New Season. A New Purpose.

So yes, being a VC brings me as much joy as being a CEO did, and arguably even more fulfillment (and I am surprised that I am saying this). I feel incredibly lucky. And I am all in.

It feels like all my past experience has prepared me for what I do today. I often describe this phase of my life this way:

Wattpad was my regular season. TSF is my playoff hockey.

It is faster. It is grittier. The stakes feel higher. Not because I am building one company, but because I am helping many shape the future.

P.S. Go Oilers!!

Gensee AI

A solo musician doesn’t need a conductor. Neither does a jazz trio.

But an orchestra? That’s a different story. You need a conductor to coordinate, to make sure all the parts come together.

Same with AI agents. One or two can operate fine on their own. But in a multi-agent setup, the real bottleneck is orchestration.

Yesterday, we announced our investment in GenseeAI. That’s the layer the company is building—the conductor for AI agents, i.e. the missing intelligent optimization layer for AI agents and workflows.

Their first product, Cognify, takes AI workflows built with frameworks like LangChain or DSPy and intelligently rewrites them to be 10× faster, cheaper, and more reliable. It’s a bit like “compilation” for AI. Given a high-level workflow, Cognify produces a tuned, executable version optimized for production.

Their second product, currently under development, goes one step further: a serving layer that continuously optimizes AI agents and workflows at runtime. Think of it as an intelligent “virtual machine” for AI, where the execution of agents and workflows is transparently and “automagically” improved while running.

If you’re building AI systems and want to go from prototype to production with confidence, get in touch with the GenseeAI team.

Read Brandon’s blog post here, or read it in full below:

At Two Small Fish, we invest in founders building foundational infrastructure for the AI-native world. We believe one of the most important – yet underdeveloped – layers of this stack is orchestration: how generative AI workflows are built, optimized, and deployed at scale.

Today, building a production-grade genAI app involves far more than calling an LLM. Developers must coordinate multiple steps – prompt chains, tool integrations, memory, RAG, agents – across a fragmented and fast-moving ecosystem and a variety of models. Optimizing this complexity for quality, speed, and cost is often a manual, lengthy process that businesses must navigate before a demo can become a product.

GenseeAI is building the missing optimization layer for AI agents and workflows in an intelligent way. Their first product, Cognify, takes AI workflows built with frameworks like LangChain or DSPy and intelligently rewrites them to be faster, cheaper, and better. It’s a bit like “compilation” for AI: given a high-level workflow, Cognify produces a tuned, executable version optimized for production. 

Their second product – currently under development – goes one step further: a serving layer that continuously optimizes AI agents and workflows at runtime. Think of it as an intelligent “virtual machine” for AI: where the execution of agents and workflows is transparently and automatically improved while running.

We believe GenseeAI is a critical unlock for AI’s next phase. Much of today’s genAI development is stuck in prototype purgatory – great demos that fall apart in the real world due to cost overruns, latency, and poor reliability. Gensee helps teams move from “it works” to “it works well, and at scale.”

What drew us to Gensee was not just the elegance of the idea, but the clarity and depth of its execution. The company is led by Yiying Zhang, a UC San Diego professor with a strong track record in systems infrastructure research, and Shengqi Zhu, an engineering leader who has built and scaled AI systems at Google. Together, they bring a rare blend of academic rigor and hands-on experience in deploying large-scale infrastructure. In early benchmarks, Cognify delivered up to 10× cost reductions and 2× quality improvements – all automatically. Their roadmap – including fully automated optimization, enterprise integrations, and a registry of reusable “optimization tricks” – shows ambition to become the default runtime for generative AI.

As the AI stack matures, we believe Gensee will become a foundational layer for organizations deploying intelligent systems. It’s the kind of infrastructure that quietly powers the AI apps we’ll all use – and we’re proud to support them on that journey.

If you’re building AI systems and want to go from prototype to production with confidence, get in touch with the team at GenseeAI.

Written by Brandon

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

AI’s Real Revolution Is Just Beginning

Thank you to The Globe for publishing my op-ed about AI last week. In it, I draw parallels between the dot-com crash and the current AI boom—keeping in mind the old saying, “History doesn’t repeat itself, but it often rhymes.” The piece also explores how the atomic unit of this transformation is the ever-declining “cost of intelligence.” AI is the first technology in human history capable of learning, reasoning, creativity, cross-domain thinking, and decision-making. This fundamental shift will impact every sector, without exception, spurring the rise of new tech giants and inevitable casualties in the process. The key is knowing which land to grab!

The piece is now available below.

Over the past month, everyone I have spoken to has been talking about DeepSeek and Nvidia. Is Nvidia facing extinction? Have certain tech giants overspent on AI? Are we seeing a bubble about to burst, or just another public market overreaction? And what about traditional sectors, like industrials, that haven’t yet felt AI’s impact?

Let’s step back. We’ll revisit companies that soared or collapsed during the dot-com crash – and the lessons we can learn. As Mark Twain reputedly said, “History doesn’t repeat itself, but it often rhymes.”

The answer is that the reports of Nvidia’s demise are greatly exaggerated, though other companies face greater danger. At the same time, new opportunities are vast because this AI-driven shift could dwarf past tech disruptions.

Before 2000, the dot-com mania hit full speed. High-flying infrastructure players such as Global Crossing – once worth US$47-billion – provided backbone networks. Cisco delivered networking equipment, and Sun Microsystems built servers. However, amid the crash, Global Crossing went bankrupt in January, 2002. Cisco plummeted from more than US$500-billion in market cap to about $100-billion. Sun Microsystems sank from a US$200-billion market cap to under US$10-billion.

They failed or shrank for different reasons. Global Crossing needed huge investments before real revenue arrived. Cisco had decent unit economics but lost pricing power when open networking standards commoditized its gear. Sun Microsystems suffered when cheaper hardware and free, open-source software (such as Linux and Apache) undercut it, and commodity hardware plus cloud computing made its servers irrelevant.

However, these companies did not decline because they were infrastructure providers. They declined because they failed to identify the right business model before their capital ran out or were disrupted by alternatives, including open or free systems, despite having the first-mover advantage.

Meanwhile, other infrastructure players thrived. Amazon, seen mostly as an e-commerce site, earned 70 per cent of its operating profit from Amazon Web Services – hosting startups and big players such as Netflix. AWS eliminated the need to buy hardware and continually cut prices, especially in its earlier years, catalyzing a new wave of businesses and ultimately driving demand while increasing AWS’s revenue.

In hindsight, the dot-com boom was real – it simply took time for usage to catch up to the hype. By the late 2000s, mobile, social and cloud surged. Internet-native giants (Netflix, Google, etc.) grew quickly with products that truly fit the medium. Early front-runners such as Yahoo! and eBay faded. Keep in mind that Facebook was founded in 2004, well after the crash, and Apple shifted from iPods to the revolutionary iPhone in 2007, which further catalyzed the internet explosion. A first-mover advantage might not always pay off.

The first lesson we learned is that open systems disrupt and commoditize infrastructure. At that time, and we are seeing it again, an army of contributors drove open systems for free, allowing them to out-innovate proprietary solutions.

Companies that compete directly against open systems – note that Nvidia does not – are particularly vulnerable at the infrastructure layer when many open and free alternatives (such as those solely building LLMs without any applications) exist. DeepSeek, for example, was inevitable – this is how technology evolves.

Open standards, open source and other open systems dramatically lower costs, reduce barriers to AI adoption and undermine incumbents’ pricing power by offering free, high-quality alternatives. This “creative destruction” drives technological progress.

In other words, OpenAI is in a vulnerable position, as it resembles the software side of Sun Microsystems – competing with free alternatives such as Linux. It also requires significant capital to build out, yet its infrastructure is rapidly becoming commoditized, much like Global Crossing’s situation. On the other hand, Nvidia has a strong portfolio of proprietary technologies with few commoditized alternatives, making its position relatively secure. Nvidia is not the new Sun Microsystems or Cisco.

Most importantly, the disruption and commoditization of infrastructure also democratize AI innovation. Until recently, starting an AI company often required raising millions – if not tens of millions – just to get off the ground. That is already changing, as numerous fast-growing companies have started and scaled with minimal initial capital. This is leading to an explosion of innovative startups and further accelerating the flywheel.

The next lesson we learned is that the internet was the first technology in human history that was borderless, connected, ubiquitous, real-time, and free. Its atomic unit is connectivity. During its rise, “the cost of connectivity” steadily declined, while productivity gains from increased connectivity continued to expand demand. The flywheel turned faster and faster, forming a virtuous cycle.

Similarly, AI is the first technology in human history capable of learning, reasoning, creativity, cross-domain functions and decision-making. Crucially, AI’s influence is no longer confined to preprogrammed software running on computing devices; it now extends into all types of machines. Hardware and software, combined with collective learning, enable autonomous cars and other systems like robots to adapt intelligently in real time with little or no predefined instructions.

These breakthroughs are reaching sectors scarcely touched by the internet revolution, including manufacturing and energy. This goes beyond simple digitization; we are entering an era of autonomous operations and, ultimately, autonomous businesses, allowing humans to focus on higher-value tasks.

As with connectivity costs in the internet era, in this AI era, “the cost of intelligence” has been steadily declining. Meanwhile, the value derived from increased intelligence continues to grow, driving further demand – this mirrors how the internet played out and is already happening again for AI. The parallels between these two platform shifts suggest that massive economic value will be created or shifted from incumbents, opening substantial investment opportunities across early-stage ventures, growth-stage private markets and public investments.

Just as the early internet boom heavily focused on infrastructure, a significant amount of capital has been invested in enabling AI technologies. However, over time, economic value shifts from infrastructure to applications – just as it did with the internet.

This doesn’t mean there are no opportunities in AI infrastructure – far from it. Remember, more than half of Amazon’s profits come from AWS. Services such as AWS that provide access to AI will continue to benefit as demand soars. Similarly, Nvidia will continue to benefit from the rising demand. However, many of today’s most-valuable companies – both public and private – are in the application layer or operate full-stack models.

Despite these advancements, this transformation won’t happen overnight, but it will likely unfold more quickly than the internet disruption – which took more than a decade – because many core technologies for rapid innovation are already in place.

AI revenues might appear modest today and don’t yet show up in the public markets. However, if we look closer, some AI-native startups are already growing at an unprecedented pace. The disruption isn’t a prediction; it’s already happening.

As Bill Gates once said, “Most people overestimate what they can achieve in one year and underestimate what they can achieve in ten years.”

The AI revolution is just beginning. The next decade will bring enormous opportunities – and a new wave of tech giants, alongside inevitable casualties.

It’s a land grab – you just need to know which land to seize!

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Investing in Fibra: Revolutionizing Women’s Health with Smart Underwear

At Two Small Fish Ventures, we love backing founders who are not only transforming user behaviour but also unlocking new and impactful value. That’s why we’re excited to announce our investment in Fibra, a pioneering company redefining wearable technology to improve women’s health. We are proud to be the lead investor in this round, and I will be joining as a board observer. 

The Vision Behind Fibra

Fibra is developing smart underwear embedded with proprietary textile-based sensors for seamless, non-invasive monitoring of previously untapped vital biomarkers. Their innovative technology provides continuous, accurate health insights—all within the comfort of everyday clothing. Learning from user data, it then provides personalized insights, helping women track, plan, and optimize their reproductive health with ease. This AI-driven approach enhances the precision and effectiveness of health monitoring, empowering users with actionable information tailored to their unique needs.

Fibra has already collected millions of data points with its product, further strengthening its AI capabilities and improving the accuracy of its health insights. While Fibra’s initial focus is female fertility tracking, its platform has the potential to expand into broader areas of women’s health, including pregnancy detection and monitoring, menopause, detection of STDs and cervical cancer, and many more, fundamentally transforming how we monitor and understand our bodies.

Perfect Founder-Market Fit

Fibra was founded by Parnian Majd, an exceptional leader in biomedical innovation. She holds a Master of Engineering in Biomedical Engineering from the University of Toronto and a Bachelor’s degree in Biomedical Engineering from TMU. Her achievements have been widely recognized, including being an EY Women in Tech Award recipient, a Rogers Women Empowerment Award finalist for Innovation, and more.

We are thrilled to support Parnian and the Fibra team as they push the boundaries of AI-driven smart textiles and health monitoring. We are entering a golden age of deep-tech innovation and software-hardware convergence—a space we are excited to champion at Two Small Fish Ventures.

Stay tuned as Fibra advances its mission to empower women through cutting-edge health technology.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Announcing Our Investment in Hepzibah AI

The Two Small Fish team is thrilled to announce our investment in Hepzibah AI, a new venture founded by Untether AI’s co-founders, serial entrepreneurs Martin Snelgrove and Raymond Chik, along with David Lynch and Taneem Ahmed. Their mission is to bring next-generation, energy-efficient AI inference technologies to market, transforming how AI compute is integrated into everything from consumer electronics to industrial systems. We are proud to be the lead investor in this round, and I will be joining as a board observer to support Hepzibah AI as they build the future of AI inference.

The Vision Behind Hepzibah AI

Hepzibah AI is built on the breakthrough energy-efficient AI inference compute architecture pioneered at Untether AI—but takes it even further. In addition to pushing performance per watt even harder, it can handle training workloads such as distillation, and it provides supercomputer-style networking on-chip. Their business model focuses on providing IP and core designs that chipmakers can incorporate into their system-on-chip designs. Rather than manufacturing AI chips themselves, Hepzibah AI will license its advanced AI inference IP for integration into a wide variety of devices and products.

Hepzibah AI’s tagline, “Extreme Full-stack AI: from models to metals,” perfectly encapsulates their vision. They are tackling AI from the highest levels of software optimization down to the most fundamental aspects of hardware architecture, ensuring that AI inference is not only more powerful but also dramatically more efficient.

Why does this matter? AI is rapidly becoming as indispensable as the CPU has been for the past few decades. Today, many modern chips, especially system-on-chip (SoC) devices, include a CPU or MCU core, and increasingly, those same chips will require AI capabilities to keep up with the growing demand for smarter, more efficient processing.

This approach allows Hepzibah AI to focus on programmability and adaptable hardware configurations, ensuring they stay ahead of the rapidly evolving AI landscape. By providing best-in-class AI inference IP, Hepzibah AI is in a prime position to capture this massive opportunity.

An Exceptional Founding Team

Martin Snelgrove and Raymond Chik are luminaries in this space—I’ve known them for decades. David Lynch and Taneem Ahmed also bring deep industry expertise, having spent years building and commercializing cutting-edge silicon and software products.

Their collective experience in this rapidly expanding, soon-to-be ubiquitous industry makes investing in Hepzibah AI a clear choice. We can’t wait to see what they accomplish next.

P.S. You may notice that the logo is a curled skunk. I’d like to highlight that the skunk’s eyes are zeros from the MNIST dataset. 🙂 

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Celebrating the Unintended but Obvious Impact of Wattpad on International Women’s Day

It’s been almost three years since I stepped aside from my role as CEO of Wattpad, yet I’m still amazed by the reactions I get when I bump into people who have been part of the Wattpad story. The impact continues to surface frequently, in unexpected and inspiring ways.

Wattpad has always been a platform built on storytelling for all ages and genders. That being said, our core demographic—roughly 50% of our users—has been teenage girls. Young women have always played a pivotal role in the Wattpad community.

Next year, Wattpad will turn 20 (!)—a milestone that feels both surreal and deeply rewarding. When we started in 2006, we couldn’t have imagined the journey ahead. But one thing is certain: our early users have grown up, and many of them are now in their 20s and 30s, making their mark on the world in remarkable ways.

A perfect example: at our recent masterclass at the University of Toronto, I ran into Nour. A decade ago, she was pulling all-nighters reading on Wattpad. Today, she’s an Engineering Science student at the University of Toronto, specializing in machine intelligence. Her story is not unique. Over the years, I’ve met countless female Wattpad users who are now scientists, engineers, and entrepreneurs, building startups and pushing boundaries in STEM fields.

This is incredibly fulfilling. Many of them have told me that they looked up to Wattpad and our journey as a source of inspiration. The idea that something we built has played even a small role in shaping their ambitions is humbling.

Now, as an investor at Two Small Fish, I’m excited about the prospect of supporting these entrepreneurs in the next stage of their journey. Some of these Wattpad users will go on to build the next great startups, and it would be incredible to be part of their success, just as they were part of Wattpad’s.

On this International Women’s Day, I want to celebrate this unintended but, in hindsight, obvious outcome: a generation of young women who grew up on Wattpad are now stepping into leadership roles in tech and beyond. They are the next wave of innovators, creators, and entrepreneurs, and I can’t wait to see what they build next.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Celebrating Richard Sutton’s Turing Award

I’d like to extend my heartfelt congratulations to Richard Sutton, co-founder of Openmind Research Institute and a pioneer in Reinforcement Learning, for being honoured with the 2024 Turing Award—often described as the “Nobel Prize of Computing.” This accolade reflects his groundbreaking contributions, which have shaped modern AI across a wide spectrum of applications, from LLMs to robotics and everything in between. His influence resonates throughout classrooms, research, and everyday life worldwide.

As a self-professed science nerd, I’ve had the privilege and honour of working with him through the Openmind board. Rich co-founded Openmind alongside Randy Goebel and Joseph Modayil as a non-profit focused on conducting fundamental AI research to better understand minds. We believe that the greatest breakthroughs in AI are still ahead of us, and that basic research lays the groundwork for future commercial and technological innovations.

A core principle of Openmind—and a guiding philosophy of its co-founders—is a commitment to open research: there are no intellectual property restrictions on its work, ensuring everyone can contribute to and build upon this shared body of knowledge. Rich’s vision and dedication continue to inspire researchers and practitioners around the world to push the boundaries of AI and openly share their insights. This Turing Award is a well-deserved recognition of his transformative impact, and I can’t wait to see the breakthroughs that lie ahead as his work continues to redefine our understanding of intelligence.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Only Optionality Can Make Canada Strong and Free

The tariffs are coming. We all know this isn’t really about fentanyl—only about 19 kg of fentanyl was seized at the U.S.-Canada border, while close to 10,000 kg was seized at the U.S.’s southern border.

Even if we solved this tiny issue, Trump would find something else—maybe he’d complain that the snow in NYC is due to cold air from Canada and slap us with another tariff.

Trump’s playbook is simple: weaponize everything at his disposal to get what he wants.

He’s imposing tariffs on everything from us. We can debate whether to slap tariffs on orange juice or hair dryers in response, but that won’t materially change the outcome. How we react now is just noise—he holds all the leverage anyway. Canada will suffer in the short term, no matter what.

But we shouldn’t let a crisis go to waste. This is a golden opportunity to fix systemic issues that were previously near impossible to address—like interprovincial trade barriers. Yet even fixing that won’t solve the root problem.

Stepping back, the real issue is one of the first principles of leadership: Optionality.

Having alternatives always provides leverage. This principle applies broadly—not just to negotiations, but also to fundraising, supplier relationships, operations, company survival, M&A, and beyond—including leading a country.

Trump understands leverage better than most. This isn’t just about negotiation—even if we reach a deal this time, any agreement with him isn’t worth the paper it’s written on.

As a country, we are far too dependent on the U.S., and Trump knows it. Only by addressing our lack of optionality can we deal with him—and future U.S. presidents—on equal footing.

There is no quick fix. Only a new, decisive, visionary Prime Minister can guide Canada out of this mess.

The only way forward is to leverage what we do best—energy, natural resources, AI, and more—to create true optionality. As the world shifts toward intangible assets, ironically, our proximity to the U.S. is becoming less of a hindrance to diversification.

We must control our own destiny. We cannot allow any single country—U.S. or otherwise—to hold us hostage.

Only optionality can make Canada strong and free.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

AI Has Democratized Everything

This is the picture I used to open our 2024 AGM a few months ago. It highlights how drastically the landscape has changed in just the past couple of years. I told a similar story to our LPs during the 2023 AGM, but now, the pace of change has accelerated even further, and the disruption is crystal clear.

The following outlines the reasons behind one of the biggest shifts we identified as part of our Thesis 2.0 two years ago.

Like many VCs, we evaluate pitches from countless companies daily. What we’ve noticed is a significant rise in startups that are nearly identical to one another in the same category. Once, I quipped, “This is the fourth one this week—and it’s only Tuesday!”

The reason for this explosion is simple: the cost of starting a software company has plummeted. What once required $1–2M of funding to hire a small team can now be achieved by two founders (or even a solo founder) with little more than a laptop or two and a $20/month subscription to ChatGPT Pro (or your favourite AI coding assistant).

With these tools, founders can build, test, and iterate at unprecedented speeds. The product build-iterate-test-repeat cycle is insanely short. If each iteration is a “shot on goal,” the $1–2M of the past bought you a few shots within a 12–18 month runway. Today, that $20/month can buy you a shot every few hours.

This dramatic drop in costs, coupled with exponentially faster iteration speeds, has led to a flood of startups entering the market in each category. Competition has never been fiercer. This relentless pace also means faster failures, and the startup graveyard is now overflowing.

For early-stage investors, picking winners from this influx of startups has become significantly harder. In the past, you might have been able to identify the category winner out of 10 similar companies. Now, it feels like mission impossible when there are hundreds—or even thousands—of startups in each category. Many of them are even invisible, flying under the radar for much longer because they don’t need to fundraise.

Of course, there will still be many new billion-dollar companies. In fact, I am convinced that this AI-driven platform shift will produce more billion-dollar winners than ever—across virtually every established category and entirely new ones that don’t yet exist. But given the sheer number of contenders, spotting them among thousands of startups in each category is harder than ever.

If you’re using the same lens that worked in the past to spot and fund these future tech giants, good luck.

That’s why, for a long time now, we’ve been using a very different lens to identify great opportunities with highly defensible moats to stay ahead of the curve. For example, we’ve been exclusively focused on deep tech—a space where we know we have a clear edge. From technology to product to operations, we have the experience to cover the full spectrum and support founders through the unique challenges of building deep tech startups. So far, this approach has been working really well for us.

I guess we are taking our own advice. As a VC firm, we also need to be constantly improving and striving to be unrecognizable every two years!

There’s no doubt the rules of early-stage VC have shifted. How we access, assess, and assist startups has evolved dramatically. The great AI democratization is affecting all sectors, and venture capital is no exception.

For investors who can adapt, this is a time of unparalleled opportunity—perhaps the greatest era yet in tech investing. The playing field has been levelled, and massive disruption (and therefore opportunities) lies ahead. Incumbents are vulnerable, and new champions will emerge in each category – including VC!

Investing during this platform shift is both exciting and challenging. And I wouldn’t want it any other way, because those who figure it out will be handsomely rewarded.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: ABR

The next frontier of AI lies at the edge — where data is generated. By moving AI toward the edge, we unlock real-time, efficient, and privacy-focused processing, opening the door to a wave of new opportunities. One of our most recent investments, Applied Brain Research (ABR), is leading this revolution by bringing “cloud-level” AI capabilities to edge devices.

Why is this important? Billions of power-constrained devices require substantial AI processing. Many of these devices operate offline (e.g., drones, medical devices, and industrial equipment), have access only to unreliable, slow, or high-latency networks (e.g., wearables and smart glasses), or must process data streams in real time (e.g., autonomous vehicles). Due to insufficient on-device capability, the only solution today is to send data to the cloud — a suboptimal or outright infeasible approach.

How does ABR solve this? ABR’s groundbreaking technology addresses these challenges by delivering “cloud-sized” high-performance AI on compact, ultra-low-power devices. This shift is transforming industries such as consumer electronics, healthcare, automotive, and a range of industrial applications, where latency, reliability, energy efficiency, and localized intelligence are essential.

What is ABR’s secret sauce? ABR’s unique approach is rooted in computational neuroscience. Co-founded by Dr. Chris Eliasmith, CTO and Head of the University of Waterloo’s Computational Neuroscience Research Group, ABR leverages the Legendre Memory Unit (LMU), a brain-inspired architecture invented by Dr. Eliasmith and his team of researchers. LMUs are provably optimal for compressing time-series data—like voice, video, sensor data, and bio-signals—enabling significant reductions in memory usage. Running the LMU on ABR’s unique processor architecture has created a breakthrough that “kills three birds with one stone” by:

1. Increasing performance,

2. Reducing power consumption by up to 200x, and

3. Cutting costs by 10x.

This is further turbocharged by ABR’s AI toolchain, which enables customers to deploy solutions in weeks instead of months. Time is money, and ABR’s technology allows for advanced on-device functions—like natural language processing—without relying on the cloud. This unlocks entirely new use cases and possibilities.
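For the technically curious, here is a minimal sketch of the idea behind the LMU, based on the published formulation (Voelker, Kajić, and Eliasmith, NeurIPS 2019) rather than on ABR’s proprietary implementation. The LMU maintains a small linear state that summarizes a sliding window of an input signal; the matrix construction and the simple Euler discretization below are illustrative choices, not ABR’s production code.

```python
import numpy as np

def lmu_matrices(order: int, theta: float):
    """Continuous-time LMU state-space matrices (Voelker et al., 2019).

    The memory m(t) evolves as dm/dt = A m(t) + B u(t) and encodes a
    Legendre-polynomial summary of the input over a sliding window of
    length theta (seconds).
    """
    A = np.zeros((order, order))
    B = np.zeros((order, 1))
    for i in range(order):
        B[i, 0] = (2 * i + 1) * (-1) ** i
        for j in range(order):
            A[i, j] = (2 * i + 1) * (-1 if i < j else (-1) ** (i - j + 1))
    return A / theta, B / theta

def run_lmu(signal, order=8, theta=1.0, dt=0.01):
    """Compress a 1-D signal into `order` coefficients per time step,
    using a forward-Euler discretization (for illustration only)."""
    A, B = lmu_matrices(order, theta)
    m = np.zeros((order, 1))
    states = []
    for u in signal:
        m = m + dt * (A @ m + B * u)  # dm/dt = A m + B u
        states.append(m.ravel().copy())
    return np.array(states)

# Example: a noisy sine wave (500 samples) is summarized by just 8
# coefficients per time step.
t = np.arange(0, 5, 0.01)
u = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
print(run_lmu(u).shape)  # (500, 8)
```

Even in this toy form, the takeaway is visible: a long, continuously arriving time series is reduced to a handful of coefficients per step, which is the kind of compression behind the memory reductions described above.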

At the helm of ABR is Kevin Conley, the CEO and a former CTO of SanDisk, alongside Dr. Chris Eliasmith. Together, they bring exceptionally strong leadership across both hardware and software domains—a rare but powerful combination that gives ABR a significant competitive advantage.

ABR’s vision aligns perfectly with our investment thesis and our belief that edge computing and software-hardware convergence represent the next frontier of opportunity in computing. We’re excited to see ABR power billions of devices in the years to come.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Powerwall 3 and Smart Energy

Those who know me well would tell you I am a pretty boring person. I don’t have many hobbies, but one thing I do love is gadgets. For instance, I’m a big fan of DIY home automation. Practically every electronic device in my house is voice-controlled, automated, and Wi-Fi-connected—if it can be, it probably is. Here’s a fun example:

I love robots doing things for me because, frankly, I’m too busy. 

At this rate, I might run out of IP addresses! Sure, I could change my network’s subnet to enable more, but every time I tinker with my setup, I have to invest time getting everything right again—something I don’t have in abundance. Anyway, I digress.

One gadget I’ve wanted for years but hesitated to get is a home energy storage and backup system, like Tesla’s Powerwall. The Powerwall 2 has been around since 2016, but for years, the Powerwall 3 was “just around the corner,” with rumours of its launch “next month” seemingly every month. I didn’t want to invest in a device I planned to use for a decade only for it to become obsolete right after I bought it.

Finally, the wait is over. Powerwall 3 became available earlier this year, and I’m glad I waited. Its specs—peak power, continuous power, and efficiency—are significantly upgraded from Powerwall 2. That said, I was a little disappointed that its battery capacity remained unchanged.

I’m told this was the first Powerwall 3 installation in Canada, which is pretty exciting! It’s a beautiful piece of technology, though I don’t see much of it since it’s tucked away in the basement. Paired with solar panels, I hope to be “off the grid” as much as possible.

As good as the Powerwall 3 is, it’s only part of the solution. While it handles storage and backup very well, it doesn’t provide fine-grained energy monitoring, let alone control. To address this, I also installed a Sense energy monitor. This device, connected to the electrical panel, collects real-time data from electrical currents to identify unique energy signatures for every appliance and device in the home. It’s a retrofit solution, a bit of a hack, and imperfect, but it’s probably the best option for someone like me, who is entrenched in the Tesla ecosystem.
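As a conceptual aside, here is a toy sketch of how signature-based monitoring can work in principle: detect step changes in the whole-home power trace and match them against known appliance signatures. This is not how Sense is actually implemented (its proprietary approach relies on much higher-frequency current and voltage data), and the appliance names and wattages below are made up for illustration.

```python
# Toy non-intrusive load monitoring: detect power steps, guess the appliance.
# Illustrative only; real systems use rich waveform features, not just wattage.

KNOWN_SIGNATURES = {        # hypothetical steady-state draws, in watts
    "kettle": 1500,
    "fridge compressor": 120,
    "EV charger": 7200,
}

def detect_events(power_trace, threshold=50):
    """Return (index, delta_watts) for each step change above the threshold."""
    events = []
    for i in range(1, len(power_trace)):
        delta = power_trace[i] - power_trace[i - 1]
        if abs(delta) >= threshold:
            events.append((i, delta))
    return events

def label_event(delta, tolerance=0.15):
    """Match a power step to the closest known signature within a tolerance."""
    name, watts = min(KNOWN_SIGNATURES.items(),
                      key=lambda kv: abs(abs(delta) - kv[1]))
    if abs(abs(delta) - watts) <= tolerance * watts:
        return f"{name} turned {'on' if delta > 0 else 'off'}"
    return "unknown device"

# Fake whole-home readings (watts), sampled once per second.
trace = [300, 310, 1810, 1820, 1815, 320, 305, 7505, 7510, 310]
for idx, delta in detect_events(trace):
    print(f"t={idx}s: {delta:+.0f} W -> {label_event(delta)}")
```

The real problem is much messier (overlapping events, devices with similar draws, variable loads), which is presumably why Sense relies on machine learning over far richer signals, but the basic idea of fingerprinting devices from a single shared electrical feed is the same.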

The energy space hasn’t changed much in the past half-century. Take the electric panel, for example—it’s still essentially the same analog system I remember from my childhood. However, with the rapid acceleration of the energy transition, smarter energy systems are becoming critical as hardware and software converge to enable new possibilities.

A big thanks to James and Dave from the Borealis Clean Energy team for helping me with this project—and for arriving in style with Canada’s first Cybertruck. The project has so many moving parts. Their expertise made this journey much smoother.

Image: Unboxing PW3!
Image: Zooming in on the power electronics.
Image: The electricians are working hard. It is a big job!
Image: It is done!
Image: A big thank you to James.
Image: This is the Tesla Gateway, a separate box we need to install. It is a smaller box—roughly a quarter of the size of PW3—and where “the brain” is located.
Image: Adding Sense – the orange box – to my old-school electric panel to help me with device-level monitoring.
Image: First Cybertruck in Canada. This thing draws attention.

Fabless + ventureLAB is Cloud Computing for Semiconductors

This is a follow-up blog post to my last piece about Blumind.

More than two decades ago, before I started my first company, I was involved with an internet startup. Back then, the internet was still in its infancy, and most companies had to host their own servers. The upfront costs were daunting—our startup’s first major purchase was hundreds of thousands of dollars in Sun Microsystems boxes that sat in our office. This significant investment was essential for operations but created a massive barrier to entry for startups.

Fast forward to 2006 when we started Wattpad. We initially used a shared hosting service that cost just $5 per month. This shift was game-changing, enabling us to bootstrap for several years before raising any capital. We also didn’t have to worry about maintaining the machines. It dramatically lowered the barrier to entry, democratizing access to the resources needed to build a tech startup because the upfront cost of starting a software company was virtually zero.

Eventually, as we scaled, we moved to AWS, which was more scalable and reliable. Apparently, we were AWS’s first customer in Canada at the time! It became more expensive as our traffic grew, but we still didn’t have to worry about maintaining our own server farm. This significantly simplified our operations.

A similar evolution has been happening in the semiconductor industry for more than two decades, thanks to the fabless model. Fabless chip manufacturing allows companies—large or small—to design their semiconductors while outsourcing fabrication to specialized foundries. Startups like Blumind leverage this model, focusing solely on designing groundbreaking technology and scaling production when necessary.

But fabrication is not the only capital-intensive aspect. Startups also need specialized equipment to test, validate, and qualify the chips once they come back from the foundry.

During my recent visit to ventureLAB, where Blumind is based, I saw firsthand how these startups utilize shared resources for this additional equipment. Not only is Blumind fabless, but they can also access various hardware equipment at ventureLAB without the heavy capital expenditure of owning it.

Image: Let’s see how the chip performs at -40°C!
Image: Jackpine (first tapeout)
Image: Wolf (second tapeout)
Image: BM110 (third tapeout)

The common perception that semiconductor startups are inherently capital-intensive couldn’t be more wrong. The fabless model—in conjunction with organizations like ventureLAB—functions much like cloud computing does for software startups, enabling semiconductor companies to build and grow with minimal upfront investment. For the most part, all they need initially are engineers’ computers to create their designs until they reach a scale that requires owning their own equipment.

Fabless chip design combined with shared resources at facilities like ventureLAB is democratizing the semiconductor space, lowering the barriers to innovation, and empowering startups to make significant advancements without the financial burden of owning fabrication facilities. Labour costs aside, the upfront cost of starting a semiconductor company like Blumind could be virtually zero too.

That’s why the saying, “software once ate the world alone; now, software and hardware consume the universe together,” is becoming true at an accelerated pace. We have already made several investments based on this theme, and we are super excited about the opportunities ahead.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: Blumind

When it comes to watches, my go-to is a Fitbit. It may not be the most common choice, but I value practicality, and not having to recharge daily is a must for me. My Fitbit lasts about 4 to 5 days—decent, but still not perfect.

Now, imagine if we could extend that battery life to a month or even a year. The freedom and convenience would be incredible. Considering the immense computing demands of modern smartwatches, this might sound far-fetched. But that’s where our portfolio company, Blumind, comes into play.

Blumind’s ultra-low power, always-on, real-time, offline AI chip holds the potential to redefine how we think about battery life and device efficiency. This advancement enables edge computing with extended battery life, potentially lasting years – not a typo – instead of days. Products powered by Blumind can transform user behaviours and empower businesses and individuals to unlock new and impactful value (see our thesis).

Blumind’s secret lies in its brain-inspired, all-analog chip design. The human brain is renowned for its energy-efficient computing abilities. Unlike most modern chips that rely on digital systems and require continuous digital-to-analog and analog-to-digital conversions (which drain power), Blumind’s approach emulates the brain’s seamless analog processing. This unique architecture makes it perfect for power-sensitive AI applications, resulting in chips that could be up to 1000 times more energy-efficient than conventional chips, making them ideal for edge computing.

Blumind’s breakthrough technology has practical and wide-ranging applications. Here are just a few use cases:

Always-on Keyword Detection: Integrates into various devices for continuous voice activation without excessive power usage.

Rapid Image Recognition: Supports always-on visual wake word detection for applications such as access control, enhancing human-device interaction with real-time responses.

Time-Series Data Processing: Processes data streams with exceptional speed for real-time analysis in areas like predictive maintenance, health monitoring, and weather forecasting.

These capabilities unlock new possibilities across multiple industries, including wearables, smart home technology, security, agriculture, medical, smart mobility, and even military and aerospace.

A few weeks ago, I visited Blumind’s team at their ventureLAB office and got an up-close look at their BM110 chip, now in its third tapeout. Blumind exemplifies the future of semiconductor startups through its fabless model, which significantly lowers the initial infrastructure costs associated with traditional semiconductor companies. With resources like ventureLAB supporting them, Blumind has managed to innovate with remarkable efficiency and sustainability. (I’ll share more about the fabless model in an upcoming post.)

I’m thrilled to see where Blumind’s journey leads and how its groundbreaking technology will transform daily life and reshape multiple industries. When devices can go years without needing a recharge instead of mere hours, that’s nothing short of game-changing.

Image: Close-up view of BM110. It is a piece of art!

Image: Qualification in action. Note that BM110 (lower-left corner) is tiny and space-efficient.

Image: The Blumind team is working hard at their ventureLAB office. More on this in a separate blog post here.


P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Bridge Technologies are Rarely Great Investments

More than two decades ago, I co-founded my first company, Tira Wireless. The business went through several iterations, and eventually, we landed on building a mobile content delivery product. We raised roughly $30M in funding, which was a significant amount at the time. We even ranked as Canada’s Third Fastest Growing Technology Company in the Deloitte Technology Fast 50.

We had a good run, but eventually, Tira had to shut its doors.

We made numerous strategic mistakes, and I learned a lot—lessons that, quite frankly, helped me make far better decisions when I later started Wattpad.

One of the most important mistakes we made was falling into the “bridge technology” trap.

What is the “bridge technology” trap?

Reflecting on significant “platform shifts” over recent decades reveals a pattern: each shift unleashes waves of innovation. Consider the PC revolution in the late 20th century, the widespread adoption of the internet and cloud computing in the 2000s, and the mobile era in the 2010s. These shifts didn’t just create new opportunities; they also created significant pain points as the world tried to leap from one technology to another. Many companies emerged to solve problems arising from these changes.

Tira started when the world began its transition from web to mobile. Initially, there were countless mobile platforms and operating systems. These idiosyncrasies created a huge pain point, and Tira capitalized on that. But in a few short years, mobile consolidated into just two major players—iOS and Android. The pain point rapidly disappeared, and so did Tira’s business.

Similarly, most of these “bridge technology” companies perform very well during the transition because they solve a critical, short-term pain point. However, as the world completes the transition, their business disappears. For instance, numerous companies focused on converting websites into iPhone apps when the App Store launched. Where are they now?

Some companies try to leverage what they’ve built and pivot into something new. But building something new is challenging enough, and maintaining a soon-to-be-declining bridge business while transitioning into a new one is even harder. This is akin to the innovator’s dilemma: successful companies often struggle with disruptive innovation, torn between innovating (and risking profitable products) or maintaining the status quo (and risking obsolescence).

As an investor, it makes no sense to invest in a “bridge” company that is fully expected to pivot within a few years. A pivot should be a Plan B, not Plan A. It’s extremely rare for bridge technology companies to become great, venture-scale investments. In fact, I can’t think of any off the top of my head.

We are currently in the midst of a tectonic AI platform shift. We’re seeing a huge volume of pitches, which is incredibly exciting. Many of these startups built great technologies and products. However, a significant number of these pitches also represent bridge technologies. As the current AI platform shift matures, these bridge technologies will lose relevance. Sometimes, it’s obvious they’re bridge technologies; other times, it requires significant thought to identify them. This challenge is intellectually stimulating, and I enjoy every moment of it. Each analysis informs us of what the future looks like, and just as importantly, what it will not look like. With each passing day, we gain stronger conviction about where the world is heading. It’s further strengthening our “seeing the future is our superpower” muscle, and that’s the most exciting part.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: #paid

#paid was one of the first investments we made at Two Small Fish Ventures. It’s been over a decade since we backed Bryan and Adam, who were still working out of Toronto Metropolitan University’s DMZ at the time. They had a vision to build a platform that connected creators and brands before “creator” was even a term! Back then, influencer and creator marketing campaigns were just tiny experiments.

A decade later, the creator economy has taken off. It’s now a $24 billion market—an order of magnitude larger than just a few years ago, with no signs of slowing down. The next wave of growth is still ahead as ad spending continues to shift away from traditional media. With the global ad market approaching $800 billion, one thing remains true: ad dollars follow the eyeballs—always. And where are those eyeballs today? On creators and influencers.

Today, #paid has become the world’s dominant platform, with over 100,000 creators onboard. It addresses a significant challenge: most creators don’t know how to connect with brands, especially iconic brands like Disney, Sephora, or IKEA. On the other hand, brands struggle to find the right creators amidst a sea of talent. #paid bridges this gap, acting as the marketplace that makes collaboration easy. They use data-driven insights to determine what makes a successful match, ensuring that both creators and brands can find each other effortlessly.

At #paid, brands and creators work with a dedicated team of experts to build creative strategies backed by research, first-party data, and industry benchmarks. This means campaigns run smoothly, allowing creators to focus on doing what they love—creating—without getting bogged down by administrative tasks.

I’m not just speaking as an investor—I’ve actually run a campaign with #paid as an influencer myself, and I can personally vouch for how seamless the experience was.

If you think #paid is all about TikTok, Snap, or Instagram, think again. Brands leverage #paid content across every platform. Want proof? Just check out the Infiniti TV commercial, which came from a #paid campaign.

How about billboards in major cities like NYC, Toronto, and more? #paid has that covered too.

#paid also brings creators and marketers together in real life. I had the privilege of speaking at their Creator Marketing Summit in NYC a few weeks ago, and I was amazed at how far #paid has come. The summit brought together hundreds of creators and top brand marketers—an impressive showcase of the platform’s evolution.

Looking back on this journey, here are my key takeaways:

• Great companies take a decade to build.

• To create a category leader, especially in winner-take-all markets, the idea has to be bold and often misunderstood at first. Bryan and Adam saw something that few others did, and their first-mover advantage has solidified #paid’s leading position today.

• There’s no such thing as “done.” #paid constantly reinvents itself. Generative AI is another exciting opportunity for step-function growth, and I can’t wait to see what’s next.

Bryan and Adam should be incredibly proud of what they’ve accomplished.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Axiomatic AI – Make the World’s Information Intelligible

Today’s blog post is brought to you by Eva Lau. She will talk about one of our recent investments: Axiomatic AI.

Congratulations to Axiomatic on their recent US$6M seed round led by Kleiner Perkins! Two Small Fish Ventures is thrilled to be an early investor since the company’s inception—and the only Canadian investor—in what promises to be a game-changer in solving fundamental problems in physics, electronics, and engineering.

Why is this important? Large Language Models (LLMs) excel at language (as their name suggests) but struggle with logic. That’s why AI can write poetry but struggles with math, as LLMs mainly rely on ‘pattern-matching’ rather than ‘reasoning.’

This is where Axiomatic steps in. The company’s secret sauce is its new AI model called Automated Interpretable Reasoning (AIR), which combines advances in reinforcement learning, LLMs, and world models. Axiomatic’s mission is to create software and algorithms that not only automate processes but also provide clear, understandable insights to fuel innovation and research, ultimately solving real-world problems in engineering and other industrial applications.

The startup is the brainchild of world-renowned professors from MIT, the University of Toronto, and The Institute of Photonic Sciences (ICFO) in Barcelona. The team includes leading engineers, physicists, and computer science experts.

With its innovative models, the startup fits squarely within our fund’s focus: the next frontier of computing and its applications. As all TSF partners are engineers, product experts, and recent operators, we are uniquely positioned to understand the potential of Axiomatic and support the team. 

Axiomatic’s new AIR model is well-positioned to accelerate engineering and scientific discovery, boosting productivity by orders of magnitude in the coming years, and ultimately to make the world’s information intelligible.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Openmind Research Institute

I’m excited to share that I have been appointed as a board member of the Openmind Research Institute!

Co-founded by AI and reinforcement learning luminaries Rich Sutton, Randy Goebel and Joseph Modayil, Openmind is a Canadian non-profit focused on conducting fundamental AI research to better understand minds.

We believe the greatest advancements in AI are yet to come. Basic research is essential to understanding what is scientifically possible before pursuing the next generation of commercial and technological developments.

A key aspect of Openmind is its commitment to open research. Openmind places no intellectual property restrictions on its research, allowing everyone to contribute to and build upon this shared knowledge.

As a board member, I will leverage my decades-long experience in building, operating, and investing in AI companies to support Openmind’s mission. Supporting innovation is one of my life’s passions, and I am thrilled to accept this position and join a team dedicated to pioneering advancements in AI.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Viggle AI Leads the Next Wave of Disruption in Content

We’re thrilled to share that Toronto-based Viggle AI, a Canadian start-up revolutionizing character animation through generative AI, has raised US$19 million in funding. The round was led by a16z with Two Small Fish participating as a significant investor. As part of the investment, I also became an advisor to the company. 

Creators are unleashing their creativity with Viggle AI by generating some of the most entertaining memes and videos online. You’ve probably seen the clip of Joaquin Phoenix’s Joker persona recreating Lil Yachty’s walkout from the Summer Smash Festival – it was made with Viggle AI!

But Viggle AI is much more than a simple meme generator. It’s a powerful platform that can completely reinvent how games, animation, and other videos are produced. 

Powered by JST-1, the first-in-the-world 3D-video foundation model with actual physics understanding, Viggle AI can make any character move as you want. Its unique AI model can generate high-quality, realistic, physics-based 3D animations and videos from either static images or text prompts. 

For professional animation engineers, game designers, and VFX artists, this is game-changing. Viggle AI can streamline the ideation and pre-production process, allowing them to focus on their creative vision and ultimately reducing production timelines.

And, for content creators and everyday users, Viggle AI can generate high-quality animations using simple prompts to create engaging animated character videos within a matter of minutes. 

Easier. Faster. Cheaper. Viggle AI is a truly transformative product that will unlock new value for consumers and professionals alike.

Here are a couple of fun examples of Viggle AI in action – I was terrible at dancing, but now I can do it!

Since launching in March, Viggle AI has taken the internet by storm and now boasts over 4 million users. When the startup first landed on our radar, it had only a few thousand users. This rapid growth is a testament not only to Viggle AI’s ability to create an engaging product but also to Two Small Fish’s ability to spot tech giants in the making.

Two Small Fish has an unparalleled track record of helping create the future of content through technology. After all, the team built Wattpad from a simple app for fiction into a massive global entertainment powerhouse with 100 million users. Seeing the future is our superpower. We’re the best investors to help future tech giants like Viggle AI as they transform how content is created, remixed, customized, consumed, and interacted with. We’re excited to continue to play a role in reinventing content creation and entertainment. 

Congratulations Hang Chu and the entire Viggle AI team! 

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The Next Data Centre: Your Phone

The history of computing has been a constant shift of the centre of gravity.

When mainframe computers were invented in the middle of the last century, they were housed in air-conditioned, room-sized metal boxes that occupied thousands of square feet. People accessed these computers through dumb terminals, which were more like black and white screens and keyboards hooked to the computer through long cables. They were called dumb terminals because the smart part was all on the mainframes.

These computers worked in silos. Computer networks were very primitive. Data was mainly transferred through (physical!) punch cards and tapes.

The business model was selling hardware. During that era, giants like IBM and Wang emerged, and many subsequently submerged.

Hardware was the champion.

Mainframe computers in the 50s. Image source: Wikipedia

The PC era, which started in the 80s and supercharged in the 90s, ended the reign of the mainframe era. As computers became much faster while the price dropped by orders of magnitude, access to computing became democratized, and computers appeared on every desktop. We wanted these computers to talk to each other. Punch cards clearly no longer worked as there were millions of computers now. As a result, LANs (local area networks) were popularized by companies like Novell, which enabled the client/server architecture. Unlike the previous era, the “brains” were decentralized, with clients doing much of the heavy lifting. Servers still played a role, but for the most part, it was for centralized storage.

Although IBM invented the PC, the business model shifted, creating the duopoly of Intel (and, by association, companies like Compaq) and Microsoft, with the latter capturing even more value than the former. The software era had begun.

Software became the champion. Hardware was dethroned to the runner-up.

Then, in the late 90s to the 2010s, the (broadband) web, mobile, and cloud computing came along. Connectivity became much less of an issue. Clients, especially your phones, continued to improve at a fast pace, but the capability of servers increased even faster. The “brains” shifted back to the server as that’s where the data is centralized. For the most part, clients were now responsible for user experience, important but merely a means to an end (of collecting data) rather than an end in themselves.

Initially, it appeared that the software-hardware duopoly would continue as companies like Netscape and Cisco were red hot, only to be dethroned by companies like Yahoo and AOL and later Google and Meta. Software and hardware were still crucial, but they became the enablers as the business model once again shifted.

Data became the newly crowned champion.

Fast forward to now, and the latest—and arguably the greatest of all time—platform shift, powered by generative AI, is upon us. The ground beneath us is shifting again. On a per-user basis, generative AI demands orders of magnitude more energy. Data centres already consume more energy than many countries, and that consumption is set to double again within two years, to roughly the electricity consumption of Japan. The lean startup era is gone. AI startups need to raise much more capital upfront than previous generations of startups because of the enormous cost of compute.

Expecting the servers in data centres to do all the heavy lifting isn’t sustainable in the long term, for many reasons. The “brains” have once again started to shift back to the clients at the edge, and it is already happening. For instance, Tesla’s self-driving decisions are not going to make the round trip to its servers; otherwise, the latency would make those split-second decisions a second too late. Another example: most people may not realize this, but Apple is already an edge computing company, as its chips have had AI capabilities for years. Imagine how much more developers could do on your iPhone—at no cost to them—instead of paying a cloud provider to run some AI. That would be the Napster moment for AI companies!
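To make the latency argument concrete, here is a back-of-the-envelope sketch. The speed and latency figures are assumptions for illustration, not measurements from Tesla or any cloud provider; the point is simply how far a car travels while waiting for an answer.

```python
# Back-of-the-envelope: distance travelled while waiting for an inference result.
# All numbers are illustrative assumptions, not measured values.

SPEED_KMH = 100                       # assumed highway speed
SPEED_MS = SPEED_KMH * 1000 / 3600    # ~27.8 metres per second

scenarios = {
    "on-device inference": 0.020,           # assume 20 ms end-to-end
    "cloud round trip + inference": 0.150,  # assume 100 ms network + 50 ms compute
}

for name, latency_s in scenarios.items():
    metres = SPEED_MS * latency_s
    print(f"{name:30s}: {latency_s * 1000:5.0f} ms -> {metres:4.1f} m travelled")
```

Under these assumptions, the cloud path costs roughly four metres of travel before the car even hears back, and that is before accounting for network variability or dead zones. The exact numbers matter less than the shape of the argument: some decisions simply cannot wait for a round trip.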

Inevitably, now that almost every device can run some AI and is connected, things will be more decentralized.

In past eras, computing architectures evolved due to the constraints of—or the liberation of—computing capabilities, connectivity, or power consumption. The landscape has once again shifted. Like past platform shifts, there will be a new world order. The playing field will be levelled. Rules will be rewritten. Business models will be reinvented. Most excitingly, new giants will be created.

Every. Single. Time.

Seeing the future is our superpower. That’s why, a while ago, we at Two Small Fish Ventures revised our thesis. Now, it is all about investing in the next frontier of computing and its applications, with edge computing as an important part of it. Our recent investments have been all-in on this thesis. If you are a founder of an early-stage, rule-rewriting company that is taking advantage of this massive platform shift, don’t hesitate to reach out to us. We love backing category creators in massive market opportunities.

We are all engineers, product builders and company creators. We know how things work. Let’s build the next champion together!

Update: This blog post was published just before Apple announced Apple Intelligence. I knew nothing about Apple Intelligence at that time. It was purely a coincidence. However, it did validate what I said.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The depressing numbers of the venture-capital slump don’t tell the full story

Thank you to The Globe for publishing my second op-ed in as many weeks: The depressing numbers of the venture-capital slump don’t tell the full story.

The piece is now available in full here:

Bright spots in the current venture capital landscape exist. You just need to know where to look.

Recent reports are right. Amid high interest rates, venture capitalists have a shrinking pool of cash to dole out to hopeful startups, making it more challenging for those companies to raise funding. In the United States, for example, startup investors handed out US$170.6 billion in 2023, a decrease of nearly 30 percent from the year before.

But the headline numbers don’t tell the whole story.

There’s a night-and-day difference between the experience of raising funds for game-changing, deep-technology startups that specialize in artificial intelligence and related fields, such as semiconductors, and those who try to innovate with what’s referred to as shallow tech.

Remember the late 2000s? Apple’s App Store wasn’t groundbreaking in terms of technical innovation, but it nonetheless deserves praise because it revolutionized the smartphone. Back then, the App Store’s charts were dominated by simplistic applications, from infamous fart apps to iBeer, the app that let you pretend you were drinking from your iPhone.

That’s the difference – between those building game-changing tools and those whose products are simply trying to ride the wave.

Tons of startups are pitching themselves as AI or deep-tech companies, but few actually are. This is why many are having trouble raising funds in the current climate.

It’s also why the era of shallow tech is over, and why deep-tech innovations will reshape our world from here on out.

Toronto-based Ideogram, a deep-tech startup, was the first in the industry to integrate text and typography into AI-generated images. (Disclosure: This is a company that is part of my Two Small Fish Ventures portfolio. But I’m not mentioning it just because I have a stake in it. The company’s track record speaks for itself.)

Barely one year old, the startup has fostered a community of more than seven million creators who have generated more than 600 million images. It went on to close a substantial US$80-million Series A funding round.

As a comparison, Wattpad, the company I founded, which later sold for US$660-million, had raised roughly US$120-million in total. Wattpad’s Series A in 2011, five years after inception, was US$3.5-million.

The speed at which Ideogram achieved so much in such a short period of time is eye-popping.

The “platform shifts” over recent decades have largely played out in the same way. From the personal-computer revolution in the late 20th century to the widespread adoption of the internet and cloud computing in the 2000s, and then the mobile era in the 2010s, there’s a clear pattern.

Each shift unleashed a wave of innovation to create new opportunities and fundamentally reshape user behaviour, democratize access and unlock tremendous value. These shifts benefited the billions of internet users and related businesses, but they also paved the way for “shallow tech.”

The late 2000s marked the beginning of a trend where ease of creation and user experience overshadowed the depth of innovation.

When Instagram launched, it was a straightforward photo-sharing app with just a few attractive filters. Over time, driven by the massive amounts of data it collected, it evolved into one of the leading social media platforms.

This time is different. The AI platform shift makes it harder for simplistic, shallow-tech startups to succeed. Gone are the days of building a minimally viable product, accumulating vast data and then establishing a defensible market position.

We’re entering the golden age of deep-tech innovation, and in order to be successful, startups have to embrace the latest platform shift – AI. And this doesn’t happen by tacking on “AI” to a startup’s name the way many companies did with the “mobile-first” rebrand of the 2010s.

In this new era, technological depth is not just a competitive advantage but also a fundamental pillar for building successful companies that have the potential to redefine our world.

For example, OpenAI and Canada’s very own Cohere are truly game-changing AI companies that have far more technical depth than startups from the previous generation. They’ve received massive funding partly because the development of these kinds of products is very capital-intensive but also because their game-changing approach will revolutionize how we live, work and play.

Companies like these are the bright spots in an otherwise gloomy venture-capital landscape.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Software Once Ate the World Alone; Now, Software and Hardware Consume the Universe Together

Over a decade ago, in his blog post titled “Why Software is Eating the World,” Marc Andreessen explained why software was transforming industries across the globe. Software would no longer be confined to the tech sector but would permeate every aspect of our lives, disrupting traditional businesses, creating new opportunities, driving innovation, and reshaping the competitive landscape. Overall, the post underscored the profound impact of software on the economy and society at large.

While the prediction in his blog post was mostly accurate, today, the world is still only partially eaten up by software. Although there are many opportunities for software alone to completely transform user behaviour, upend workflows, or cause other disruptions, the low-hanging fruit has mostly been picked. That’s why I said the days of shallow tech are behind us now.

Moving forward, increasingly, there will be more and more opportunities that require hardware and software to be designed and developed together from the get-go to ensure that they can work harmoniously and make an impact that otherwise would not be possible. The best example that people can relate to today is Tesla. For those who have driven a Tesla, I trust many would testify that their software and hardware work really well together. Yes, their self-driving software might be buggy. Yes, the build quality of its hardware might not be the best. However, with many features on their cars – from charging to navigation to even warming up the car remotely – you can just tell that they are not shoehorning their software and their app into their hardware or vice versa.

On the other hand, on many cars from other manufacturers, you can tell their software and hardware teams are separated by the Grand Canyon and perhaps only seriously talk to each other weeks before the car is launched 🙂

We also witness the same thing down to the silicon level. From building the next AI chip to the industrial AI revolution to space tech, software and hardware convergence is happening everywhere. For instance, the high energy required by LLMs is partially because the software “works around” the hardware, which was not designed with AI in mind in the first place. Changes are already underway, ensuring that software and hardware dance together. There is a reason why large tech players like OpenAI and Google are planning to make their own chips.

We are in the midst of a once-in-a-decade “platform shift” driven by generative AI. In the last platform shift more than a decade ago, when the confluence of mobile and cloud computing created a massive disruption, there was one “iPhone moment,” and then things progressed continuously. This time, new foundation models are launching at a breakneck pace, further accelerated by open source. The pace is so fast that we are now experiencing an iPhone moment every few weeks.

All of this is happening while AI-native startups are an order of magnitude more capital-intensive than those of the past cycle. At the same time, investors are willing to write big cheques to these companies, and perhaps that is appropriate, given all the massive opportunities ahead of us.

Investing in this environment is both exciting and challenging, as assessing these new opportunities is drastically different from assessing the previous generation of software-only, shallow-tech startups.

The next few years are going to be wild.


Fear of AI?

I just shared my thoughts, titled “We’re wrong to fear artificial intelligence – real life is not science fiction,” on AI’s transformative impact in The Globe and Mail.

As an engineer-turned-CEO-turned-investor, I’ve been involved in the AI space long enough to anticipate where the technology is headed, and I have witnessed AI’s immense potential and its challenges. But remember, tech often solves its own hurdles. With AI, I see a future of superhuman abilities and new job horizons. Let’s embrace this future.

The piece is now available in full here:

Artificial intelligence has been dominating the headlines lately, and with good reason – AI is a transformative technology that can dramatically change how we live, work and play. Although many of the news stories focus on the potential risks and threats of AI, my intent is to present an alternative perspective.

For context, I was the chief executive officer for more than 15 years at Wattpad, an AI-driven storytelling company that was acquired by Naver in 2021. Now at Two Small Fish Ventures, I invest in many established AI companies, such as Ada and BenchSci, as well as emerging generative AI startups, such as Ideogram.

As an engineer-turned-CEO-turned-investor, I’ve been involved in the AI space long enough that I can anticipate where the technology is headed.

Yes, the technology will also create issues. Broadly, they cluster into three categories:

  • Security – from misinformation to autonomous weapons.
  • Job displacement – the replacement of human workers with machines.
  • Singularity – the point where AI might outwit and elude human control.

But I am confident that AI is a transformational technology that will be a net positive for society. Imposing heavy regulation or a pause today seems an unenforceable overreaction, and it could even stifle the creativity needed to find solutions.

It’s a truism that novel technologies pose new challenges. Yet the remedy for these challenges is typically found within technology itself.

Take security. We’ve seen the narrative play out many times over. In the early days of the internet, people were (rightfully) very concerned about digitally sharing their credit card information. Over time, the widespread adoption of chip-and-PIN technology, stronger encryption and, ultimately, the birth of an entire cybersecurity industry addressed most of these challenges. Today, several technologies can detect deepfake videos that would otherwise escape authentication systems. It is not hard to imagine that an uber-advanced cybersecurity industry can nullify emerging AI-related threats.

When it comes to the risk of job displacement, this is a challenge society has faced time and time again. The Industrial Revolution ushered in both job elimination and job creation. Yes, automation erases specific roles, but it concurrently births new ones. There is frictional pain and dislocation in the process, and sometimes the new jobs go to different people in different places, but over time the total number of jobs goes up substantially. Overall, society has thrived, and we’ve all become more prosperous.

AI will help turn humans into superhumans. Just as electronic spreadsheets didn’t sideline accountants but enhanced their efficiency, AI will supercharge worker productivity and output – a key element for economic growth. Plus, the fast pace of innovation will create new jobs that didn’t exist previously, like AI prompt engineer – a job title that is less than a year old.

Among the outlined concerns, singularity looms largest, primarily because it’s an unknown frontier. But we’ve trodden similar paths before and crafted tools and innovations that surpass human abilities. And while some of these innovations carried the potential to destroy humanity outright (think missiles, bioweapons and nuclear arms), that potential has gone largely unrealized. In examining any threats from AI, we should be guided by evidence, not irrational fears born of science fiction.

There will always be opposing forces and bad actors, but we can assume that humans, ironically with the help of AI, can come up with unprecedented solutions to unprecedented problems, just as we have done before.

From the agrarian age to the industrial age to the information age, society has always thrived and flourished amid disruptions. We shouldn’t expect anything different this time.