The Factory Analogy: Explaining the Next Frontier of Semiconductor Opportunities

The machinery on the assembly line is world-class. On paper, it can produce an enormous volume of goods per hour. And it does.

Yet, the business still misses its targets. Why? Because outcomes are rarely limited by the assembly line itself.

The supply of raw materials to the machinery and the delivery of finished products to customers play equally vital roles. Let me be a bit poetic here:

Raw materials arrive late; the line waits. Finished goods pile high.

In the end, delivery is routes and time.

And when movement is the game, the bill runs high.

Each trip costs energy, in money and time.

This is the central lesson: world-class machinery does not guarantee a high-performing, high-throughput factory.

  • Speed is not just how fast the machinery can produce.
  • Latency is the total time from order to delivery.
  • Energy efficiency is about the total cost of keeping the whole operation moving.

They are related, but not the same problem, and they all contribute to overall performance.

The Compute Analogy

This is a perfect analogy for modern computing. The CPU and GPU are the machinery on the assembly line. They are very good at the arithmetic that turns data into answers.

However, many modern workloads are limited by the chip-level equivalents of that supply chain and delivery. Data has to travel from storage to memory, from memory to the processor, and back again. The raw materials—data—spend a surprising amount of time in transfer before they become finished products in customers’ hands: answers.

That transfer time creates a triple threat to performance:

  • It hurts speed because the processor stalls while waiting to be fed.
  • It hurts latency because the system spends time moving data before it can produce an answer, and then spends time delivering that answer to where it is needed.
  • It hurts energy efficiency because moving bits costs power, dissipates waste heat, and repeated transfers compound the cost.
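To make the triple threat concrete, here is a minimal back-of-the-envelope sketch in Python. The peak-throughput and bandwidth figures are illustrative assumptions, not the specs of any real chip; the point is simply that a workload performing few operations per byte moved leaves most of the arithmetic machinery idle.

```python
# A rough roofline-style estimate: how much of a processor's arithmetic peak
# is usable, given how many operations a workload performs per byte moved.
# The hardware numbers below are illustrative assumptions, not real chip specs.

PEAK_FLOPS = 100e12      # assumed peak arithmetic throughput: 100 TFLOP/s
MEM_BANDWIDTH = 2e12     # assumed memory bandwidth: 2 TB/s

def attainable_flops(ops_per_byte: float) -> float:
    """Throughput is capped by either the math units or the memory system."""
    return min(PEAK_FLOPS, MEM_BANDWIDTH * ops_per_byte)

for label, ops_per_byte in [
    ("streaming through large weights (memory-bound)", 1),
    ("moderate data reuse", 10),
    ("large dense matrix multiply (compute-bound)", 500),
]:
    usable = attainable_flops(ops_per_byte) / PEAK_FLOPS
    print(f"{label}: {usable:.0%} of peak arithmetic throughput")
```

In the memory-bound case, the "machinery" runs at a few percent of its rated capacity, which is exactly the idle assembly line described above.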

From Graphics to AI: Changing the Bottleneck

Remember, GPU stands for Graphics Processing Unit. It was originally designed and optimized for graphics—first for gaming in the 1990s—and later for other math-heavy tasks. The bottleneck back then was arithmetic, and GPUs were the solution that gave us the most bang for the buck.

But modern workloads—AI inference in particular—have different characteristics, so the bottlenecks show up differently. Inference puts immense pressure on memory bandwidth and data movement. In many cases, the limiting factor is not just the math anymore. It is the movement and the waiting. And the delivery problem is getting bigger too.

The Rise of the “Edge” Factory

Sensors are everywhere now. They generate raw data where the action is. If every sensor stream has to be shipped to a distant “factory” (a data center) before anything useful happens, latency and bandwidth also become part of the product.

That is why edge computing is increasingly important. It is the computing version of building smaller factories closer to customers and shipping less raw material—or sometimes no raw material—across the network.

Investing in New Architectures

Of course, this does not mean CPUs or GPUs are obsolete. It means there are many other bottlenecks now. We need:

  • Less distance between memory and compute.
  • Less shuttling of data inside the system.
  • Less distance between sensing and decision.

The Von Neumann architecture used by many modern computers today is about 80 years old. The first CPU is more than half a century old. The first GPU is almost 30 years old. It is time for new architectures.

This is a core part of our investment thesis in the next frontier of computing and its applications, specifically in Advanced Computing Hardware, one of the five areas we invest in. For many years, we have invested in semiconductor companies, including Zinite, Hepzibah, ABR, and Blumind, that use architectural innovation to address performance bottlenecks across speed, latency, and energy efficiency in ways that faster GPUs alone will not solve.

We are super excited about this massive opportunity and are looking for new investments. If you are a deep tech researcher or founder in this area, please reach out to us at pitch@twosmallfish.vc.

A Day at Ontario Tech University

I spent a full day at Ontario Tech University in Oshawa a few weeks ago. It was my first time on campus, despite it being just over a 40-minute drive from Toronto, where I live. I arrived curious and left with a clearer picture of what they’re building.

Ontario Tech is still a relatively young university, just over two decades old. What’s less well known—and something I didn’t fully appreciate before the visit—is how quickly it has grown in that time, now serving around 14,000 students, and how deliberately it has established itself as a research university rather than simply a teaching-focused institution.

That research orientation shows up not just in output, but in where the university has chosen to build depth—areas that sit close to real systems and real constraints.

This came through clearly in conversations with Prof. Peter Lewis, Canada Research Chair in Trustworthy Artificial Intelligence, whose work focuses on trustworthy and ethical AI. The university has launched Canada’s first School of Ethical AI, alongside the Mindful AI Research Institute, and the work here is grounded in how AI systems behave once deployed—how humans interact with them, and how unintended consequences are identified and managed.

Energy is another area where Ontario Tech has built serious capability. The university is home to Canada’s only accredited undergraduate Nuclear Engineering program, which is ranked third in North America and designated as an IAEA Collaborating Centre. In discussions with Prof. Hossam Gaber, the emphasis was on smart energy systems, where software, sensing, and control systems are developed alongside the physical energy infrastructure they operate within.

I also spent time with Prof. Haoxiang Lang, whose work in robotics, automotive systems, and advanced mobility sits at the intersection of computation and the physical world.

That work is closely tied to the Automotive Centre of Excellence, which includes a climatic wind tunnel described as one of the largest and most sophisticated of its kind in the world. The facility enables full-scale testing under extreme environmental conditions—from arctic cold to desert heat—and supports research that needs to be validated under real operating constraints.

I can’t possibly mention all the conversations I had over the course of the day—it was a full schedule—but I also spent time with Dean Hossam Kishawy and Dr. Osman Hamid, discussing how research, entrepreneurship, and industry engagement fit together at Ontario Tech.

The day also included time at Brilliant Catalyst, the university’s innovation hub, speaking with students and founders about entrepreneurship. I had the opportunity to give a keynote on entrepreneurship, and the visit ended with the pitch competition, where I handed the cheque to the winning team—a small moment that underscored how early many technical journeys begin.

Ontario Tech may be young, but it is already operating with the structure and discipline of a mature research institution, while retaining the adaptability of a newer one.

Thank you to Sunny Chen and the Ontario Tech team for the time, access, and thoughtful conversations throughout the day.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Dual Use Is the Next Frontier of Deep Tech

I wrote my master’s thesis on Code Division Multiple Access, or CDMA, a wireless communication technology that originated from military needs in World War II. CDMA uses a technique called direct sequence spread spectrum, which spreads a signal across a wide bandwidth so that it appears as random noise. This made it far more resistant to jamming, interception, and eavesdropping. Needless to say, it was perfect for military environments long before it found its way into everyday communication.
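To make the spreading idea concrete, here is a minimal toy sketch in Python. The 8-chip spreading code, the noise level, and the payload bits are all made-up illustrative values; real CDMA systems use far longer codes and far more sophisticated receivers.

```python
import numpy as np

# Toy direct sequence spread spectrum (DSSS): spread each bit over several
# "chips" using a pseudo-noise code, then recover it by correlating with
# the same code. All parameters here are illustrative, not real CDMA values.
rng = np.random.default_rng(0)

chips_per_bit = 8
code = rng.choice([-1, 1], size=chips_per_bit)    # pseudo-noise spreading code
data_bits = np.array([1, -1, 1, 1])               # payload as +/-1 symbols

# Spreading: each bit is multiplied by the whole code, widening the bandwidth.
tx = np.repeat(data_bits, chips_per_bit) * np.tile(code, len(data_bits))

# Channel: add noise, so the chips look noise-like to an outside observer.
rx = tx + 0.8 * rng.standard_normal(tx.size)

# Despreading: correlate each chip block with the code and take the sign.
blocks = rx.reshape(len(data_bits), chips_per_bit)
recovered = np.sign(blocks @ code).astype(int)

print("sent:     ", data_bits)
print("recovered:", recovered)
```

A receiver that knows the code can pull the bits back out of the noise; one that does not simply sees something close to random noise, which is why the technique resists interception and jamming.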

A startup company called Qualcomm was beginning to commercialize CDMA. I spent countless hours studying their technical papers, which demonstrated how a technology with military-grade robustness could also be applied to large-scale commercial mobile networks. Working on that thesis in the 90s was also the first time I encountered the idea of dual use: a technology that can be used in both military and civilian environments, a concept that has existed since the post–World War II era.

Geopolitics Has Recentered Dual Use

Fast forward to today. Geopolitics has returned to the foreground. Defence budgets around the world are rising. Countries are rethinking supply chains and rediscovering the importance of technological sovereignty. The focus is no longer only on wartime capability but also on the resilience of civilian systems that society relies on every day.

In this environment, dual use has moved from the background to the forefront of national strategy. In the AI era we are in, governments everywhere are looking for new technologies that strengthen national security and economic competitiveness at the same time. Technologies that once seemed far removed from defence are now recognized as essential.

A Tailwind for Deep Tech

For Two Small Fish Ventures, none of this comes as a surprise. Deep tech has always lived at the intersection of what is scientifically hard and what is societally important. Today, it naturally lends itself to dual use.

Breakthroughs in the five areas that TSF invests in — vertical AI platforms, physical AI, AI infrastructure, advanced computing hardware, and smart energy — were never designed to be solely military. Yet many of these technologies have clear applications in resilience, cybersecurity, automation, sensing, communication, and energy stability.

In other words, dual use does not narrow a company’s mission. It broadens it. It is the rare case where one innovation can truly kill two birds with one stone.

Defence Technology Is Not Only About Weapons

There is a common misconception that defence technology refers only to weapons. That has never been true.

Most technologies are neutral. I am certain our national defence department uses Microsoft Office, for instance. This is a reminder that much of what defence departments buy is not lethal but operational.

To be clear, we do not invest in companies whose sole purpose is military lethal weapons systems.

Our focus remains on building companies in the areas where we believe the next frontier of computing is taking shape. When those technologies also support national resilience, that is not mission drift. It is simply the nature of deep tech.

Deep tech requires scientific and engineering breakthroughs that are difficult to copy. In a dual use environment, this becomes an essential advantage.

A New Frontier for Founders

Founders often think of defence as a separate world. That is changing. Defence is a complicated beast, and anyone who believes they can simply walk in will be disappointed. But for those who understand the landscape and can navigate it, this is a generational opportunity waiting to be captured.

When I first studied CDMA decades ago, I never imagined that a communication technique developed for the battlefield would become the backbone of commercial wireless networks.

Today, many deep tech founders are standing at a similar moment. For founders and investors in deep tech, this is the beginning of an important cycle. And we are excited to support the innovators who will define what comes next.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Geopolitics Now Matters to Every CEO

In October, at our Two Small Fish Ventures AGM, I had the chance to sit down with Benjamin Bergen for a fireside chat. At the time, he was still leading the Council of Canadian Innovators. None of us knew he would soon become the new CEO of the CVCA. Looking back, the timing could not have been better.

I have known Benjamin for many years. When I was CEO of Wattpad, I worked closely with him through CCI, which played an important role in advocating for Canadian scaleups. That experience gave me a front row view of how policy, talent mobility, capital, and global markets intersect. I did not expect that perspective to become even more useful on the investor side, but today it is proving to be exactly that.

At Two Small Fish, our portfolio founders often hear us talk about our full cycle view of company building. We have built companies, operated them at global scale, navigated regulatory and geopolitical realities, and now invest across deep tech. We have seen the journey from the very first product decision all the way to commercialization. That experience matters today because geopolitics is no longer something happening far away. It is showing up directly in the work of founders.

The World Has Changed Irreversibly

Founders do not always think about politics, especially geopolitics. I certainly did not in my early days as a founder. But over the past year, the global environment has shifted in ways that affect talent, capital, customers, supply chains, and data. These forces are becoming part of the operating conditions for every innovative company.

At the AGM, Benjamin and I spent time unpacking what this new reality looks like.

  • Talent: We spoke about the growing brain drain and how global mobility is changing. The tightening of the H-1B program in the United States has created a ripple effect across the entire talent ecosystem. Early stage companies are rethinking where they build teams, and immigration policy is becoming a strategic consideration rather than an afterthought.
  • Capital: The rise of protectionism and shifting global alliances are affecting how and where capital can move. The changing dynamics among the United States, China, and Canada raise new questions for both founders and investors. Some are beginning to view geographic diversification as a practical response to political uncertainty.
  • Customers: National preference policies such as Buy Canadian and Buy American are becoming more common. These policies may begin as political statements, but they influence real procurement and partnership decisions. For founders, gaining early customers is no longer just about product and timing. There is a political dimension that needs to be understood.
  • Infrastructure and Defence: We also talked about how export controls and security requirements are expanding. Technologies that once seemed purely commercial are now viewed through a strategic lens. Even young companies are discovering that they may be operating in areas that governments consider sensitive.
  • Supply Chains: Global supply chains have shown their fragility in areas such as semiconductors, rare earth materials, and energy. These vulnerabilities create friction but also open new opportunities for companies building more resilient and regional alternatives.
  • Data Sovereignty: Data localization and national data governance rules continue to spread. More countries want their data stored and processed within their borders. For companies operating internationally, this introduces new architectural and operational decisions much earlier in the journey.

Benjamin also shared how CCI’s new advisory group, Signa Strategies, is helping founders navigate exactly these types of challenges. It felt like a natural evolution of the work he has been doing for years.

As our conversation wrapped up, I was reminded how valuable it is to have seen this ecosystem from both sides. As a founder, I saw how talent, markets, and policy could quietly redirect a company’s path. Through CCI, I saw how national priorities and regulation shape the environment innovators work in. These experiences feel especially relevant now. The geopolitical questions that once appeared at the edges are moving closer to the center.

This is the environment founders are building in today. And with our full cycle experience, we hope to help them navigate it with clarity, context, and confidence.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Reflections from the Impact 2025 Summit

I had the opportunity to join a panel at the Impact 2025 Summit in Calgary, moderated by Raissa Espiritu, with Janet Bannister and Paul Godman. Ironically, none of us are labelled as impact investors, and I explained on stage why Two Small Fish Ventures does what we do.

At Two Small Fish Ventures, we’ve never called ourselves an impact fund. That’s not because we’re indifferent to impact; in fact, it’s core to what we do. Our focus is on deep tech, the next frontier of computing, where innovation can create meaningful, long-term change. Specifically, we invest in five key areas: Vertical AI Platforms, Physical AI, AI Infrastructure, Advanced Computing Hardware, and Smart Energy.

We care deeply about scientific advancement, and more importantly, about turning those breakthroughs into real-world impact. That’s how meaningful progress happens.

Eva is our General Partner, and both of us are immigrants. Diversity isn’t a marketing point for us; it’s part of who we are. It naturally shows up in our portfolio: about half of our companies have at least one female founder, and many come from underrepresented backgrounds. That said, we uncompromisingly back amazing deep tech founders who are turning their creations into world-class companies.

It’s actually rare that we talk about topics like investing in women or in underrepresented groups in isolation. Not because we don’t care; quite the opposite. The fact that Eva is one of the few female GPs leading a venture fund, and that we’re both immigrants, already says a lot. Our actions speak volumes. We don’t just talk the talk; we walk the walk.

We need to deliver results. Period. Our competition isn’t other venture funds; it’s every other investment opportunity available in the market. If we can’t perform at the highest level — top decile in everything we do — we can’t sustain our mission. Delivering some of the best results in the industry enables us to do what we love and make an impact.

That’s why I believe impact and performance are not opposites. The most powerful kind of impact happens when companies succeed, when they become world-class companies. Strong returns and meaningful impact can, and should, reinforce each other.

I also talked about the importance of choosing the right vehicle for the right purpose. When we made a 2 million dollar donation to the University of Toronto to establish the Commercialization Catalyst Prize, it wasn’t about investing. It was about supporting a different kind of impact — helping scientists and engineers turn their research into innovations that can reach the world. Not every kind of impact should come from the same tool.

At the end of the day, labels matter less than intent and execution. We don’t need to call ourselves an impact fund to make a difference. Our goal is simple: to back bold deep tech founders using science and technology to build a better future and to do it with excellence.

A big thank you to Raissa, George Damian, Sylvia Wang, and the entire Platform Calgary team for putting together such a thoughtful and well-run event.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Quantum: From Sci-Fi to Investable Frontier

When I was studying electrical engineering, I chose, out of curiosity, to take an elective course on quantum physics as part of advanced optics. The strange, abstract, counterintuitive rules, such as particles existing in multiple states or being entangled across distance, captivated me and sparked a lasting interest in quantum.

Error correction, closely related to fault tolerance in quantum systems today, is the backbone of telecommunications, one of the areas I majored in.

Little did I know these domains would converge in such a way that my earlier academic training would become relevant again years later.

For me, computing is not just my profession; it is also my hobby. As a science nerd, I actively enjoy following advances, and I keep going deeper down the rabbit hole of the next frontier of computing. That mix of personal curiosity and professional focus shapes how I approach both the opportunities and risks in the space. Over the past few years, I have gone deeper into the world of quantum. My academic and professional background gave me the footing to evaluate both what is technically possible and what is commercially viable.

From If to How and When

In June, I wrote Quantum Isn’t Next. It’s Now. We have passed the tipping point where the question is no longer if quantum technology will work, but how and when it will scale.

This momentum is not just visible to those of us deep in the field. As the Globe and Mail recently reported, we at Two Small Fish have been following quantum for years, but did not think it was mature enough for an early-stage fund with a 10-year lifespan to back. This year, we changed our minds. As I shared in that article: “It’s much more investible now.”

The distinction is clear: when quantum was still a science problem, the central question was whether it could work at all. Now that it has become an engineering problem, the questions are how it will work at scale and when it will be ready for commercialization.

This shift matters for investors. Venture capital focuses on engineering breakthroughs: hard and uncertain, but achievable on a commercialization timeline. Fundamental science, which can take many more years to mature, is better supported by governments, universities, and non-dilutive funding sources. I will leave that discussion for another post.

One of Five Frontiers

At Two Small Fish Ventures, we have identified five areas shaping the next frontier of computing. Quantum falls under the area of advanced computing hardware, where the convergence of different areas of science, engineering, and commercialization is accelerating.

Each of these areas is no longer a speculative science experiment but a rapidly advancing field where engineering and commercialization are converging. Within the next ten years, the winners will emerge from lab prototypes and become scaled companies. Quantum is firmly on that trajectory.

How We Invest in Quantum

Our first principle at Two Small Fish is straightforward: we only invest in things we truly understand, through all three lenses of technology, product, and commercialization. That discipline forces us to dig deep before committing capital. And after years of study, it is clear to us that quantum has moved into investable territory, but only selectively.

Not every quantum startup fits a venture time horizon. Some promising projects will take too many years to scale. But we are now seeing opportunities that, within a 10-year window, can realistically grow from an early-stage idea to a successful scale-up. That is the standard we apply to every investment, and quantum finally has companies that meet it.

From Sci-Fi to Reality

Canada has played an outsized role in building the foundation of quantum science. Now, it has the chance to lead in quantum commercialization. The next few years will determine which teams turn breakthrough science into enduring companies.

For investors, this is both an opportunity and a responsibility. The quantum era is not a distant possibility; it is here now. What once sounded like science fiction is now an investable reality. And for those willing to put in the work to understand it, the frontier is already here.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: Axiomatic

Last year we invested in Axiomatic AI. Their mission is to bring verifiable and trustworthy AI into science and engineering, enabling innovation in areas where rigour and reliability are essential. At the core of this is Mission 10×30: achieving a tenfold improvement in scientific and engineering productivity by 2030.

The company was founded by top researchers and professors from MIT, the University of Toronto, and ICFO in Barcelona, bringing deep expertise in physics, computer science, and engineering.

Since our investment, the team has been heads down executing. Now they’ve shared their first public release: Axiomatic Operators.

What They’ve Released

Axiomatic Operators are MCP servers that run directly in your IDE, connecting with systems like Claude Code and Cursor. The suite includes:

  • AxEquationExplorer
  • AxModelFitter
  • AxPhotonicsPreview
  • AxDocumentParser
  • AxPlotToData
  • AxDocumentAnnotator
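For readers unfamiliar with MCP (Model Context Protocol), here is a minimal sketch of what such a server looks like, assuming the official `mcp` Python SDK. The server name and the single tool below are hypothetical placeholders for illustration, not part of the Axiomatic suite.

```python
# A minimal, hypothetical MCP server sketch (assumes the official `mcp` Python SDK).
# The server name and tool are placeholders, not Axiomatic's actual operators.
from mcp.server.fastmcp import FastMCP

server = FastMCP("example-science-tools")

@server.tool()
def fit_linear_model(xs: list[float], ys: list[float]) -> dict:
    """Fit y = a*x + b by least squares and return the coefficients."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    return {"slope": slope, "intercept": mean_y - slope * mean_x}

if __name__ == "__main__":
    # A client such as Claude Code or Cursor launches this process and calls
    # the tool over stdio whenever the model decides it is needed.
    server.run()
```

The real operators wrap far deeper domain logic, but the connection mechanism is the same idea: the IDE's AI assistant discovers the tools a server exposes and can call them mid-conversation.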

Why is this important?

Large Language Models (LLMs) excel at language (as their name suggests) but struggle with logic. That’s why AI can write poetry but often has trouble with math — LLMs mainly rely on pattern matching rather than reasoning.

This is where Axiomatic steps in. Their approach combines advances in reinforcement learning, LLMs, and world models to create AI that is not just fluent but also capable of reasoning with the rigour required in science and engineering.

What’s Next

This first release marks an important step in turning their mission into practical, usable tools. In the coming weeks, the team will share more technical material — including white papers, demo videos, GitHub repositories, and case studies — while continuing to work closely with early access partners.

Find out more on GitHub, including demos, case studies, and everything else you need to make your work days less annoying and more productive: Axiomatic AI GitHub

We’re excited to see their progress. If you’re in science or engineering, we encourage you to give the Axiomatic Operators suite a try: Axiomatic AI.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Jevons Paradox: Why Efficiency Fuels Transformation

In 1865, William Stanley Jevons, an English economist, observed a curious phenomenon: as steam engines in Britain became more efficient, coal use didn’t fall — it rose. Efficiency lowered the cost of using coal, which made it more attractive, and demand surged.

That insight became known as Jevons Paradox. To put it simply:

  • Technological change increases efficiency or productivity.
  • Efficiency gains lead to lower consumer prices for goods or services.
  • The reduced price creates a substantial increase in quantity demanded (because demand is highly elastic).

Instead of shrinking resource use, efficiency often accelerates it — and with it, broader societal change.
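A tiny numerical sketch shows the mechanism. Assume a constant-elasticity demand curve Q = k / P^e with elasticity e greater than 1; the constants below are illustrative, not estimates for any real market.

```python
# Illustrative Jevons-style rebound under an assumed constant-elasticity
# demand curve Q = K * P**(-ELASTICITY). All parameter values are made up.

ELASTICITY = 1.5   # assumed: demand is highly elastic (greater than 1)
K = 100.0          # assumed scale constant

def quantity_demanded(price: float) -> float:
    return K * price ** (-ELASTICITY)

for price in [1.00, 0.50, 0.10]:
    q = quantity_demanded(price)
    print(f"price {price:.2f}: quantity {q:10.1f}, total spend {price * q:8.1f}")
```

Each efficiency-driven price cut raises not only the quantity consumed but, with elasticity above 1, total spending on the resource as well, which is the rebound Jevons observed with coal.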

Coal, Then Light

The paradox first appeared in coal: better engines, more coal consumed. Electricity followed a similar path. Consider lighting in Britain:

(True price of lighting in year-2000 pounds per million lumen-hours; per-capita consumption in thousand lumen-hours; total consumption in billion lumen-hours.)

  • 1800: price £8,000; per-capita consumption 1.1; total consumption 18
  • 1900: price £250 (↓ ~30×); per-capita 255 (↑ ~230×); total 10,500 (↑ ~500×)
  • 2000: price £2.5 (↓ ~3,000× vs. 1800, ↓ ~100× vs. 1900); per-capita 13,000 (↑ ~13,000× vs. 1800, ↑ ~50× vs. 1900); total 775,000 (↑ ~40,000× vs. 1800, ↑ ~74× vs. 1900)

Over two centuries, the price of light fell 3,000×, while per-capita use rose 13,000× and total consumption rose 40,000×. A textbook case of Jevons Paradox — efficiency driving demand to entirely new levels.

Computing: From Millions to Pennies

This pattern carried into computing:

  • 1984: $18.7 million per gigaflop (~$46 million today), the early supercomputing era
  • 2000: $640 per gigaflop (~$956 today), mainstream affordability
  • 2017: $0.03 per gigaflop, virtually free compute

That’s a 99.99%+ decline. What once required national budgets is now in your pocket.

Storage mirrored the same story: by 2018, 8 TB of hard drive storage cost under $200 — about $0.019 per GB, compared to thousands per GB in the mid-20th century.

Connectivity: Falling Costs, Rising Traffic

Connectivity followed suit:

  • 2000: dial-up / early DSL (<1 Mbps) at ~$1,200 per Mbps (U.S.); global internet traffic ~84 PB/month
  • 2010: ~5 Mbps broadband at ~$25 per Mbps; ~20,000 PB/month
  • 2023: 100–940 Mbps common, with prices down ~60% since 2015 in real terms; >150,000 PB/month

(PB = petabytes)

As costs collapsed, demand exploded. Streaming, cloud services, social apps, mobile collaboration, IoT — all became possible because bandwidth was no longer scarce.

Intelligence: The New Frontier

Now the same dynamic is unfolding with intelligence:

  • 2021: ~$60 per million tokens (early GPT-3 / GPT-4 era)
  • 2023: ~$0.40–$0.60 per million tokens (GPT-3.5-scale models)
  • 2024: under $0.10 per million tokens (GPT-4o and peers)

That’s a drop of more than two orders of magnitude in just a few years. Unsurprisingly, demand is surging — AI copilots in workflows, large-scale analytics in enterprises, and everyday generative tools for individuals.

As we highlighted in our TSF Thesis 3.0, cheap intelligence doesn’t just optimize existing tasks. It reshapes behaviour at scale.

Why It Matters

The recurring pattern is clear:

  • Coal efficiency fueled the Industrial Revolution.
  • Affordable lighting built electrified cities.
  • Cheap compute and storage enabled the digital economy.
  • Low-cost bandwidth drove streaming and cloud collaboration.
  • Now cheap intelligence is reshaping how we live, work, and innovate.

As we highlighted in Thesis 3.0:

“Reflecting on the internet era… as ‘the cost of connectivity’ steadily declined, productivity and demand surged—creating a virtuous cycle of opportunities. The AI era shows remarkable parallels. AI is the first technology capable of learning, reasoning, creativity… Like connectivity in the internet era, ‘the cost of intelligence’ is now rapidly declining, while the value derived continues to surge, driving even greater demand.”

The lesson is simple: efficiency doesn’t just save costs — it reorders economies and societies. And that’s exactly what is happening now.

If you are building an early-stage deep tech startup in the next frontier of computing, this is a generational opportunity: both traditional businesses and entirely new sectors are being reshaped, and white-collar jobs and businesses in particular will not be the same. We would love to hear from you.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Five Areas Shaping the Next Frontier

The cost of intelligence is dropping at an unprecedented rate. Just as the drop in the cost of computing unlocked the PC era and the drop in the cost of connectivity enabled the internet era, falling costs today are driving explosive demand for AI adoption. That demand creates opportunity on the supply side too, in the infrastructure, energy, and technologies needed to support and scale this shift.

In our Thesis 3.0, we highlighted how this AI-driven platform shift will reshape behaviour at massive scale. But identifying the how also means knowing where to look.

Every era of technology has a set of areas where breakthroughs cluster, where infrastructure, capital, and talent converge to create the conditions for outsized returns. For the age of intelligent systems, we see five such areas, each distinct but deeply interconnected.

1. Vertical AI Platforms

After large language models, the next wave of value creation will come from Vertical AI Platforms that combine proprietary data, hard-to-replicate models, and orchestration layers designed for complex and large-scale needs.

Built on unique datasets, workflows, and algorithms that are difficult to imitate, these platforms create proprietary intelligence layers that are increasingly agentic. They can actively make decisions, initiate actions, and shape workflows. This makes them both defensible and transformative, even when part of the foundation rests on commodity models.

This shift from passive tools to active participants marks a profound change in how entire sectors operate.

2. Physical AI

The past two decades of digital transformation mostly played out behind screens. The next era brings AI into the physical world.

Physical AI spans autonomous devices, robotics, and AI-powered equipment that can perceive, act, and adapt in real environments. From warehouse automation to industrial robotics to autonomous mobility, this is where algorithms leave the lab and step into society.

We are still early in this curve. Just as industrial machinery transformed factories in the nineteenth century, Physical AI will reshape industries that rely on labour-intensive, precision-demanding, or hazardous work.

The companies that succeed will combine world-class AI models with robust hardware integration and build the trust that humans place in systems operating alongside them every day.

3. AI Infrastructure

Every transformative technology wave has required new infrastructure that is robust, reliable, and efficient. For AI, this means going beyond raw compute to ensure systems that are secure, safe, and trustworthy at scale.

We need security, safety, efficiency, and trustworthiness as first-class priorities. That means building the tools, frameworks, and protocols that make AI more energy efficient, explainable, and interoperable.

The infrastructure layer determines not only who can build AI, but who can trust it. And trust is ultimately what drives adoption.

4. Advanced Computing Hardware

Every computing revolution has been powered by a revolution in hardware. Just as the transistor enabled mainframes and the microprocessor ushered in personal computing, the next era will be defined by breakthroughs in semiconductors and specialized architectures.

From custom chips to new communication fabrics, hardware is what makes new classes of AI and computation possible, both in the cloud and on the edge. But it is not only about raw compute power. The winners will also tackle energy efficiency, latency, and connectivity, areas that become bottlenecks as models scale.

As Moore’s Law hits its limit, we are entering an age of architectural innovation with neuromorphic computing, photonics, quantum computing, and other advances. Much like the steam engine once unlocked new industries, these architectures will redefine what is computationally possible. This is deep tech meeting industrial adoption, and those who can scale it will capture immense value.

5. Smart Energy

Every technological leap has demanded a new energy paradigm. The electrification era was powered by the grid. Today, AI and computing are demanding unprecedented amounts of energy, and the grid as it exists cannot sustain this future.

This is why smart energy is not peripheral, but central. From new energy sources to intelligent distribution networks, the way we generate, store, and allocate energy is being reimagined. The idea of programmable energy, where supply and demand adapt dynamically using AI, will become as fundamental to the AI era as packet switching was to the internet.

Here, deep engineering meets societal need. Without resilient and efficient energy, AI progress stalls. With it, the future scales.

Shaping What Comes Next

The drop in the cost of intelligence is driving demand at a scale we have never seen before. That demand creates opportunity on the supply side too, in the platforms, hardware, energy, physical systems, and infrastructure that make this future possible.

The five areas — Vertical AI Platforms, Physical AI, AI Infrastructure, Advanced Computing Hardware, and Smart Energy — represent the biggest opportunities of this era. They are not isolated. They form an interconnected landscape where advances in one accelerate breakthroughs in the others.

We are domain experts in these five areas. The TSF team brings technical, product and commercialization expertise that helps founders build and scale in precisely these spaces. We are uniquely qualified to do so.

At Two Small Fish, this is the canvas for the next generation of 100x companies. We are excited to partner with the founders building in these areas globally, those who not only see the future, but are already shaping it.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Backing the Scientists Who Helped Invent Blockchain with SureMark Digital

A few years back, Eva met Dr. Scott Stornetta. Later, I did too. Scott and Dr. Stuart Haber are widely credited as the creators of blockchain, a technology built on an idea that was simple but, at the time, radical: decentralization. No single authority, no central point of control, just a trusted system everyone can rely on.

Now, these two scientists are teaming up again to start a new company, SureMark Digital. Their mission is to bring that same decentralized philosophy to identity and authenticity on the internet, enabling anyone to prove who they are, certify their work, and push back against deepfakes and impersonation. No middlemen. No central gatekeepers.

It took us about 3.141592654 seconds to get excited. We are now proud to be the co-lead investor in SureMark’s first institutional round.

At Two Small Fish, we love backing frontier tech that can reshape large-scale behaviour. SureMark checks every box.

Eva has written a deeper dive on what they are building and why it matters. You can read it here.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Masterclass Series: The Triathlon Rule of Deep Tech Startups

A swimming world champion, a cycling champion, and a marathon champion each tried their hand at a triathlon.

None of them even came close to the podium. All were easily defeated.

Why?

Because the swimming champion could not bike, nor could he run fast.

The cycling champion did not swim well.

The marathon runner was painfully slow in the water.

The winner?

It was someone who had been humbled by the swimming champion in the pool for years, finishing second in the world championships multiple times. He was an exceptional swimmer, yes. However, he could also bike fast and run hard. Not the best in any single discipline, but strong across all three. And that is what won him the race.

The takeaway:

To win in triathlon, you need to be competitive in all three disciplines.

The winner is often world-class in one of them, but they must be very good, if not great, at the other two.

Many first-time deep tech founders make the equivalent mistake.

They believe that superior technology alone is enough to win.

It is not.

While technology is crucial, and in fact it is table stakes and the foundation of innovation, it must be transformed into a usable product. If it does not solve a real problem in a way people can adopt and benefit from, its brilliance is wasted.

And even if you have built world class technology and a beautifully crafted product, you are still not done. Without effective commercialization, which includes distribution, pricing, sales, positioning, and partnerships, you will not reach the users or customers who need what you have built.

I wrote more about this in The Three Phases of Building a Great Tech Company: Technology, Product, and Commercialization. Each phase demands different skills. Each must be taken seriously.

Neglecting any one of them is like trying to win a triathlon without training for the bike or the run.

Just like a triathlete must train in all three disciplines, a founder must excel across all three pillars:

  • Great and defensible technology
  • An excellent product
  • Execution on commercialization

You need all three.

That is how you win the world championship.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Computing. Then Connectivity. Then Intelligence. For Half a Century, Cost Collapses Drove Massive Adoption.

In the history of human civilization, there have been several distinct ages: the Agricultural Age, the Industrial Age, and the Information Age, which we are living in now.

Within each age, there are different eras, each marked by a drastic drop in the cost of a fundamental “atomic unit.” These cost collapses triggered enormous increases in demand and reshaped society by changing human behaviour at scale.

From the late 1970s to the 1990s, the invention of the personal computer drastically reduced the cost of computing [1]. A typical CPU in the early 1980s cost hundreds of dollars and ran at just a few MHz. By the 1990s, processors were orders of magnitude faster for roughly the same price, unlocking entirely new possibilities like spreadsheets and graphical user interfaces (GUIs).

Then, from the mid-1990s to the 2010s, came the next wave: the Internet. It brought a dramatic drop in the cost of connectivity [2]. Bandwidth, once prohibitively expensive, fell by several orders of magnitude — from over $1,200 per Mbps per month in the ’90s to less than a penny today. This enabled browsers, smartphones, social networks, e-commerce, and much of the modern digital economy.

From the mid-2010s to today, we’ve entered the era of AI. This wave has rapidly reduced the cost of intelligence [3]. Just two years ago, generating a million tokens using large language models cost over $100. Today, it’s under $1. This massive drop has enabled applications like facial recognition in photo apps, (mostly) self-driving cars, and — most notably — ChatGPT.

These three eras share more than just timing. They follow a strikingly similar pattern:

First, each era is defined by a core capability: computing, connectivity, and intelligence, respectively.

Second, each unfolds in two waves:

  • The initial wave brings a seemingly obvious application (though often only apparent in hindsight), such as spreadsheets, browsers, or facial recognition.
  • Then, typically a decade or so later, a magical invention emerges — one that radically expands access and shifts behaviour at scale. Think GUI (so we no longer needed to use a command line), the iPhone (leapfrogging flip phones), and now, ChatGPT.

Why does this pattern matter?

Because the second-wave inventions are the ones that lower the barrier to entry, democratize access, and reshape large-scale behaviour. The first wave opens the door; the second wave throws it wide open. It’s the amplifier that delivers exponential adoption.

We’ve seen this movie before. Twice already, over the past 50 years.

The cost of computing dropped, and it transformed business, productivity, and software.

Then the cost of connectivity dropped, and it revolutionized how people communicate, consume, and buy.

Now the cost of intelligence is collapsing, and the effects are unfolding even faster.

Each wave builds on the last. The Internet era was evolving faster than the PC era because the former leveraged the latter’s computing infrastructure. AI is moving even faster because it sits atop both computing and the Internet. Acceleration is not happening in isolation. It’s compounding.

If it feels like the pace of change is increasing, it’s because it is.

Just look at the numbers:

  • Windows took over 2 years to reach 1 million users.
  • Facebook got there in 10 months.
  • ChatGPT did it in 5 days.

These aren’t just vanity metrics — they reflect the power of each era’s cost collapse to accelerate mainstream adoption.

That’s why it’s no surprise — in fact, it’s crystal clear — that the current AI platform shift is more massive than any previous technological shift. It will create massive new economic value, shift wealth away from many incumbents, and open up extraordinary investment opportunities.

That’s why the succinct version of our thesis is:

We invest in the next frontier of computing and its applications, reshaping large-scale behaviour, driven by the collapsing cost of intelligence and defensible through tech and data moats.

(Full version here).

The race is already on. We can’t wait to invest in the next great thing in this new era of intelligence.

Super exciting times ahead indeed.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!


Footnotes

[1] Cost of Computing

In 1981, the Intel 8088 CPU (used in the first IBM PC) had a clock speed of 4.77 MHz and cost ~$125. By 1995, the Intel Pentium processor ran at 100+ MHz and cost around $250 — a ~20x speed gain at similar cost. Today’s chips are thousands of times faster, and on a per-operation basis, exponentially cheaper.

[2] Cost of Connectivity

In 1998, bandwidth cost over $1,200 per Mbps/month. By 2015, that figure dropped below $1. As of 2024, cloud bandwidth pricing can be less than $0.01 per GB — a near 100,000x drop over 25 years.

[3] Cost of Intelligence

In 2022, generating 1 million tokens via OpenAI’s GPT-3.5 could cost $100+. In 2024, it costs under $1 using GPT-4o or Claude 3.5, with faster performance and higher accuracy — a 100x+ reduction in under two years.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Quantum Isn’t Next. It’s Now.

In the early 2000s, it was a common joke in the tech world that “next year is the year of the smartphones.” People kept saying it over and over for almost a decade. It became a punchline. The industry nearly lost its credibility.

Until the iPhone launched. “Next year is the year of the smartphones” finally became true.

The same joke has followed quantum for the past ten years: next year is the year of quantum.

Except it hasn’t been. Not yet.

And yet, quietly, the foundations have been built. We’re not there, but we’re far from where we started.

We’re getting closer. Much closer. I can smell it. I can hear it. I can sense it.

Right now, without getting into too much technical detail, we’re still at a small scale: fewer than 100 usable qubits. Commercial viability likely requires thousands, if not millions. The systems are still too error-prone, and hosting your own quantum machine is wildly impractical. They’re expensive, fragile, and noisy.

At this stage, quantum is mostly limited to niche or small-scale applications. But step by step, quantum is inching closer to broader utility.

And while these things don’t progress in straight lines, the momentum is real and accelerating.

Large-scale, commercially deployable, fault-tolerant quantum computers accessed through the cloud are no longer science fiction. They’re within reach.

I spent a few of my academic years in signal processing and error correction. I’ve also spent a bit of time studying quantum mechanics. I understand the challenges of cloud-based access to quantum systems, and I’ve been following the field for quite a while, mostly as a curious science nerd.

All of that gives me reason to trust my sixth sense. Quantum is increasingly becoming a reality.

Nobody knows exactly when the iPhone moment or the ChatGPT moment of quantum will happen.
But I’m absolutely sure we won’t still be saying “next year is the year of quantum” a decade from now.

It will happen, and it will happen much sooner than you might think.

At Two Small Fish, our thesis is centred around the next frontier of computing and its applications.

This is an exciting time and the ideal time to take a closer look at quantum, because the best opportunities tend to emerge right before the technology takes off.

How can we not get excited about new quantum investment opportunities?

P.S. I’m excited to attend the QUANTUM NOW conference this week in Montreal. Also thrilled to see Mark Carney name quantum as one of Canada’s official G7 priorities. That short statement may end up being a big milestone.

P.P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Announcing TSF’s Investment in ENVGO

Humans have conquered land, sea, and space.

Yet the ocean remains surprisingly underdeveloped — in fact, it’s the least developed.

Land transportation has been electrified. In space, payload costs have dropped drastically. Now, it’s time for marine to catch up.

Unlike cars, you can’t simply add an electric motor and battery to a boat and make it work. Why? One reason is that water is far denser and more viscous than air, so drag is an order of magnitude greater. As a result, replacing a gas motor with an electric one would require a gigantic battery, making it impractical and, frankly, unusable. That’s why marine electrification has lagged.

Until now. 

The “iPhone moment” of marine transportation has arrived. ENVGO’s hydrofoiling NV1 tackles these multidisciplinary complications head-on. Led by successful serial entrepreneur Mike Peasgood, the team brings together expertise in AI, robotics, control systems, computer vision, autonomous systems, and more. Leveraging their prior success as drone pioneers at Aeryon, they are now building a flying robot — on water.

It’s day one of a large-scale transformation of marine transportation. Two Small Fish is privileged and super excited to lead this round of funding, alongside our good friends at Garage, who are also participating. We can’t wait to see how ENVGO reimagines the uncharted waters — pun fully intended.

Read our official blog post by our partner Albert here.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Wattpad Was My Regular Season. TSF Is My Playoff Hockey

When entrepreneurs exit their companies, it is supposed to be a victory lap. But in reality, many find themselves in an unexpected emotional vacuum. More often than you might think, I hear variations of the same quiet confession:

“It should have been the best time of my life. But I felt lost after the exit. I lost my purpose.”

After running Wattpad for 15 years, I understand this all too well. It is like training for and running a marathon for over a decade, only to stop cold the day after the finish line. No more rhythm. No more momentum. No next mile.

Do I Miss Operating

Unsurprisingly, people often ask me:

“Do you like being a VC?”

“Do you miss operating?”

My honest answer is yes and yes (but I get my fix without being a CEO — see below).

Being a founder and CEO was deeply challenging and also immensely rewarding. It is a role that demands a decade-long commitment to building one and only one thing. And while I loved my time as CEO, I did not feel the need to do it again. Once in a lifetime was enough. I have started three companies. A fourth would have felt repetitive.

What I missed most was not the title or the responsibility. It was the people. The team. The day-to-day collaboration with nearly 300 passionate employees when I stepped down. That sense of shared mission — of solving hard problems together — was what truly filled my cup.

Back in the Trenches in a Different Role

Now at Two Small Fish Ventures as an operating partner, I work with founders across our portfolio. I am no longer the operator inside the company, but I get to be their sounding board — helping them tackle some of the biggest challenges they face.

Let’s be honest: they call me especially when they believe I am the only one who can help them. Their words, not mine. And there have been plenty of those occasions.

That gives me the same hit of adrenaline I used to get from operating. At my core, I love solving hard problems. That part of me did not go away after my exit. I just found a new arena for it — and it is a perfect replacement.

A Playground for a Science Nerd

What people may not realize is that the deep tech VC job is drastically different from a “normal” VC job. As a deep tech VC, I am constantly stretched and pushed to go deep — technically, intellectually, and creatively. It forces me to stay sharp, push my boundaries, and reconnect with my roots as a curious, wide-eyed science nerd.

There is something magical about working with founders at the bleeding edge of innovation. I get to dive into breakthrough technologies, understand how they work, and figure out how to turn them into usable and scalable products. It feels like being a kid in a candy store — except the candy is semiconductors, control systems, power electronics, quantum, and other domains in the next frontier of computing.

How could I not love that?

Ironically, I had less time to indulge this curiosity when I was a CEO. Now I can geek out and help shape the future at the same time. It is a net positive to me.

You Do Not Have to Love It All

Of course, every job — including CEO and VC — has its less glamorous parts. Whether you are a founder or a VC, there will always be administrative tasks and responsibilities you would rather skip.

But I have learned not to resent them. As I often say:

“You do not need to love every task. You just need to be curious enough to find the interesting angles in anything.”

Those tasks are the cost of admission to being a deep tech VC. A small price to pay to do the work I love — supporting incredible entrepreneurs as they bring transformative ideas to life, and finding joy in doing so. And knowing what I know now, I do not think I would enjoy being a “normal” VC. I cannot speak for others, but for me, this is the only kind of venture work that truly energizes and fulfills me.

A New Season. A New Purpose.

So yes, being a VC brings me as much joy as being a CEO did — and arguably even more fulfillment (I am surprised to be saying this). I feel incredibly lucky. And I am all in.

It feels like all my past experience has prepared me for what I do today. I often describe this phase of my life this way:

Wattpad was my regular season. TSF is my playoff hockey.

It is faster. It is grittier. The stakes feel higher. Not because I am building one company, but because I am helping many shape the future.

P.S. Go Oilers!!

TSF Thesis 3.0: The Next Frontier of Computing and Its Applications Reshaping Large-Scale Behaviour

Summary

Driven by rapid advances in AI, the collapse in the cost of intelligence has arrived—bringing massive disruption and generational opportunities.

Building on this platform shift, TSF invests in the next frontier of computing and its applications, backing early-stage products, platforms, and protocols that reshape large-scale behaviour and unlock uncapped, new value through democratization. These opportunities are fueled by the collapsing cost of intelligence and, as a result, the growing demand for access to intelligence as well as its expansion beyond traditional computing devices. What makes them defensible are technology moats and, where fitting, strong data network effects.

Or more succinctly: We invest in the next frontier of computing and its applications, reshaping large-scale behaviour, driven by the collapsing cost of intelligence and defensible through tech and data moats.



Our Evolution: From Network Effects to Deep Tech

When we launched TSF in 2015, our initial thesis centred around network effects. Drawing from our experience scaling Wattpad from inception to 100 million users, we became experts in understanding and leveraging exponential value and defensibility created by network effects at scale. This expertise led us to invest—most as the very first cheque—in massively successful companies such as BenchSci, Ada, Printify, and SkipTheDishes.

We achieved world-class success with this thesis, but like all good things, that opportunity diminished over time.

Our thesis evolved as the ground shifted toward the end of the 2010s. A couple of years ago, we articulated this evolution by focusing on early-stage products, platforms, and protocols that transform user behaviour and empower businesses and individuals to unlock new value. Within this broad focus, we zoomed in specifically on three sectors: AI, decentralized protocols, and semiconductors. That thesis guided investments in great companies such as Story, Ideogram, Zinite, and Blumind.

But the world doesn’t stand still. In fact, it has never changed so rapidly. This brings us to the next and even more significant shift shaping our thesis.


A New Platform Shift: The Cost of Intelligence is Collapsing

Reflecting on the internet era, the core lesson we learned was that the internet was the first technology in human history that was borderless, connected, ubiquitous, real-time, and free. At its foundation was connectivity, and as “the cost of connectivity” steadily declined, productivity and demand surged, creating a virtuous cycle of opportunities.

The AI era shows remarkable parallels. AI is the first technology capable of learning, reasoning, creativity, cross-domain functionality, and decision-making. Like connectivity in the internet era, “the cost of intelligence” is now rapidly declining, while the value derived from intelligence continues to surge, driving even greater demand.

This shift will create massive economic value, shifting wealth away from many incumbents and opening substantial investment opportunities. However, just like previous platform shifts, the greatest opportunities won’t come from digitizing or automating legacy workflows, but rather from completely reshaping workflows and user behaviour, democratizing access, and unlocking previously impossible value. These disruptive opportunities will expand into adjacent areas, leaving incumbents defenceless as the rules of the game fundamentally change.


Intelligence Beyond Traditional Computing Devices

AI’s influence now extends far beyond pre-programmed software on computing devices. Machines and hardware are becoming intelligent, leveraging collective learning to adapt in real-time, with minimal predefined instruction. As we’ve stated before, software alone once ate the world; now, software and hardware together consume the universe. The intersection of software and hardware is where many of the greatest opportunities lie.

As AI models shrink and hardware improves, complex tasks run locally and effectively at the edge. Your phone and other edge devices are rapidly becoming the new data centres, opening exciting new possibilities.


Democratization and a New Lens on Defensibility

The collapse in the cost of intelligence has democratized everything—including software development—further accelerated by open-source tools. While this democratization unlocks vast opportunities, competition also intensifies. It may be a land grab, but not all opportunities are created equal. The key is knowing which “land” to seize.

Historically, infrastructure initially attracts significant capital, as seen in the early internet boom. Over time, however, much of the economic value tends to shift from infrastructure to applications. Today, the AI infrastructure layer is becoming increasingly commoditized, while the application layer is heavily democratized. That said, there are still plenty of opportunities to be found in both layers—many of them truly transformative. So, where do we find defensible, high-value opportunities?

Our previous thesis identified transformative technologies that achieved mass adoption, changed behaviour, democratized access, and unlocked unprecedented value. This framework remains true and continues to guide our evaluation of “100x” opportunities.

This shift in defensibility brings us to where the next moat lies.


New Defensibility: Deep Tech Meets Data Network Effects

Defensibility has changed significantly. In recent years, the pool of highly defensible early-stage shallow tech opportunities has thinned considerably. As a result, we have clearly entered a golden age of deep tech. AI democratization provides capital-efficient access to tools that previously required massive budgets. Our sweet spot is identifying opportunities that remain difficult to build, ensuring they are not easily replicated.

As “full-spectrum specialists,” TSF is uniquely positioned for this new reality. All four TSF partners were engineers and startup leaders before becoming investors, with hands-on experience spanning artificial intelligence, semiconductors, robotics, photonics, smart energy, blockchain, and other domains. We are not just technical; we are also product people, having built and commercialized cutting-edge innovations ourselves. As a guiding principle, we only invest when our deep domain expertise can help startups scale effectively and rapidly cement their place as future industry-disrupting giants.

Moreover, while traditional network effects have diminished, AI has reinvigorated network effects, making them more potent in new ways. Combining deep tech defensibility with strong data-driven network effects is the new holy grail, and this is precisely our expertise.


What We Don’t Invest In

Although we primarily invest in “bits,” we will also invest in “bits and atoms,” but we won’t invest in “atoms only.” We also have a strong bias towards permissionless innovations, so we usually stay away from highly regulated or bureaucratic verticals with high inertia. Additionally, since one of our guiding principles is to invest only when we have domain expertise in the next frontier of computing, we won’t invest in companies whose core IP falls outside of our computing expertise. We also avoid regional companies, as we focus on backing founders who design for global scale from day one. We invest globally, and almost all our breakout successes, such as Printify, have users and customers around the world.


Where We’re Heading

Having recalibrated our thesis for this new era, here’s where we’re going next.

We have backed amazing deep tech founders pioneering AI, semiconductors, robotics, photonics, smart energy, and blockchain—companies like Fibra, Blumind, ABR, Axiomatic, Hepzibah, Story, Poppy, and Viggle—across consumer, enterprise, and industrial sectors. With the AI platform shift underway, many new and exciting investment opportunities have emerged.

The ground has shifted: the old playbook is out, the new playbook is in. It’s challenging, exciting, and we wouldn’t have it any other way.

To recap our core belief, TSF invests in the next frontier of computing and its applications, backing early-stage products, platforms, and protocols that reshape large-scale behaviour and unlock uncapped, new value through democratization. These opportunities are fueled by the collapsing cost of intelligence and, as a result, the growing demand for access to intelligence as well as its expansion beyond traditional computing devices. What makes them defensible are technology moats and, where fitting, strong data network effects.

Or more succinctly: We invest in the next frontier of computing and its applications, reshaping large-scale behaviour, driven by the collapsing cost of intelligence and defensible through tech and data moats.

So, if you’ve built interesting deep tech in the next frontier of computing, we invest globally and can help you turn it into a product. If you have a product, we can help you turn it into a massively successful business. If this sounds like you, reach out.

Together, we will shape the future.

P.S. Please also read our blog post Five Areas Shaping the Next Frontier.

Eva + Allen + Brandon + Albert + Mikayla

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Gensee AI

A solo musician doesn’t need a conductor. Neither does a jazz trio.

But an orchestra? That’s a different story. You need a conductor to coordinate, to make sure all the parts come together.

Same with AI agents. One or two can operate fine on their own. But in a multi-agent setup, the real bottleneck is orchestration.

Yesterday, we announced our investment in GenseeAI. That’s the layer the company is building—the conductor for AI agents, i.e. the missing intelligent optimization layer for AI agents and workflows. Their first product, Cognify, takes AI workflows built with frameworks like LangChain or DSPy and intelligently rewrites them to be 10× faster, cheaper, and more reliable. It’s a bit like “compilation” for AI. Given a high-level workflow, Cognify produces a tuned, executable version optimized for production. Their second product, currently under development, goes one step further: a serving layer that continuously optimizes AI agents and workflows at runtime. Think of it as an intelligent “virtual machine” for AI, where the execution of agents and workflows is transparently and “automagically” improved while running.

If you’re building AI systems and want to go from prototype to production with confidence, get in touch with the GenseeAI team.

Read Brandon’s blog post here, or read on below for all the details:

At Two Small Fish, we invest in founders building foundational infrastructure for the AI-native world. We believe one of the most important – yet underdeveloped – layers of this stack is orchestration: how generative AI workflows are built, optimized, and deployed at scale.

Today, building a production-grade genAI app involves far more than calling an LLM. Developers must coordinate multiple steps – prompt chains, tool integrations, memory, RAG, agents – across a fragmented and fast-moving ecosystem and a variety of models. Optimizing this complexity for quality, speed, and cost is often a manual, lengthy process that businesses must navigate before a demo can become a product.

GenseeAI is building the missing optimization layer for AI agents and workflows in an intelligent way. Their first product, Cognify, takes AI workflows built with frameworks like LangChain or DSPy and intelligently rewrites them to be faster, cheaper, and better. It’s a bit like “compilation” for AI: given a high-level workflow, Cognify produces a tuned, executable version optimized for production. 

Their second product–currently under development–goes one step further: a serving layer that continuously optimizes AI agents and workflows at runtime. Think of it as an intelligent “virtual machine” for AI: where the execution of agents and workflows is transparently and automatically improved while running.
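To make the “compilation” metaphor a bit more concrete, here is a deliberately toy sketch of the idea in plain Python. None of the names below come from Gensee, Cognify, LangChain, or DSPy; they are hypothetical, and the heuristics are placeholders. The point is only to show what it means to take a declarative workflow and rewrite it into a cheaper, faster execution plan before it runs.

```python
# Hypothetical illustration only — not Cognify's API and not a real framework.
# It sketches "compilation for AI workflows": take a high-level description of a
# workflow and rewrite it into an optimized execution plan before running it.
from dataclasses import dataclass, replace
from typing import Dict, List


@dataclass(frozen=True)
class Step:
    name: str        # e.g. "classify_intent", "draft_answer"
    model: str       # the model the workflow author originally asked for
    cacheable: bool  # safe to memoize on identical inputs?


def compile_workflow(steps: List[Step], latency_budget_ms: int = 2000) -> Dict:
    """Apply toy optimization passes and return an execution plan."""
    plan = []
    for step in steps:
        # Pass 1: route simple, cacheable steps to a smaller, cheaper model.
        if step.cacheable:
            step = replace(step, model="small-model")
        plan.append(step)
    # Pass 2: under a tight latency budget, flag the plan for parallel execution.
    return {"steps": plan, "run_parallel": latency_budget_ms < 3000}


workflow = [
    Step("classify_intent", "large-model", cacheable=True),
    Step("draft_answer", "large-model", cacheable=False),
]
print(compile_workflow(workflow))
```

A real optimizer presumably searches over far richer rewrites (prompt tuning, model selection, step fusion, retries) and measures output quality as it goes; the sketch only conveys the shape of the idea.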

We believe GenseeAI is a critical unlock for AI’s next phase. Much of today’s genAI development is stuck in prototype purgatory – great demos that fall apart in the real world due to cost overruns, latency, and poor reliability. Gensee helps teams move from “it works” to “it works well, and at scale.”

What drew us to Gensee was not just the elegance of the idea, but the clarity and depth of its execution. The company is led by Yiying Zhang, a UC San Diego professor with a strong track record in systems infrastructure research, and Shengqi Zhu, an engineering leader who has built and scaled AI systems at Google. Together, they bring a rare blend of academic rigor and hands-on experience in deploying large-scale infrastructure. In early benchmarks, Cognify delivered up to 10× cost reductions and 2× quality improvements – all automatically. Their roadmap – including fully automated optimization, enterprise integrations, and a registry of reusable “optimization tricks” – shows ambition to become the default runtime for generative AI.

As the AI stack matures, we believe Gensee will become a foundational layer for organizations deploying intelligent systems. It’s the kind of infrastructure that quietly powers the AI apps we’ll all use – and we’re proud to support them on that journey.

If you’re building AI systems and want to go from prototype to production with confidence, get in touch with the team at GenseeAI.

Written by Brandon

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Perhaps My Title Should Be…Yoda?

Yesterday was Star Wars Day — aka “May the Fourth be with you” — and it got me thinking, so I put together this blog post.

You might notice my title is “Operating Partner,” not “General Partner,” “Managing Partner,” or “Board Partner.” That’s intentional because I spend most of my time working directly with portfolio CEOs.

The Operating Partner role has its roots in private equity. Historically, Operating Partners are often former CEOs or COOs who use their experience to guide leadership teams, improve operational execution, and drive results, ultimately increasing the value of portfolio companies.

As far as I know, I’m the only former scale-up CEO in Canada who plays this role in an early-stage VC. At least, ChatGPT and Perplexity couldn’t find anyone else! Even in the U.S., this is very rare.

That said, I’ve always felt the “Operating Partner” title is a bit misleading. Unlike many private equity Operating Partners, I don’t step into full-time or part-time leadership roles within portfolio companies. I don’t give advice or directives either. Instead, I help CEOs solve their own problems rather than solving problems for them.

My single objective is to help portfolio CEOs improve the quality of their decisions by leveraging my experience.

Why? Most CEOs don’t need to be told what to do—they already know. Telling a CEO to grow their KPIs faster or hire great people is useless.

No CEO intentionally grows slower or hires bad people!

The real challenge for CEOs isn’t the what—it’s the how. This is where I come in, helping them navigate the how: strategic thinking, future-proofing, and decision-making that drive tangible progress, while staying alert to blind spots that could undermine success.

Hiring is an example. Many venture firms have talent partners who assist portfolio companies with recruitment. These partners, often from recruitment backgrounds, are excellent at sourcing candidates once roles are defined. However, they usually lack deep business context and may not fully understand the culture of the companies they’re supporting. This can result in untargeted candidates who don’t fit. I experienced this issue firsthand when I was a CEO.

That’s why I strongly favour internal recruiters who have an intimate understanding of the business and culture. Even so, recruiters typically get involved after roles are clearly defined. Before that, to design the organization, we need someone with visibility into the broader picture of the business. Only one person truly has it: the CEO. Besides, CEOs usually can’t ask their leaders about organizational design for obvious reasons.

That’s where I step in—well before recruiters are involved. I act as a sounding board for organizational design, considering not just immediate hiring needs but also how roles and teams will evolve over time. What level of talent should they hire now? When will this position need to level up? What downstream implications will these decisions have?

By addressing these questions early, I help ensure hiring decisions are aligned with the company’s long-term strategy and culture.

Of course, hiring is just one area where I provide support. Design future-proof stock option plans? Manage internal and external communication challenges? Interact with strategic conglomerates? Navigate inbound acquisition offers? Resolve leadership dysfunction? Handle unreasonable investors? Make board meetings more effective? Fend off super aggressive competitors or internet giants?

And yes, one of the most frequent requests I get is: “Can you help me with my pitch deck?”

Bring them on!

I’ve faced these challenges firsthand multiple times, and when CEOs bring them to me, I’m ready to share my battle scars.

At the minimum, I help narrow the options from “I don’t know how” to a set of multiple choices. I don’t make decisions for CEOs; I help them make better ones. They are ultimately responsible for their decisions, and I see my role as a guide, not a decision-maker.

Being the CEO of a fast-scaling company is an enormous challenge that people should not underestimate—the level of experience, capacity, intensity, and mental strength it demands. That’s why it is the loneliest job. Empathy is not enough. The best help I ever got was from a CEO more experienced than I was at the time — someone who had walked the road ahead — and now it’s my turn to pay it forward.

The more I think about it, the less “Operating Partner” seems to fit. I don’t step into the spotlight or take over operations. My role is more like Yoda—helping Skywalker fight the battles while staying behind the scenes.

So perhaps my title shouldn’t be Operating Partner after all. Maybe it should just be… Yoda.

May the Force be with you!

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Network Effect is Dead. Long Live Network Effect.

When Two Small Fish first started in 2015, we formulated our “Thesis 1.0” to focus on network effects exclusively. We leveraged our hands-on product experience in scaling Wattpad from 0 to 100 million users—essentially a marketplace for readers and writers—and applied a similar lens to other verticals, both in B2C and B2B.

It worked incredibly well for TSF because, at the time, network effects were the holy grail for defensibility, yet they were often misunderstood (for example, going viral is not the same as having network effects, and simply operating a marketplace does not guarantee strong network effects!). Our skill is more transferable than you might think!

So, Eva created the ASSET framework, which helped us identify the best network-effect investment opportunities and, more importantly, helped entrepreneurs understand and increase their network effect coefficient—the measure of true network effects—and ultimately embed strong network effects into their products. In short:

• A stands for “atomic unit”

• S stands for “seed the supply side”

• The other S stands for “scale the demand side”

• E stands for “enlarge the network effect” or “enhance the network coefficient”

• T stands for “track proprietary insights”

This framework provided a simple yet systematic way to judge whether a company truly had network effects or merely the illusion of them.

However, toward the end of the last decade, it became increasingly difficult to find investable network-effect opportunities. Well-established incumbents already had very strong network effects in place, effectively setting the world order. It became exceedingly difficult for emerging disruptors—both in consumer and enterprise spaces—to find a gap to break through.

We began looking for other forms of technology defensibility (for example, semiconductors) and gradually moved away from “shallow tech” network-effect investments, as we found very few investable opportunities. In fact, our last shallow tech investment was made about three years ago.

Then, in late 2022, ChatGPT arrived.

As the world now understands, generative AI is the first technology in human history capable of learning, reasoning, creativity, cross-domain functionality, and decision-making. It’s the most significant platform shift since mobile, social, and cloud computing in the late 2010s—and arguably the biggest one in human history. It also means the playing field has been leveled. Today, there are numerous ways to create new products with powerful network effects that can render incumbents’ offerings obsolete (for example, I haven’t used Google Search regularly for a long time) because newcomers can disrupt incumbents from all three angles: technology, product, and commercialization (e.g., business models). Incumbents are vulnerable!

On the other hand, the ASSET framework also needed a refresh, as we’re no longer dealing with simple, well-understood marketplaces. What if one side of the marketplace is now AI? Even though our original framework was designed to handle data-driven network effects, the speed and scale of data generation have multiplied by orders of magnitude. How does this affect enlarging the network effects and increasing the coefficient?

The good news is that there are now ways to massively increase the network effect coefficient in a remarkably short time. The bad news is that all your competitors—large or small—can do the same. Competition has never been fiercer.

After ChatGPT was released, we quickly revised our ASSET framework to version 2.0. Since then, we’ve been guest-lecturing this masterclass worldwide for well over a year. By fully leveraging AI’s creativity and reasoning capabilities, entrepreneurs can now harness human-machine collaboration to supercharge both the demand and supply sides, blitz-scale, and create new atomic units. Here’s the gist of 2.0:

A – Atomic Unit of Product

S – Super Seed the Supply Side (now amplified by Gen AI)

S – Supercharge the Demand Side (now leveraging Gen AI)

E – Exponential Engagement (using the human + AI combo)

T – Transform Business with New AI-powered Atomic Units

Like 1.0, this new framework is easy to understand but difficult to master—and it’s even more complex now because, with Gen AI, it’s non-linear. Our masterclass covers the lecture material, but the real work happens in our private tutoring, where execution matters—and this is how we help our portfolio companies win.

The old network effect is dead. Thanks to the AI platform shift, network effects are roaring back in a different and far more potent way in the new world order. The combination of deep tech defensibility plus network effect defensibility is the new holy grail—and we are specialized in both.

With the AI platform shift, all of a sudden, there are many new investable opportunities that didn’t exist before. At the same time, the ground has shifted: the old playbook is out, and the new playbook is in. It’s exciting; we love the challenge, and we wouldn’t have it any other way.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Investing in Fibra: Revolutionizing Women’s Health with Smart Underwear

At Two Small Fish Ventures, we love backing founders who are not only transforming user behaviour but also unlocking new and impactful value. That’s why we’re excited to announce our investment in Fibra, a pioneering company redefining wearable technology to improve women’s health. We are proud to be the lead investor in this round, and I will be joining as a board observer. 

The Vision Behind Fibra

Fibra is developing smart underwear embedded with proprietary textile-based sensors for seamless, non-invasive monitoring of previously untapped vital biomarkers. Their innovative technology provides continuous, accurate health insights—all within the comfort of everyday clothing. Learning from user data, it then provides personalized insights, helping women track, plan, and optimize their reproductive health with ease. This AI-driven approach enhances the precision and effectiveness of health monitoring, empowering users with actionable information tailored to their unique needs.

Fibra has already collected millions of data points with its product, further strengthening its AI capabilities and improving the accuracy of its health insights. While Fibra’s initial focus is female fertility tracking, its platform has the potential to expand into broader areas of women’s health, including pregnancy detection and monitoring, menopause, detection of STDs and cervical cancer, and more, fundamentally transforming how we monitor and understand our bodies.

Perfect Founder-Market Fit

Fibra was founded by Parnian Majd, an exceptional leader in biomedical innovation. She holds a Master of Engineering in Biomedical Engineering from the University of Toronto and a Bachelor’s degree in Biomedical Engineering from TMU. Her achievements have been widely recognized, including being an EY Women in Tech Award recipient, a Rogers Women Empowerment Award finalist for Innovation, and more.

We are thrilled to support Parnian and the Fibra team as they push the boundaries of AI-driven smart textiles and health monitoring. We are entering a golden age of deep-tech innovation and software-hardware convergence—a space we are excited to champion at Two Small Fish Ventures.

Stay tuned as Fibra advances its mission to empower women through cutting-edge health technology.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Announcing Our Investment in Hepzibah AI

The Two Small Fish team is thrilled to announce our investment in Hepzibah AI, a new venture founded by Untether AI’s co-founders, serial entrepreneurs Martin Snelgrove and Raymond Chik, along with David Lynch and Taneem Ahmed. Their mission is to bring next-generation, energy-efficient AI inference technologies to market, transforming how AI compute is integrated into everything from consumer electronics to industrial systems. We are proud to be the lead investor in this round, and I will be joining as a board observer to support Hepzibah AI as they build the future of AI inference.

The Vision Behind Hepzibah AI

Hepzibah AI is built on the breakthrough energy-efficient AI inference compute architecture pioneered at Untether AI—but takes it even further. In addition to pushing performance per watt even harder, it can handle training workloads such as distillation, and it provides supercomputer-style on-chip networking. Their business model focuses on providing IP and core designs that chipmakers can incorporate into their system-on-chip designs. Rather than manufacturing AI chips themselves, Hepzibah AI will license its advanced AI inference IP for integration into a wide variety of devices and products.

Hepzibah AI’s tagline, “Extreme Full-stack AI: from models to metals,” perfectly encapsulates their vision. They are tackling AI from the highest levels of software optimization down to the most fundamental aspects of hardware architecture, ensuring that AI inference is not only more powerful but also dramatically more efficient.

Why does this matter? AI is rapidly becoming as indispensable as the CPU has been for the past few decades. Today, many modern chips, especially system-on-chip (SoC) devices, include a CPU or MCU core, and increasingly, those same chips will require AI capabilities to keep up with the growing demand for smarter, more efficient processing.

This approach allows Hepzibah AI to focus on programmability and adaptable hardware configurations, ensuring they stay ahead of the rapidly evolving AI landscape. By providing best-in-class AI inference IP, Hepzibah AI is in a prime position to capture this massive opportunity.

An Exceptional Founding Team

Martin Snelgrove and Raymond Chik are luminaries in this space—I’ve known them for decades. David Lynch and Taneem Ahmed also bring deep industry expertise, having spent years building and commercializing cutting-edge silicon and software products.

Their collective experience in this rapidly expanding, soon-to-be ubiquitous industry makes investing in Hepzibah AI a clear choice. We can’t wait to see what they accomplish next.

P.S. You may notice that the logo is a curled skunk. I’d like to highlight that the skunk’s eyes are zeros from the MNIST dataset. 🙂 

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Contrarian Series: Your TAM is Zero? We love it!

Note: One of the most common pieces of feedback we receive from entrepreneurs is that TSF partners don’t think, act, or speak like typical VCs. The Contrarian Series is meant to demystify this, so founders know more about us before pitching.

Just before New Year, I was speaking at the TBDC Venture Day Conference together with BetaKit CEO Siri Agrell and Serial Entrepreneur and former MP Frank Baylis.

When I said “Two Small Fish love Zero TAM businesses,” I said it so matter-of-factly that the crowd was taken aback. I even saw quite a few posts on social media that said, “I can’t believe Allen Lau said it!”

Of course, any business will need to go after a non-zero TAM eventually. But hear me out.

Here’s what I did at Wattpad: I never had a “total addressable market” slide in the early days. I just said, “There are five billion people who can read and write, and I want to capture them all!”

Even when we became a scaleup, I kept the same line. I just said, “There are billions of people who can read, write, or watch our movies, and I want to capture them all!”

Naturally, some VCs tried to box me into the “publishing tool” category or other buckets they deemed appropriate. But Wattpad didn’t really fit into anything that existed at the time. Trust me, I tried to find a box I would fit in too, but none felt natural.

Why? That’s because Wattpad was a category creator. And, of course, that meant our TAM was effectively zero.

In other words, we made our own TAM.

Many of our portfolio companies are also category creators, so their decks often don’t have a TAM slide either.

Yes, any venture-backed company eventually needs a large TAM. And, of course, I don’t mean to suggest that every startup needs to be a category creator.

That said, we’re perfectly fine with—in fact, we sometimes even prefer—seeing a pitch deck without a TAM slide. By definition, category creators have first-mover advantages. More importantly, category creators in a large, winner-take-all market—especially those with strong moats—tend to be extremely valuable at scale and, hence, highly investable.

So, founders, if your company is poised to create a large category, skip the TAM slide when pitching to Two Small Fish. We love it!

P.S. Don’t forget, if you have an “exit strategy” slide in your pitch deck, please remove it before pitching to us. TYSM!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Celebrating the Unintended but Obvious Impact of Wattpad on International Women’s Day

It’s been almost three years since I stepped aside from my role as CEO of Wattpad, yet I’m still amazed by the reactions I get when I bump into people who have been part of the Wattpad story. The impact continues to surface frequently, in unexpected and inspiring ways.

Wattpad has always been a platform built on storytelling for all ages and genders. That being said, our core demographic—roughly 50% of our users—has been teenage girls. Young women have always played a pivotal role in the Wattpad community.

Next year, Wattpad will turn 20 (!)—a milestone that feels both surreal and deeply rewarding. When we started in 2006, we couldn’t have imagined the journey ahead. But one thing is certain: our early users have grown up, and many of them are now in their 20s and 30s, making their mark on the world in remarkable ways.

A perfect example: at our recent masterclass at the University of Toronto, I ran into Nour. A decade ago, she was pulling all-nighters reading on Wattpad. Today, she’s an Engineering Science student at the University of Toronto, specializing in machine intelligence. Her story is not unique. Over the years, I’ve met countless female Wattpad users who are now scientists, engineers, and entrepreneurs, building startups and pushing boundaries in STEM fields.

This is incredibly fulfilling. Many of them have told me that they looked up to Wattpad and our journey as a source of inspiration. The idea that something we built has played even a small role in shaping their ambitions is humbling.

Now, as an investor at Two Small Fish, I’m excited about the prospect of supporting these entrepreneurs in the next stage of their journey. Some of these Wattpad users will go on to build the next great startups, and it would be incredible to be part of their success, just as they were part of Wattpad’s.

On this International Women’s Day, I want to celebrate this unintended but, in hindsight, obvious outcome: a generation of young women who grew up on Wattpad are now stepping into leadership roles in tech and beyond. They are the next wave of innovators, creators, and entrepreneurs, and I can’t wait to see what they build next.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

After All, What’s Deep Tech?

“Deep Tech” is one of those terms that gets thrown around a lot in venture capital and startup circles, but defining it precisely is harder than it seems. If you check Wikipedia, you’ll find this:

Deep technology (deep tech) or hard tech is a classification of organization, or more typically a startup company, with the expressed objective of providing technology solutions based on substantial scientific or engineering challenges. They present challenges requiring lengthy research and development and large capital investment before successful commercialization. Their primary risk is technical risk, while market risk is often significantly lower due to the clear potential value of the solution to society. The underlying scientific or engineering problems being solved by deep tech and hard tech companies generate valuable intellectual property and are hard to reproduce.

At a high level, this definition makes sense. Deep tech companies tackle hard scientific and engineering problems, create intellectual property, and take time to commercialize. But what does “substantial scientific or engineering challenges” actually mean? Specifically, what counts as substantial? “Substantial” is a vague word. A difficult or time-consuming engineering problem isn’t necessarily a deep tech problem. There are plenty of startups that build complex technology but aren’t what I’d call deep tech. Deep tech is about tackling problems where existing knowledge and tools aren’t enough.

In 1964, Supreme Court Justice Potter Stewart famously said, “I know it when I see it” when asked to describe his test for obscenity in Jacobellis v. Ohio. By no means am I comparing deep tech to obscenity—I don’t even want to put these two things in the same sentence. However, there is a parallel between the two: they are both hard to put into a strict formula, but experienced technologists like us recognize deep tech when we see it.

So, at Two Small Fish, we have developed our own simple rule of thumb:

If we see a product and say, “How did they do that?” and upon hearing from the founders how it is supposed to work, we still say, “Team TSF can’t build this ourselves in 6–12 months,” then it’s deep tech.

At TSF, we invest in the next frontier of computing and its applications. We’re not just looking for smart founders. We’re looking for founders who see things others don’t—who work at the edge of what’s possible. And when we find them, we know it when we see it.

This test has been surprisingly effective. Every single investment we’ve made in the past few years has passed it. And I expect it will continue to serve us well.

P.S. If you enjoyed this blog post, please take a minute to like, comment, subscribe and share. Thank you for reading!

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Welcoming Albert Chen as a Venture Partner at Two Small Fish Ventures

Today’s blog post is written by Eva and is a reblog of what was originally shared on the Two Small Fish Ventures website.

We are thrilled to announce that Albert Chen is joining Two Small Fish as a Venture Partner!

Albert brings a wealth of experience to our team. Like that of all our partners at TSF, Albert’s expertise spans the full spectrum—from technical innovation to product development to operational leadership in entrepreneurial startups. His impressive academic background further underscores his exceptional capabilities.

Albert earned his Ph.D. in BioMEMS, Acoustics, and Medical Engineering from the University of Waterloo. He also completed his undergraduate studies in Systems Design Engineering at Waterloo and participated in an international exchange program in Electrical Engineering at National Taiwan University.

Albert’s professional career is equally remarkable. Most notably, he served as the CTO of robotics and edge AI company Forcen. His diverse experience also includes roles at Metergy (smart energy), Excelitas (photonics), and North (smart glasses, acquired by Google).

This is just a glimpse of Albert’s impressive journey. Follow him on LinkedIn to learn more about his background and accomplishments.

At Two Small Fish Ventures, we are committed to supporting bold founders shaping the future of technology through our experience. Albert’s extensive academic background, combined with his hands-on leadership and innovation experience, makes him an invaluable addition to our team.

Please join us in welcoming Albert to Two Small Fish!

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Two Small Fish Honoured to Be on the CVCA Top 50 List

Who are the top 50 VCs in Canada? Two Small Fish Ventures is one of them! At Two Small Fish Ventures, we are deeply honoured to be named among Canada’s top 50 venture capital firms in this year’s edition of The 50 — the annual guide produced by the Canadian Venture Capital & Private Equity Association (CVCA) and the Trade Commissioner Service (TCS).

This recognition is not just a badge for us; it’s a reflection of the thriving and globally respected Canadian venture ecosystem we are proud to be part of. We share this honour with an incredible group of firms that are shaping the future of technology, science, and innovation across the country and beyond.

If you are an entrepreneur, this list represents the Canadian VCs you should talk to — firms committed to partnering with visionary founders, pushing boundaries, and building category-defining companies.

We look forward to continuing to back the next generation of transformational founders and are grateful to the CVCA and TCS for this spotlight.

The Full List: Canada’s Top 50 VCs

Here’s the full list of the firms recognized this year (in alphabetical order):

1. Active Impact Investments

2. Amplify Capital

3. Amplitude Ventures

4. AQC Capital

5. BrandProject

6. Brilliant Phoenix

7. Conexus Venture Capital

8. CTI Life Sciences Fund

9. Diagram Ventures

10. Finchley Healthcare Ventures

11. First Ascent Ventures

12. Framework Venture Partners

13. Genesys Capital

14. Good News Ventures

15. Graphite Ventures

16. Greensoil PropTech Ventures

17. GreenSky Ventures

18. iGan Partners

19. Inovia Capital

20. INP Capital

21. InvestEco

22. Luge Capital

23. Lumira Ventures

24. MKB

25. McRock Capital

26. NGIF

27. Panache Ventures

28. Pelorus VC

29. Portage

30. Radical Ventures

31. Raven Indigenous Capital Partners

32. Real Ventures

33. Relay Ventures

34. Renewal Funds

35. Saltagen

36. Sandpiper Ventures

37. Sectoral Asset Management

38. Staircase Ventures

39. SVG Ventures | THRIVE

40. The51 Ventures

41. Two Small Fish Ventures

42. Vanedge Capital

43. Version One Ventures

44. Vistara Growth

45. White Star Capital

46. Whitecap Venture Partners

47. Yaletown Partners

48. Evok Innovations

49. Cycle Capital

50. Boreal Ventures

Portfolio Highlight: ABR

The next frontier of AI lies at the edge — where data is generated. By moving AI toward the edge, we unlock real-time, efficient, and privacy-focused processing, opening the door to a wave of new opportunities. One of our most recent investments, Applied Brain Research (ABR), is leading this revolution by bringing “cloud-level” AI capabilities to edge devices.

Why is this important? Billions of power-constrained devices require substantial AI processing. Many of these devices operate offline (e.g., drones, medical devices, and industrial equipment), have access only to unreliable, slow, or high-latency networks (e.g., wearables and smart glasses), or must process data streams in real time (e.g., autonomous vehicles). Due to insufficient on-device capability, the only solution today is to send data to the cloud — a suboptimal or outright infeasible approach.

How does ABR solve this? ABR’s groundbreaking technology addresses these challenges by delivering “cloud-sized” high-performance AI on compact, ultra-low-power devices. This shift is transforming industries such as consumer electronics, healthcare, automotive, and a range of industrial applications, where latency, reliability, energy efficiency, and localized intelligence are essential.

What is ABR’s secret sauce? ABR’s unique approach is rooted in computational neuroscience. Co-founded by Dr. Chris Eliasmith, CTO and Head of the University of Waterloo’s Computational Neuroscience Research Group, ABR leverages the Legendre Memory Unit (LMU), a brain-inspired architecture invented by Dr. Eliasmith and his team of researchers. LMUs are provably optimal for compressing time-series data—like voice, video, sensor data, and bio-signals—enabling significant reductions in memory usage. Running the LMU on ABR’s unique processor architecture has created a breakthrough that “kills three birds with one stone” by:

1. Increasing performance,

2. Reducing power consumption by up to 200x, and

3. Cutting costs by 10x.

This is further turbocharged by ABR’s AI toolchain, which enables customers to deploy solutions in weeks instead of months. Time is money, and ABR’s technology allows for advanced on-device functions—like natural language processing—without relying on the cloud. This unlocks entirely new use cases and possibilities.
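For readers curious about what an LMU actually computes, here is a minimal, simplified sketch of its linear memory update, based on the published LMU formulation rather than ABR’s proprietary implementation. The memory size and window length below are arbitrary example values; the takeaway is that a handful of numbers can summarize a long rolling window of a signal, which is what makes the approach so memory- and power-efficient.

```python
# Simplified sketch of the Legendre Memory Unit's linear memory update,
# following the published LMU formulation (Voelker, Kajic & Eliasmith, 2019).
# Illustration only, not ABR's implementation; d and theta are example values.
import numpy as np


def lmu_matrices(d):
    """Continuous-time (A, B) matrices of the LMU's order-d delay system."""
    A = np.zeros((d, d))
    B = np.zeros((d, 1))
    for i in range(d):
        B[i, 0] = (2 * i + 1) * (-1.0) ** i
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    return A, B


def lmu_memory(u, d=8, theta=100.0, dt=1.0):
    """Compress a 1-D signal u into a d-dimensional rolling memory of ~theta steps."""
    A, B = lmu_matrices(d)
    m = np.zeros((d, 1))
    states = []
    for u_t in u:
        # Euler-discretized update: d numbers summarize roughly the last theta samples.
        m = m + (dt / theta) * (A @ m + B * u_t)
        states.append(m.ravel().copy())
    return np.array(states)


signal = np.sin(np.linspace(0, 8 * np.pi, 500))  # toy time series
memory = lmu_memory(signal)                      # shape (500, 8): 8 numbers per step
print(memory.shape)
```

In this toy example, each time step of a 500-sample signal is summarized by just 8 state values, which is the kind of compression that lets time-series workloads fit on compact, low-power hardware.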

At the helm of ABR is Kevin Conley, the CEO and a former CTO of SanDisk, alongside Dr. Chris Eliasmith. Together, they bring exceptionally strong leadership across both hardware and software domains—a rare but powerful combination that gives ABR a significant competitive advantage.

ABR’s vision aligns perfectly with our investment thesis and our belief that edge computing and software-hardware convergence represent the next frontier of opportunity in computing. We’re excited to see ABR power billions of devices in the years to come.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Celebrating Professor Geoffrey Hinton’s Nobel Prize (and His Birthday)

In the past few days, Eva and I had the privilege of joining the University of Toronto delegation in Stockholm to celebrate University Professor Emeritus Geoffrey Hinton, the 2024 Nobel Laureate in Physics. The events, organized by the University, were a fitting tribute to Professor Hinton’s groundbreaking contributions to AI, a technology that will transform our world in the decades to come.

The celebration was a blend of thoughtful discussions, historic venues, and memorable moments. It all began with a birthday party for Professor Hinton, followed by a fireside chat, an inspiring dinner at the iconic Vasa Museum, and a panel exploring Canada’s leadership in AI at the Embassy of Canada to Sweden. Each event underscored not only Professor Hinton’s remarkable achievements but also the global impact of Canadian innovation in AI and technology more broadly.

Rather than recount every detail, I’ll let the pictures and their captions tell the story of this extraordinary week. It was an incredible opportunity for us to honour a visionary scientist.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Contrarian Series: Best Exit Strategy? Not Having One

Note: One of the most common pieces of feedback we receive from entrepreneurs is that TSF partners don’t think, act, or speak like typical VCs. The Contrarian Series is meant to demystify this, so founders know more about us before pitching.

For Wattpad, it was exactly ten years between raising our first round of venture capital in 2011 and the company’s acquisition in 2021. Over that decade, we discussed countless topics in our board meetings.

But one topic we never discussed? Exit strategies.

I distinctly remember, a couple of years before the acquisition, I raised the question to a board member. “We’ve been venture-backed for almost ten years now. Should we start talking about exit…”

I couldn’t even finish the sentence. That board member cut me off:

“Allen, I just want you to build a great company.”

That moment stuck with me. Only after the acquisition did I fully appreciate the significance of those ten years as a venture-backed company without focusing on an exit.

Wattpad’s four largest investors—USV, Khosla Ventures, OMERS, and Tencent—enabled us to focus on building the business, not selling it. OMERS, as a pension fund, and Tencent, as a strategic investor, don’t operate under the typical 10-year fund cycle that drives many venture firms to push for exits. USV, with its consistent track record of generating world-class returns, had the trust of its LPs to prioritize long-term value over short-term outcomes. And Khosla Ventures? Well, no one can tell Vinod Khosla what to do, and he loves making big, long-term bets.

Their perspectives freed us to focus on building a great company rather than prematurely worrying about how to sell it.

In early 2020, a year before Wattpad was acquired for US$660M, we set an ambitious company objective: to become “Investment Ready.” This meant ensuring we could scale profitably and confidently project $100M+ in revenue with a minimum of 40% year-over-year growth. By the end of 2020, we wanted to be in a position to choose between preparing for an IPO (we even reserved our ticker symbol WTPD), raising growth capital to accelerate expansion, or scaling organically without any additional funding.

When an inbound acquisition offer came in mid-2020, this optionality proved invaluable. It allowed us to run a proper process with multiple interested parties. We were clear with potential acquirers: our preference was to remain independent. If the offer wasn’t higher than the value we could command through an IPO, we weren’t interested, and we would walk away. Because we had the fundamentals to back it up, no one doubted us.

This underscores an important point: the best way to generate a great outcome is to build an amazing business. Focus on creating value, and optionality will follow.

Any CEO who claims to have an exit strategy—especially in the early stages—is either naïve, delusional, or lying.

Here’s the reality: M&A is far less common than people think. The pool of serious potential acquirers often narrows to just a handful in the best-case scenarios. And even then, the stars have to align—you need the right timing, the right strategic fit, and the right price. It’s easier said than done.

Of course, that doesn’t mean I ignored the idea of acquisition entirely (and founders should consider M&A, but only under the right circumstances, and I will save it for another blog post). For instance, we built relationships with potential strategic acquirers and stayed aware of the landscape. But the time I spent on this was minimal. Even my leadership team occasionally asked why I never talked about M&A. The answer was simple: it wasn’t a priority.

Too many founders overthink their “exit strategy,” and it often backfires. Changing their product to appeal to a potential acquirer? Building one-sided partnerships in the hope they’ll buy the company? Hope is not a strategy.

The same goes for VCs. Some overthink their portfolio companies’ “exit strategy” because they worry about selling before the 10-year fund window closes. While this concern is valid, it doesn’t mean they should push their best portfolio companies to sell. There are many ways for VCs to liquidate their positions without forcing a sale. Ironically, the best way for a founder to help their investors exit is to focus on increasing enterprise value. Shares in a great company are always in demand.

For an early-stage startup, having an exit strategy is as absurd as asking an infant to decide which jobs they’ll apply to after university. The founders’ job is to nurture that infant—raise them into a great human being. The results will follow.

Build a great business, and everything else will fall into place. There’s an old saying: Great companies get bought, not sold. It couldn’t be more true.

P.S. Founders, if you have an exit strategy slide in your pitch deck, please remove it before pitching to us. TYSM!

P.P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The Three Phases of Building a Great Tech Company: Technology, Product, and Commercialization

There are three distinct phases in the journey of building a great tech company: technology, product, and commercialization. These phases are sequential yet interconnected and sometimes overlap. Needless to say, mastering each is critical to the company’s eventual success. However, it’s important to recognize their differences.

• Building technology is about founders creating what they love. It’s driven by passion and expertise and often leads to groundbreaking innovations.

• Building a product is about creating something others love to use. This is where usability and solving real problems come into focus.

• Commercialization is about building something people will pay for and driving revenue. This phase transforms users into paying customers or finds someone else to pay for it, such as advertisers.

These phases are related but distinct. Great technology doesn’t guarantee anyone will use it, and a widely-used product doesn’t always lead to revenue. I’ve seen many technologists create incredible technologies no one adopts, as well as popular products that fail to commercialize effectively (though it’s rare for a product with tens of millions of users to fail entirely).

For deep tech companies, these phases often have minimal overlap and unfold sequentially. The technology might take years to develop before a usable product emerges, and commercialization may come even later.

In contrast, shallow tech B2B SaaS products often see complete overlap between the phases. For example, a subscription model is typically apparent from the outset, and the tech, product, and commercialization phases blend seamlessly.

Wattpad is also a good example of how these phases can play out differently. Initially, we built our technology and product hand in hand, creating a platform loved by millions of users. However, its commercialization—whether through ads, subscriptions, or movies, the three revenue models we had—was deliberately delayed. Many people assumed we didn’t know how to make money, not understanding this counterintuitive approach (and, of course, we purposely kept some of our strategies under wraps). This approach allowed us to use “free” as a potent weapon to dominate—and eliminate—our competitors in a winner-takes-all strategy. Operating for years with minimal revenue was clearly the right decision for the market dynamics and our long-term goals. More on this in a separate blog post.

Given this variability, the question “What is your revenue?” must be asked thoughtfully and in context. For some companies, the absence of revenue may be an intentional and brilliant strategy. For others, insufficient revenue could signal serious trouble. It all depends on the company’s stage, strategy, and goals. Understanding the sequence, timing, and specific needs of a business model is crucial for both investors and entrepreneurs. Zero revenue could be a blessing in the right context. On the other hand, pushing for revenue growth—let alone the wrong type of revenue growth—can be fatal, a scenario we’ve seen many times.

At Two Small Fish Ventures, we are very thoughtful and experienced investors. We understand that starting to generate revenue—or choosing not to generate revenue—at the right time is one of the secrets to success that very few people have mastered. We practise what we preach. Over the past two years, all but one of TSF’s investments have been pre-revenue.

No revenue? No problem. In fact, that’s great. Bring them on!

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Our Secret to Finding 100x Opportunities

In previous blog posts (here and here), I’ve delved into the mathematical model for constructing an early-stage VC portfolio designed to achieve outsized returns. In short, investing early to build a concentrated portfolio of fewer than 20 moonshot companies, each with the potential for 100x returns or more, is the way to go.

The math is straightforward—it doesn’t lie. Not adhering to this model can significantly reduce the likelihood of achieving exceptional returns.

However, simply following this model is not enough to guarantee outsized results. Don’t mistake correlation for causation! The real challenge lies in identifying, evaluating, and supporting these “100x” opportunities to help turn their vision into reality.

At TSF, we use a simple framework to evaluate whether a potential investment can meet the 100x criteria:

10x (early stage) x 10x (transformative behaviour) = 100x conviction

The first “10x” is straightforward: We invest when companies are in their earliest stages. For instance, over the past two years, all but one of TSF’s investments have been pre-revenue. This made financial analysis simple—those spreadsheets were filled with zeros!

Many of these companies are also pre-traction. While having traction isn’t a bad thing, savvy investors shouldn’t rely on it for validation. The reason is simple: traction is visible to everyone. By the time it becomes apparent, the company is often already too expensive and out of reach.

At TSF, we have a unique advantage. Before transitioning to investing, all TSF partners were engineers, product experts, successful entrepreneurs, and operators—including a “recovering CEO”—that’s me! Each partner brings distinct domain expertise, collectively creating a broad and deep perspective. This allows us to invest only when we possess the domain knowledge needed to fully evaluate an opportunity. We “open the hood” to determine whether the technology is genuinely unique, defensible, and disruptive, or whether it is easily replicable. If it’s the latter, we pass quickly. A strong, defensible tech moat is a key criterion for us. This approach means we might pass on some promising “shallow-tech” opportunities, but we’re very comfortable with that. After all, we believe the best days of shallow tech are behind us.

Maintaining a concentrated portfolio allows us to commit only to investments where we have unwavering conviction. In contrast, a large portfolio would require us to find a large number of 100x opportunities and pursue those we might not fully believe in. Frankly, I wouldn’t sleep well if we took that route. This route would also make it difficult to provide the meaningful, tailored support we’ve promised our entrepreneurs (more on that in a future post). 

When evaluating product potential, we look beyond the present. At TSF, we assess how a technology might reshape the landscape over the next decade or more. We start by understanding the intrinsic needs of the user and envision how a product could fundamentally change customer or end-user behaviour. This is crucial: if a product that addresses a massive opportunity has a strong tech moat, first-mover advantages, and the ability to change behaviour while facing few viable alternatives, it can unlock significant new value and create a defensible, category-defining business.

This often translates into substantial commercialization potential. If we can foresee how the product might evolve into adjacent markets (its second, third, or even fourth act) with almost uncapped possibilities, we achieve the “holy trinity” of tech-product-commercialization potential—forming the second 10x of our conviction.

Here’s how we describe it:

Two Small Fish Ventures invests in early-stage products, platforms, and protocols that transform user behaviour and empower businesses and individuals to unlock new, impactful value.

This thesis underpins our investment decisions and ensures that each choice we make aligns with our long-term vision for transformative innovation.

While this framework may sound simple, executing it well is extremely difficult. It requires what I call a “crystal ball” skill set that spans the full spectrum of entrepreneurial, technical, product, and operational backgrounds.

Over the past decade, we’ve built a portfolio of more than 50 companies across three funds. By employing this approach, the entrepreneurs we’ve supported have achieved numerous breakout successes. This post outlines our “secret sauce,” and we will continue to leverage it.

As you can see, early-stage VC is more art than science. To do it well requires thoughtfulness, insight, and the ability to envision the future as a superpower. It’s challenging but incredibly rewarding. I wouldn’t trade it for anything.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Fabless + ventureLAB is Cloud Computing for Semiconductors

This is a follow-up blog post to my last piece about Blumind.

More than two decades ago, before I started my first company, I was involved with an internet startup. Back then, the internet was still in its infancy, and most companies had to host their own servers. The upfront costs were daunting—our startup’s first major purchase was hundreds of thousands of dollars in Sun Microsystems boxes that sat in our office. This significant investment was essential for operations but created a massive barrier to entry for startups.

Fast forward to 2006 when we started Wattpad. We initially used a shared hosting service that cost just $5 per month. This shift was game-changing, enabling us to bootstrap for several years before raising any capital. We also didn’t have to worry about maintaining the machines. It dramatically lowered the barrier to entry, democratizing access to the resources needed to build a tech startup because the upfront cost of starting a software company was virtually zero.

Eventually, as we scaled, we moved to AWS, which was more scalable and reliable. Apparently, we were AWS’s first customer in Canada at the time! It became more expensive as our traffic grew, but we still didn’t have to worry about maintaining our own server farm. This significantly simplified our operations.

A similar evolution has been happening in the semiconductor industry for more than two decades, thanks to the fabless model. Fabless chip manufacturing allows companies—large or small—to design their semiconductors while outsourcing fabrication to specialized foundries. Startups like Blumind leverage this model, focusing solely on designing groundbreaking technology and scaling production when necessary.

But fabrication is not the only capital-intensive aspect. There is also the need for other equipment once the chips are manufactured.

During my recent visit to ventureLAB, where Blumind is based, I saw firsthand how these startups utilize shared resources for this additional equipment. Not only is Blumind fabless, but they can also access a range of shared hardware and test equipment at ventureLAB without the heavy capital expenditure of owning it.

Image: Let’s see how the chip performs at -40C!

Image: Jackpine (first tapeout), Wolf (second tapeout), and BM110 (third tapeout).

The common perception that semiconductor startups are inherently capital-intensive couldn’t be more wrong. The fabless model—in conjunction with organizations like ventureLAB—functions much like cloud computing does for software startups, enabling semiconductor companies to build and grow with minimal upfront investment. For the most part, all they need initially are engineers’ computers to create their designs until they reach a scale that requires owning their own equipment.

Fabless chip design combined with shared resources at facilities like ventureLAB is democratizing the semiconductor space, lowering the barriers to innovation, and empowering startups to make significant advancements without the financial burden of owning fabrication facilities. Labour costs aside, the upfront cost of starting a semiconductor company like Blumind could be virtually zero too.

That’s why the saying, “software once ate the world alone; now, software and hardware consume the universe together,” is becoming true at an accelerated pace. We have already made several investments based on this theme, and we are super excited about the opportunities ahead.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: Blumind

When it comes to watches, my go-to is a Fitbit. It may not be the most common choice, but I value practicality, and not having to recharge daily is a must for me. My Fitbit lasts about 4 to 5 days—decent, but still not perfect.

Now, imagine if we could extend that battery life to a month or even a year. The freedom and convenience would be incredible. Considering the immense computing demands of modern smartwatches, this might sound far-fetched. But that’s where our portfolio company, Blumind, comes into play.

Blumind’s ultra-low power, always-on, real-time, offline AI chip holds the potential to redefine how we think about battery life and device efficiency. This advancement enables edge computing with extended battery life, potentially lasting years – not a typo – instead of days. Products powered by Blumind can transform user behaviours and empower businesses and individuals to unlock new and impactful value (see our thesis).

Blumind’s secret lies in its brain-inspired, all-analog chip design. The human brain is renowned for its energy-efficient computing abilities. Unlike most modern chips that rely on digital systems and require continuous digital-to-analog and analog-to-digital conversions (which drain power), Blumind’s approach emulates the brain’s seamless analog processing. This unique architecture makes it perfect for power-sensitive AI applications, resulting in chips that could be up to 1000 times more energy-efficient than conventional chips, making them ideal for edge computing.

Blumind’s breakthrough technology has practical and wide-ranging applications. Here are just a few use cases:

Always-on Keyword Detection: Integrates into various devices for continuous voice activation without excessive power usage.

Rapid Image Recognition: Supports always-on visual wake word detection for applications such as access control, enhancing human-device interaction with real-time responses.

Time-Series Data Processing: Processes data streams with exceptional speed for real-time analysis in areas like predictive maintenance, health monitoring, and weather forecasting.

These capabilities unlock new possibilities across multiple industries, including wearables, smart home technology, security, agriculture, medical, smart mobility, and even military and aerospace.

A few weeks ago, I visited Blumind’s team at their ventureLAB office and got an up-close look at their BM110 chip, now in its third tapeout. Blumind exemplifies the future of semiconductor startups through its fabless model, which significantly lowers the initial infrastructure costs associated with traditional semiconductor companies. With resources like ventureLAB supporting them, Blumind has managed to innovate with remarkable efficiency and sustainability. (I’ll share more about the fabless model in an upcoming post.)

I’m thrilled to see where Blumind’s journey leads and how its groundbreaking technology will transform daily life and reshape multiple industries. When devices can go years without needing a recharge instead of mere hours, that’s nothing short of game-changing.

Image: Close-up view of BM110. It is a piece of art!

Image: Qualification in action. Note that BM110 (lower-left corner) is tiny and space-efficient.

Image: The Blumind team is working hard at their ventureLAB office. More on this in a separate blog post here.


P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Two Small Fish Ventures Celebrates the Merger of Printful and Printify

We’re thrilled to share that Printify, a company we have proudly backed since its first funding round, has entered into a merger with Printful (see report by TechCrunch). As long-time supporters of the Printify team, we at Two Small Fish Ventures are incredibly happy with this outcome, which marks a significant milestone in the production-on-demand industry and an exciting moment for everyone involved.

Printify and Printful are both leading platforms that empower entrepreneurs and businesses to create and sell custom products worldwide without the need to hold inventory, thanks to their advanced production-on-demand fulfillment networks. Printify has been growing rapidly, now boasting a team of over 700 employees. Combined with Printful’s team, the newly merged company will have well over 2,000 employees, making it by far the number one player in the production-on-demand market.

Printful, with over $130 million raised and a valuation exceeding $1 billion, and Printify, backed by $54.1 million in funding, have established themselves as the top two global leaders in this field. This merger solidifies their position as the dominant force in the industry, setting new standards and driving innovation in production-on-demand services worldwide. We’re proud to have supported Printify from the very beginning and look forward to witnessing the next chapter in their remarkable journey.

P.S. In the true spirit of unity, founders Lauris Liberts and James Berdigans have sealed the deal by swapping T-shirts with each other’s logos—because nothing says “teamwork” like wearing the competition’s brand!

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Bridge Technologies are Rarely Great Investments

More than two decades ago, I co-founded my first company, Tira Wireless. The business went through several iterations, and eventually, we landed on building a mobile content delivery product. We raised roughly $30M in funding, which was a significant amount at the time. We even ranked as Canada’s Third Fastest Growing Technology Company in the Deloitte Technology Fast 50.

We had a good run, but eventually, Tira had to shut its doors.

We made numerous strategic mistakes, and I learned a lot—lessons that, quite frankly, helped me make far better decisions when I later started Wattpad.

One of the most important mistakes we made was falling into the “bridge technology” trap.

What is the “bridge technology” trap?

Reflecting on significant “platform shifts” over recent decades reveals a pattern: each shift unleashes waves of innovation. Consider the PC revolution in the late 20th century, the widespread adoption of the internet and cloud computing in the 2000s, and the mobile era in the 2010s. These shifts didn’t just create new opportunities; they also created significant pain points as the world tried to leap from one technology to another. Many companies emerged to solve problems arising from these changes.

Tira started when the world began its transition from web to mobile. Initially, there were countless mobile platforms and operating systems. These idiosyncrasies created a huge pain point, and Tira capitalized on that. But in a few short years, mobile consolidated into just two major players—iOS and Android. The pain point rapidly disappeared, and so did Tira’s business.

Similarly, most of these “bridge technology” companies perform very well during the transition because they solve a critical, short-term pain point. However, as the world completes the transition, their business disappears. For instance, numerous companies focused on converting websites into iPhone apps when the App Store launched. Where are they now?

Some companies try to leverage what they’ve built and pivot into something new. But building something new is challenging enough, and maintaining a soon-to-be-declining bridge business while transitioning into a new one is even harder. This is akin to the innovator’s dilemma: successful companies often struggle with disruptive innovation, torn between innovating (and risking profitable products) or maintaining the status quo (and risking obsolescence).

As an investor, it makes no sense to invest in a “bridge” company that is fully expected to pivot within a few years. A pivot should be a Plan B, not Plan A. It’s extremely rare for bridge technology companies to become great, venture-scale investments. In fact, I can’t think of any off the top of my head.

We are currently in the midst of a tectonic AI platform shift. We’re seeing a huge volume of pitches, which is incredibly exciting. Many of these startups built great technologies and products. However, a significant number of these pitches also represent bridge technologies. As the current AI platform shift matures, these bridge technologies will lose relevance. Sometimes, it’s obvious they’re bridge technologies; other times, it requires significant thought to identify them. This challenge is intellectually stimulating, and I enjoy every moment of it. Each analysis informs us of what the future looks like, and just as importantly, what it will not look like. With each passing day, we gain stronger conviction about where the world is heading. It’s further strengthening our “seeing the future is our superpower” muscle, and that’s the most exciting part.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Portfolio Highlight: #paid

#paid was one of the first investments we made at Two Small Fish Ventures. It’s been over a decade since we backed Bryan and Adam, who were still working out of Toronto Metropolitan University’s DMZ at the time. They had a vision to build a platform that connected creators and brands before “creator” was even a term! Back then, influencer and creator marketing campaigns were just tiny experiments.

A decade later, the creator economy has taken off. It’s now a $24 billion market—an order of magnitude larger than just a few years ago, with no signs of slowing down. The next wave of growth is still ahead as ad spending continues to shift away from traditional media. With the global ad market approaching $800 billion, one thing remains true: ad dollars follow the eyeballs—always. And where are those eyeballs today? On creators and influencers.

Today, #paid has become the world’s dominant platform, with over 100,000 creators onboard. It addresses a significant challenge: most creators don’t know how to connect with brands, especially iconic brands like Disney, Sephora, or IKEA. On the other hand, brands struggle to find the right creators amidst a sea of talent. #paid bridges this gap, acting as the marketplace that makes collaboration easy. They use data-driven insights to determine what makes a successful match, ensuring that both creators and brands can find each other effortlessly.

At #paid, brands and creators work with a dedicated team of experts to build creative strategies backed by research, first-party data, and industry benchmarks. This means campaigns run smoothly, allowing creators to focus on doing what they love—creating—without getting bogged down by administrative tasks.

I’m not just speaking as an investor—I’ve actually run a campaign with #paid as an influencer myself, and I can personally vouch for how seamless the experience was.

If you think #paid is all about TikTok, Snap, or Instagram, think again. Brands leverage #paid content across every platform. Want proof? Just check out the Infiniti TV commercial, which came from a #paid campaign.

How about billboards in major cities like NYC, Toronto, and more? #paid has that covered too.

#paid also brings creators and marketers together in real life. I had the privilege of speaking at their Creator Marketing Summit in NYC a few weeks ago, and I was amazed at how far #paid has come. The summit brought together hundreds of creators and top brand marketers—an impressive showcase of the platform’s evolution.

Looking back on this journey, here are my key takeaways:

• Great companies take a decade to build.

• To create a category leader, especially in winner-take-all markets, the idea has to be bold and often misunderstood at first. Bryan and Adam saw something that few others did, and their first-mover advantage has solidified #paid’s leading position today.

• There’s no such thing as “done.” #paid constantly reinvents itself. Generative AI is another exciting opportunity for step-function growth, and I can’t wait to see what’s next.

Bryan and Adam should be incredibly proud of what they’ve accomplished.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Venture Capital is Call Options on Startups

Early-stage venture capital (VC) has always been the oddball in asset management. Unlike other asset classes, it offers the highest potential returns, but it also comes with the highest variance—especially when portfolio construction isn’t done right. On top of that, it has an inherent “default rate” of about 80%.

Tell a traditional fund manager about this 80% default rate, and you’ll likely get a strange look.

A few months ago, I was trying to explain how VC works to a fund manager. After covering the usual points—how VC is essentially a home run derby with many misses—he paused and said, “I get it. VC is like buying call options on startups.”

I hadn’t considered it that way before, but he was absolutely right.

For those unfamiliar, buying a call option gives you the right, but not the obligation, to purchase a stock at a predetermined price (the strike price) before a specified expiration date. Investors use this strategy to profit from an anticipated—but not guaranteed—increase in the stock’s price. If the stock price rises above the strike price (plus the premium paid), the option becomes profitable. The potential profit is theoretically unlimited, while the maximum loss is limited to the premium paid.

Similarly, investing in a startup gives you the chance to acquire equity at an attractive price, with a ~20% chance the startup will take off—though this usually takes about a decade to materialize. VCs use this strategy to profit from a potential—but not guaranteed—rise in the company’s value. If the startup succeeds and its valuation soars beyond the investment (plus associated costs), the return can be massive. The potential profit is virtually unlimited if the company becomes a breakout success, while the maximum loss is limited to the initial investment.
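
To make the comparison concrete, here is a minimal sketch in Python of the two payoff profiles. The strike price, premium, cheque size, and exit multiples are made-up numbers for illustration, not figures from any real option or deal.

```python
# Hypothetical numbers for illustration only.

def call_option_profit(stock_price: float, strike: float, premium: float) -> float:
    """Profit from a call option at expiry: upside is uncapped, loss is capped at the premium."""
    return max(stock_price - strike, 0.0) - premium

def startup_investment_profit(exit_multiple: float, cheque: float) -> float:
    """Profit from a startup cheque: upside is uncapped, loss is capped at the cheque."""
    return cheque * exit_multiple - cheque

# Option: pay a $5 premium for the right to buy at a $100 strike.
for price in [80, 100, 120, 200]:
    print(f"stock at ${price}: option P&L = {call_option_profit(price, 100, 5):+.0f}")

# Startup: write a $1 cheque; outcomes range from a write-off to a 100x breakout.
for multiple in [0, 1, 10, 100]:
    print(f"{multiple}x exit: startup P&L = {startup_investment_profit(multiple, 1):+.0f}")
```

In both cases the loss is bounded while the gain is not, and that asymmetry is what makes the analogy work.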

VC and call options are strikingly similar, don’t you think? They’re like twins!

From now on, I’ll tell people: Venture capital is call options on startups.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Winning the Home Run Derby with Proper Portfolio Construction

TLDR – 20 companies in a VC portfolio is the optimal balance between risk and reward, offering a very high chance of hitting outsized returns without significant risk of losing money. This is exactly the approach we follow at Two Small Fish Ventures, as we keep our per-fund portfolio size limited to roughly 20 companies.

In my previous post, VC is a Home Run Derby with Uncapped Runs, I illustrated mathematically why early-stage venture funds’ success doesn’t hinge on minimizing failures, nor does it come from hitting singles (e.g., the number of “3x” companies). These smaller so-called “wins” are just noise.

As I said:

“Venture funds live or die by one thing: the percentage of the portfolio that becomes breakout successes — those capable of generating returns of 10x, 100x, or even 1000x.”

To drive high expected returns for VCs, finding these breakout successes is key. However, expected value alone doesn’t tell the full story. We also need to consider variance. In simple terms, even if a fund’s expected return is 5x or 10x, it doesn’t necessarily mean it’s a good investment. If the variance is too high—meaning the fund has a low probability of achieving that return and a high probability of losing money—it would still be a poor bet.

For example, imagine an investment opportunity that has a 10% chance of returning 100x and a 90% chance of losing everything. Its expected return is 10x (i.e., 10% x 100x + 90% x 0x = 10x). But despite the attractive expected return, it’s still a terrible investment due to the extremely high risk of total loss.

That said, there’s a time-tested solution to turn this kind of high-risk investment into a great one: diversification. While everyone understands the importance of diversification, the real key lies in how it’s done. By building a properly diversified portfolio, we can reduce variance while maintaining a high expected return. This post will illustrate mathematically how the right portfolio construction allows venture funds to generate outsized returns while ensuring a high probability of success.
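
As a quick illustration, here is a minimal Monte Carlo sketch in Python of the hypothetical bet above (a 10% chance of returning 100x, a 90% chance of losing everything). The portfolio sizes and trial count are arbitrary assumptions; the point is that the expected multiple stays around 10x while the chance of the overall portfolio losing money shrinks as the number of independent bets grows.

```python
# Simulate a portfolio of identical, independent bets:
# each has a 10% chance of returning 100x and a 90% chance of returning 0x.
import random

def simulate_fund(n_companies: int, trials: int = 100_000) -> tuple[float, float]:
    """Return (average fund multiple, probability the fund returns less than 1x)."""
    losing_funds = 0
    total_multiple = 0.0
    for _ in range(trials):
        fund_multiple = sum(100 if random.random() < 0.10 else 0
                            for _ in range(n_companies)) / n_companies
        total_multiple += fund_multiple
        losing_funds += fund_multiple < 1
    return total_multiple / trials, losing_funds / trials

for n in [1, 5, 20, 100]:
    avg, p_loss = simulate_fund(n)
    print(f"n = {n:>3}: expected multiple ≈ {avg:.1f}x, P(fund < 1x) ≈ {p_loss:.0%}")
```

Diversification doesn’t change the expected value of the bet; it changes how likely you are to actually experience it.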

Moonshot Capital vs. PlayItSafe Capital: A Quick Recap

Let’s start by revisiting our two hypothetical venture capital firms: Moonshot Capital and PlayItSafe Capital. Moonshot Capital swings for the fences, aiming to find the next 100x company while expecting most of the portfolio to fail. PlayItSafe Capital, on the other hand, protects downside risk (at least that’s what they think), but by avoiding bigger risks, it sacrifices the chance of finding outsized returns.

Moonshot Capital: Out of 20 companies, 16 resulted in strikeouts (0x returns), 3 companies achieved 10x returns, and 1 company achieved a 100x return.

PlayItSafe Capital: Out of 20 companies, 7 resulted in strikeouts (0x returns), 7 companies broke even (1x), 5 companies achieved 3x returns, and 1 company achieved a 10x return.

Here’s how their expected returns compare:

Moonshot Capital has an expected return of 6.5x, thanks to one company yielding 100x and three companies yielding 10x (i.e. (1 x 100 + 3 x 10 + 16 x 0) x $1 = $130).

PlayItSafe Capital has a much lower expected return of 1.6x, with its highest return from one 10x company, five 3x returns, and several breakeven companies (i.e. (1 x 10 + 5 x 3 + 7 x 1 + 7 x 0) x $1 = $32).

Despite these differences in expected returns, what’s surprising is that counterintuitively, the probability of losing money (i.e., achieving an average return of less than 1x at the fund level) is quite similar for both firms.

Let’s dive into the math to see how we calculate these probabilities:

Moonshot Capital: 12.9% Probability of Losing Money

1. Expected Return: μ = (1 x 100 + 3 x 10 + 16 x 0) / 20 = 6.5

2. Variance: σ² = (1 x 100² + 3 x 10² + 16 x 0²) / 20 − 6.5² = 515 − 42.25 = 472.75

3. Standard Deviation: σ = √472.75 ≈ 21.74

4. Standard Error: SE = σ / √20 ≈ 4.86

Using a normal approximation, the z-score to calculate P(X < 1) is:

z = (1 − 6.5) / 4.86 ≈ −1.13

Looking this up in the standard normal distribution table gives us:

P(X < 1) = 0.129 or 12.9%

PlayItSafe Capital: 11.6% Probability of Losing Money

Similarly, looking this up in the standard normal distribution table gives us (sparing you all the equations):

P(X < 1) = 0.116 or 11.6%

Shockingly, these two firms’ probabilities of losing money are essentially the same. The math does not lie!
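
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same normal approximation. It treats each fund as 20 equal cheques with the outcome mixes described above and reproduces the roughly 12.9% and 11.6% figures.

```python
# Normal approximation of the probability that a fund's average multiple falls below 1x.
from math import erf, sqrt

def p_fund_below_1x(outcomes: dict[float, int]) -> float:
    """outcomes maps a per-company multiple to how many companies return it."""
    n = sum(outcomes.values())
    mean = sum(m * count for m, count in outcomes.items()) / n
    variance = sum(m ** 2 * count for m, count in outcomes.items()) / n - mean ** 2
    standard_error = sqrt(variance / n)          # spread of the 20-company average
    z = (1 - mean) / standard_error              # z-score for P(average < 1x)
    return 0.5 * (1 + erf(z / sqrt(2)))          # standard normal CDF

moonshot = {100: 1, 10: 3, 0: 16}                # one 100x, three 10x, sixteen strikeouts
playitsafe = {10: 1, 3: 5, 1: 7, 0: 7}           # one 10x, five 3x, seven 1x, seven strikeouts

print(f"Moonshot:   P(fund < 1x) ≈ {p_fund_below_1x(moonshot):.1%}")    # ~12.9%
print(f"PlayItSafe: P(fund < 1x) ≈ {p_fund_below_1x(playitsafe):.1%}")  # ~11.6%
```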

Here’s a graphical representation of the outcomes (probability density) for Moonshot Capital and PlayItSafe Capital.

Probability Density Graphs: Comparing Moonshot and PlayItSafe

As you can see, Moonshot has higher upside potential, with its density peaking around the 6.5x expected return, while PlayItSafe is more concentrated around lower returns. Since their downside risks are more or less the same while PlayItSafe’s approach significantly limits its upside, PlayItSafe is, counterintuitively, far riskier from a risk-reward perspective.

Proper Portfolio Construction: How Portfolio Size Affects Returns

To further optimize Moonshot’s strategy, we will explore how different portfolio sizes affect the balance between risk and reward. Below, I’ve analyzed the outcomes (i.e. portfolio size sensitivity) for Moonshot Capital across portfolio sizes of n = 5, n = 10, n = 20, and n = 30.

The graph below shows the probability density curves for Moonshot Capital with varying portfolio sizes:

As you can see, smaller portfolios (n = 5, n = 10) exhibit higher variance, with a greater spread of potential outcomes. Larger portfolios (n = 20, n = 30) reduce the variance but also diminish the likelihood of hitting outsized returns.

Why 20 is the Optimal Portfolio Size

1. Why 20 is Optimal:

At n = 20, Moonshot Capital strikes an ideal balance. The risk of losing money, i.e. P(X < 1), remains manageable at 12.9%, while the probability of outsized returns remains high: a 62.1% chance of hitting a return higher than 5x. This suggests that Moonshot’s high-risk, high-reward approach pays off without exposing the fund to unnecessary risk.

2. Why Bigger Isn’t Always Better (n = 30):

When the portfolio size increases to n = 30, we see a significant drop-off in the likelihood of outsized returns. The probability of achieving a return higher than 5x drops significantly from 62.1% at n = 20 to 41.9% at n = 30, and counterintuitively, the risk of losing money starts to increase (see the sketch after this list). This suggests that larger portfolios can dilute the impact of the big wins that drive fund returns. It also mathematically explains why “spray-and-pray” does not work for early-stage investments.

3. The Pitfalls of Small Portfolios (n = 5 and n = 10):

At smaller portfolio sizes, such as n = 5 or n = 10, the variance increases significantly, making the portfolio’s returns more unpredictable. For example, at n = 5, the probability of losing money is significantly higher, and the risk of extreme outcomes becomes more pronounced. At n = 10, the flatter curve shows that the variance is still very high. This high variance means the returns are volatile and difficult to predict, increasing risk.
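
Here is a minimal sketch of the same normal approximation applied to the portfolio-size comparison. My reading of the n = 30 scenario above is that the fund still only uncovers the same four winners (one 100x and three 10x) and simply adds more strikeouts, which is what dilutes the big wins; that assumption, plus the normal approximation, reproduces the probabilities cited above up to rounding.

```python
# Compare n = 20 and n = 30 portfolios that contain the same four winners,
# with every additional company assumed to be a strikeout.
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    return 0.5 * (1 + erf(z / sqrt(2)))

def fund_odds(outcomes: list[float]) -> tuple[float, float]:
    """Return (P(fund < 1x), P(fund > 5x)) for the average of these per-company multiples."""
    n = len(outcomes)
    mean = sum(outcomes) / n
    variance = sum(m ** 2 for m in outcomes) / n - mean ** 2
    standard_error = sqrt(variance / n)
    return normal_cdf((1 - mean) / standard_error), 1 - normal_cdf((5 - mean) / standard_error)

winners = [100, 10, 10, 10]
for n in (20, 30):
    portfolio = winners + [0] * (n - len(winners))
    p_loss, p_above_5x = fund_odds(portfolio)
    print(f"n = {n}: P(fund < 1x) ≈ {p_loss:.1%}, P(fund > 5x) ≈ {p_above_5x:.1%}")
```

The smaller portfolios (n = 5 and n = 10) depend on additional assumptions about how many winners a tiny portfolio can hold, so I have left them out of this sketch.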

Conclusion: How to Win the Home Run Derby With Uncapped Runs

The key takeaway here is that Moonshot Capital’s strategy of swinging for the fences doesn’t mean taking on excessive risk. With 20 companies in the portfolio, Moonshot strikes the optimal balance between risk and reward, offering a very high chance of hitting outsized returns without significant risk of losing money.

While n=20 is optimal, n=10 is also pretty good, but n=30 is significantly worse. So, a ‘concentrated’ approach – but not ‘n=5 concentrated’ – is far better than ‘spray and pray,’ if you have to pick between the two.

This is exactly the approach we follow at Two Small Fish Ventures. We don’t write a cheque unless we have that magical “100x conviction.” We also keep our per-fund portfolio size limited to roughly 20 companies. This blog post mathematically breaks down one of our many secret sauces for our success.

Don’t tell anyone.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

WEBTOON IPO

I haven’t been involved with Wattpad for a while now, so it’s a strange feeling—though not in a bad way—to catch up on all the details about WEBTOON and Wattpad in the SEC filing. From what I’ve gathered, WEBTOON is performing exceptionally well, with revenue now surpassing $1 billion.

Three years ago, one of the main reasons I was drawn to Naver WEBTOON among all the suitors was Naver’s intention to spin out WEBTOON, together with Wattpad, as a separate, entertainment-focused, NASDAQ-listed company. This was a significant undertaking with numerous challenges, and the WEBTOON team is delivering on the promise. I’m pleased to see that Wattpad is playing a crucial role in this upcoming IPO.

The timing has turned out to be ideal for both WEBTOON and myself personally. With the rise of generative AI, the media industry is undergoing a new wave of massive disruption. It’s exciting to see WEBTOON raising more capital to seize this opportunity. From a distance, I wish the WEBTOON team all the best!

At Two Small Fish Ventures, we’re equally excited as we witness many incredible AI-native media startups and are actively investing in several amazing ones. I’ll share more about this in future posts.

This is a once-in-a-decade, platform-shift opportunity. It is arguably the biggest platform shift in the past century! TSF is actively investing in the next frontier of computing and its applications as a lead investor or as part of a syndicate. If you’re a founder of an early-stage AI-native company—media or not—don’t hesitate to reach out to us, as TSF is a rare investor who understands this space extremely well, and possibly the best investor with real-world operating experience who can help you achieve massive success like Wattpad did.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The depressing numbers of the venture-capital slump don’t tell the full story

Thank you to The Globe for publishing my second op-ed in as many weeks: The depressing numbers of the venture-capital slump don’t tell the full story.

The piece is now available in full here:

Bright spots in the current venture capital landscape exist. You just need to know where to look.

Recent reports are right. Amid high interest rates, venture capitalists have a shrinking pool of cash to dole out to hopeful startups, making it more challenging for those companies to raise funding. In the United States, for example, startup investors handed out US$170.6-billion in 2023, a decrease of nearly 30 percent from the year before.

But the headline numbers don’t tell the whole story.

There’s a night-and-day difference between raising funds for game-changing, deep-technology startups that specialize in artificial intelligence and related fields, such as semiconductors, and raising funds for those trying to innovate with what’s referred to as shallow tech.

Remember the late 2000s? Apple’s App Store wasn’t groundbreaking in terms of technical innovation, but it nonetheless deserves praise because it revolutionized the smartphone. Back then, the App Store’s charts were dominated by simplistic applications, from infamous fart apps to iBeer, the app that let you pretend you were drinking from your iPhone.

That’s the difference – those building game-changing tools and those whose products are simply trying to ride the wave.

Tons of startups are pitching themselves as AI or deep-tech companies, but few actually are. This is why many are having trouble raising funds in the current climate.

It’s also why the era of shallow tech is over, and why deep-tech innovations will reshape our world from here on out.

Toronto-based Ideogram, a deep-tech startup, was the first in the industry to integrate text and typography into AI-generated images. (Disclosure: This is a company that is part of my Two Small Fish Ventures portfolio. But I’m not mentioning it just because I have a stake in it. The company’s track record speaks for itself.)

Barely one year old, the startup has fostered a community of more than seven million creators who have generated more than 600 million images. It went on to close a substantial US$80-million Series A funding round.

As a comparison, Wattpad, the company I founded, which later sold for US$660-million, had raised roughly US$120-million in total. Wattpad’s Series A in 2011, five years after inception, was US$3.5-million.

The speed at which Ideogram achieved so much in such a short period of time is eye-popping.

The “platform shifts” over recent decades have largely played out in the same way. From the personal-computer revolution in the late 20th century to the widespread adoption of the internet and cloud computing in the 2000s, and then the mobile era in the 2010s, there’s a clear pattern.

Each shift unleashed a wave of innovation to create new opportunities and fundamentally reshape user behaviour, democratize access and unlock tremendous value. These shifts benefited the billions of internet users and related businesses, but they also paved the way for “shallow tech.”

The late 2000s marked the beginning of a trend where ease of creation and user experience overshadowed the depth of innovation.

When Instagram launched, it was a straightforward photo-sharing app with just a few attractive filters. Over time, driven by the massive amounts of data it collected, it evolved into one of the leading social media platforms.

This time is different. The AI platform shift makes it harder for simplistic, shallow-tech startups to succeed. Gone are the days of building a minimally viable product, accumulating vast data and then establishing a defensible market position.

We’re entering the golden age of deep-tech innovation, and in order to be successful, startups have to embrace the latest platform shift – AI. And this doesn’t happen by tacking on “AI” to a startup’s name the way many companies did with the “mobile-first” rebrand of the 2010s.

In this new era, technological depth is not just a competitive advantage but also a fundamental pillar for building successful companies that have the potential to redefine our world.

For example, OpenAI and Canada’s very own Cohere are truly game-changing AI companies that have far more technical depth than startups from the previous generation. They’ve received massive funding partly because the development of these kinds of products is very capital-intensive but also because their game-changing approach will revolutionize how we live, work and play.

Companies like these are the bright spots in an otherwise gloomy venture-capital landscape.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Software Once Ate the World Alone; Now, Software and Hardware Consume the Universe Together

Over a decade ago, in his blog post titled “Why Software is Eating the World,” Marc Andreessen explained why software was transforming industries across the globe. Software would no longer be confined to the tech sector; it would permeate every aspect of our lives, disrupting traditional businesses, creating new opportunities, driving innovation, and reshaping the competitive landscape. Overall, the post underscored the profound impact of software on the economy and society at large.

While the prediction in his blog post was mostly accurate, the world today is still only partially eaten by software. Although there are still opportunities for software alone to completely transform user behaviour, upend workflows, or cause other disruptions, the low-hanging fruit has mostly been picked. That’s why I said the days of shallow tech are behind us now.

Moving forward, there will be more and more opportunities that require hardware and software to be designed and developed together from the get-go so that they work harmoniously and make an impact that otherwise would not be possible. The best example people can relate to today is Tesla. For those who have driven a Tesla, I trust many would testify that the software and hardware work really well together. Yes, the self-driving software might be buggy. Yes, the build quality of the hardware might not be the best. However, with many features on their cars – from charging to navigation to even warming up the car remotely – you can just tell that they are not shoehorning their software and their app into their hardware, or vice versa.

On the other hand, on many cars from other manufacturers, you can tell their software and hardware teams are separated by the Grand Canyon and perhaps only seriously talk to each other weeks before the car is launched 🙂

We see the same thing all the way down to the silicon level. From building the next AI chip to the industrial AI revolution to space tech, software and hardware convergence is happening everywhere. For instance, LLMs consume so much energy partly because the software has to “work around” hardware that was not designed with AI in mind in the first place. Changes are already underway to ensure that software and hardware dance together. There is a reason why large tech players like OpenAI and Google are planning to make their own chips.

We are in the midst of a once-in-a-decade “platform shift” because of generative AI. In the last platform shift more than a decade ago, when the confluence of mobile and cloud computing created a massive disruption, there was one “iPhone moment,” and then things progressed continuously. This time, new foundation models are launching at a breakneck pace, further accelerated by open source. The pace is so fast that we are now experiencing an “iPhone moment” every few weeks.

All of this is happening while AI-native startups are an order of magnitude more capital-intensive than those of the past cycle. At the same time, investors are willing to write big cheques to these companies, and perhaps that is appropriate, given all the massive opportunities ahead of us.

Investing in this environment is both exciting and challenging, as assessing these new opportunities is drastically different from assessing the previous generation of software-only, shallow-tech startups.

The next few years are going to be wild.

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

The Right Type of Investors

Most of Two Small Fish Ventures’ portfolio companies are based in North America. However, we also invest globally, as we firmly believe that global companies can be built anywhere. To us, where founders and their teams sleep at night is irrelevant to their potential for greatness.

Consequently, we actively engage with many tech ecosystems, regardless of their size. A pervasive issue we’ve encountered across these ecosystems is the challenge entrepreneurs face in finding investors who provide not just capital but the right kind of support. This problem is more acute in less developed ecosystems, but even those that are more established are not exempt.

An investor from another ecosystem eloquently discussed this issue in an article. I couldn’t have said it better myself, so with her permission, I’m sharing her insights here, albeit anonymized to avoid casting any ecosystem in a negative light. After all, this challenge is universal:

There are plenty of rich people and “wantrepreneur” investors in our community, but most of them have made their fortune in real estate, finance, or other traditional sectors. They have great intentions, but unfortunately they do not have experience in investing in technology and innovations. Some of them would take too much equity ownership. Some of them have conflicts of interest, pursuing their own agendas and pushing their founders to work on the products or customers that they want. Some are so risk averse that they structure their startup investment as if it were a personal loan. We have seen our startup founders take money from these investors, and it almost always ends in disaster.

What our community really needs are startup investors who have “been there and done that.” Otherwise, we will continue to be stuck in this vortex of the wrong investors investing in the wrong companies. We need investors who truly understand the startup founders’ blood, sweat, and tears. Someone who knows how to be a guide and a coach. Someone who knows how to provide advice, connections, and funding only when the founder really needs it.

To achieve this goal, we need to invite investors from established ecosystems to teach local investors the best practices in venture investing. And we do believe these skills can be learned. The local investor community needs the knowledge and skills to make investment decisions that maximize the founders’ success and, therefore, their own chances of success.

Investing in innovation significantly differs from other forms of investment. For instance, real estate investments have established methods to evaluate rental yields, and traditional businesses use EBITDA to estimate enterprise values. However, early-stage startups, particularly those disrupting the status quo, cannot be evaluated using these metrics because of their lack of yields or EBITDA, or even clear business models! 

Often, experienced investors from other sectors mistakenly apply the same approach when they invest in tech startups, leading to almost certain failure. This can result in many problems, such as a messy cap table, rendering the startup unfundable in future rounds and causing it to “die young” despite its potential. We’ve regrettably had to pass on numerous investment opportunities due to such issues.

As the quoted investor highlighted, learning the skills and best practices in tech investing is possible. Needless to say, the best way to do this is to learn from people who have “been there and done that.” It’s crucial to acknowledge that investing in tech startups – and innovations in general – is a different sport than other sectors. 

After all, bringing a tennis racket to a hockey game is a recipe for disaster.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

VC is a Home Run Derby with Uncapped Runs

There’s an old saying that goes, “Know the rules of the game, and you’ll play better than anyone else.” Let’s take baseball as our example. Aiming for a home run often means accepting a higher number of strikeouts. Consider the legendary Babe Ruth: he was a leader in both home runs and strikeouts, a testament to the high-risk, high-reward strategy of swinging for the fences.

Yet, aiming solely for home runs isn’t always the best approach. After all, the game’s objective is to score the most runs, not just to hit the most home runs. Scoring involves hitting the ball, running the bases, and safely returning to home base. Sometimes, it’s more strategic to aim for a base hit, like a single, which offers a much higher chance of advancing runners on base and scoring.

The dynamics change entirely in a home run derby contest, where players have five minutes to hit as many home runs as possible. Here, only home runs count, so players focus on hitting just hard enough to clear the fence, rendering singles pointless.

Imagine if the derby rules also rewarded the home run’s distance, adding extra runs for every foot the ball travels beyond the fence. For context, the centre-field fence is typically about 400 feet from home plate. So, a 420-foot home run, clearing the centre-field fence by 20 feet, would count as a 20-run homer. This rule would drastically alter players’ strategies. Not only would they swing for the fences with every at-bat, but they would also hit as hard as possible, aiming for the longest possible home runs to maximize their scores, even if it reduced their overall chances of hitting a home run.

This scenario mirrors early-stage venture capital, where I liken it to a home run derby with uncapped runs. The potential upside of investments is enormous, offering returns of 100x, 1000x, or more, while the downside is limited to the initial investment. Unlike in a derby, where physical limits cap the maximum score, the VC world is truly without bounds, with numerous instances of investments yielding thousandfold returns.

This distinct dynamic makes assessing VCs fundamentally different from evaluating other asset classes, where protecting the downside is crucial. In the VC realm, the potential for nearly limitless returns makes losses inconsequential, provided VCs invest in early-stage companies with the potential for exponential growth. The risk-reward equation in venture capital is thus highly asymmetrical, favouring bold bets on moonshot startups.

For illustration, let’s consider two hypothetical venture capital firms: Moonshot Capital and PlayItSafe Capital.

Moonshot Capital approaches the game like a home run derby with uncapped runs. They aim for approximately 20 companies in their portfolio, expecting that around 20% will be their home runs—or “value drivers”—capable of generating returns from 10x to 100x or more. 

Imagine they invest $1 in each of 20 companies. One yields a 100x return, three bring in 10x, and the remaining 16 are strikeouts. The outcome would be:

(1 x 100 + 3 x 10 + 16 x 0) x $1 = $130

Their $20 investment becomes $130 (or 6.5x), a gain of $110, despite 16 out of 20 companies being strikeouts. Yes, you read that right: 80% of the portfolio companies failed!

PlayItSafe Capital, on the other hand, prioritizes downside protection, ensuring none of the portfolio fails but also avoiding riskier bets. In the end, one company generates one “10x” return, five companies return 3x, and the remainder is equally split between breakeven and failing.

(1 x 10 + 5 x 3 + 7 x 1 + 7 x 0) x $1 = $32

Despite several “successes” and very few “losses,” the fund’s return of $12 pales in comparison to Moonshot Capital’s. Even increasing the number of companies generating a 3x return to 10 with no loss (which is almost impossible to achieve for early-stage VCs) only yields a $29 gain from a total investment of $20:

(1 x 10 + 10 x 3 + 9 x 1) x $1 = $49

No one should invest in the early-stage VC asset class with the expectation of such a paltry return.
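
For readers who like to see the arithmetic laid out, here is a minimal Python sketch of the fund-level math above, assuming an equal $1 cheque into every company. The outcome lists simply restate the three scenarios described in this post.

```python
# Fund multiple = total dollars returned / total dollars invested, with $1 per company.
def fund_multiple(outcomes: list[float]) -> float:
    return sum(outcomes) / len(outcomes)

moonshot = [100] + [10] * 3 + [0] * 16                 # $130 back on $20
playitsafe = [10] + [3] * 5 + [1] * 7 + [0] * 7        # $32 back on $20
playitsafe_no_losses = [10] + [3] * 10 + [1] * 9       # $49 back on $20

for name, fund in [("Moonshot", moonshot),
                   ("PlayItSafe", playitsafe),
                   ("PlayItSafe, no losses", playitsafe_no_losses)]:
    print(f"{name}: {fund_multiple(fund):.2f}x on a $20 fund")
```

Even the loss-free version of PlayItSafe only returns about 2.45x, which is why the home runs, not the singles, drive the fund.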

As illustrated, success isn’t about minimizing failures, nor is it about the number of “3x” companies or even the number of “unicorn logos” in the portfolio; how early the investment in those unicorns was made matters just as much. One needs to invest in a unicorn when it is still a baby unicorn, not after it has become one.

In summary:

Venture funds live or die by one thing: the percentage of the portfolio that becomes “value drivers”, i.e. those capable of generating returns of 10x, 100x, or even 1000x.

At Two Small Fish Ventures, we are the IRL version of Moonshot Capital. Every investment is made with the belief that $1 could turn into $100. We know that, in the end, only about 20% of our portfolio will become significant value drivers. Yet, at the time we invest, we truly believe each of these early-stage companies has the potential to become a world-class giant and category creator.

This is what venture capital is all about: not only is it exhilarating to be at the forefront of technology, but it’s also a great way to generate wealth and, more importantly, play a role in supporting moonshots that have a chance to change how the world operates.

P.S. This is Part 1 of this series. You can read Part 2, “Winning the Home Run Derby with Proper Portfolio Construction” here.

This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.

Assessing Different Asset Classes

Diversifying a portfolio across various asset classes is the first principle for enhancing returns without significantly increasing risk from an investment standpoint. Traditionally, the go-to formula has been a 60/40 split—60% in stocks and 40% in bonds, a practice primarily due to the limited accessibility of alternative asset classes. However, recent years have seen a democratization of access to a wider array of asset classes, including private equity, venture capital and numerous alternatives, opening doors for more investors to explore areas once reserved for the privileged few. This broadening of opportunities is undoubtedly beneficial to many.

Yet, it introduces a new challenge: How do we assess fund managers across different asset classes? This task can be daunting even for seasoned investment professionals, as investing encompasses a vast range of specialties. A common mistake is posing the wrong questions, as assessment criteria are not interchangeable across asset classes. It is like comparing athletes from different sports: evaluating NBA players is not the same as evaluating MLB players, because each asset class is a distinct sport. For instance, inquiring about the batting average of an Olympic gold medalist swimmer is as illogical as expecting an NBA MVP to be proficient with a baseball bat.

It’s also unwise to question a fish on its ability to skate!

This blog post is the first in a series designed to demystify this process. I do not claim expertise in all asset classes—no one can. However, I hope to share my experiences to help you sidestep common mistakes and empower you with the basics to evaluate investment opportunities in unfamiliar territories, especially early-stage venture capital, which is my swim lane and one that relatively few people have the experience to assess. Please note, this blog post does not constitute investment advice or a comprehensive guide to all asset classes, as we only cover a handful for illustration purposes.

Here is a chart that highlights the key differences:

How should you interpret this chart? Let me use early-stage venture capital, or simply referred to as VC, as an example.

Assessing VC is more art than science and more qualitative than quantitative. It offers far higher return potential than almost any other asset class. On the other hand, the risk of losing money is also higher than in other asset classes, with the predictability of the potential target return being low and its variance high.

Individual investments within a fund portfolio have a very high failure rate, even for the best funds. This is by design because VC is a home run derby. Strikeouts, singles, or doubles don’t impact the return at all, as only the home runs count. This is unique to VC and counterintuitive to managers from other asset classes.

The dispersion among fund managers is also much higher, as the top-decile funds generate significantly better returns than the rest. Vintages also have a far more significant influence, as market downturns have an outsized impact on fund returns, even for the best funds. However, the best funds still generate very good returns during bad years. These funds simply generate enormous returns during the good years!

VC takes a decade or more to generate returns. The first few years usually have nothing to show because it takes a few years to find the startups to invest in, and those companies take time to grow and realize the gains. Because of this, VC funds are usually illiquid.

On the other extreme, fixed income is more science than art. It is number-driven, much more predictable, and has lower returns, but any default is a cardinal sin!

Each row on the chart deserves a separate blog post. Stay tuned for subsequent posts in this series, where we’ll dive deeper into these topics.


The Second Act, the Third Act and the Fourth Act

One of the only three things a CEO does is to “make sure there is enough cash in the bank” (see job #3 here). Although CFOs may be responsible for much of the heavy lifting, keep in mind that a CEO’s job #1 is to communicate vision and strategy to all stakeholders, which certainly includes potential and existing investors. It is very hard to raise capital to build a great company without great storytelling skills, something almost all great CEOs possess.

Clearly communicating a bold vision is especially important for early-stage venture-backed companies. These companies are usually pre-revenue, pre-product-market-fit, and definitely pre-scaling. From the VCs’ perspective, they invest not only in where the company is today, but also in where the company would, could, and should be. In many cases, investors buy into the company’s future second, third, and fourth acts, as very few great companies are one-trick ponies.

SRTX is the perfect example. Last week, we went to the grand opening of their mega-factory in Montreal. To my knowledge, it is now the largest textile factory in Canada. The pictures and videos don’t do justice to the massive scale of this facility.

This is especially impressive when you know that 180 days ago, when they took over the facility, the roof was leaking, there were no walls, and there was no electricity. The SRTX team moved mountains, rock by rock and at lightning speed, to get the factory ready for production. 

I wish I could share some pictures inside the factory. Unfortunately, I can’t share their secret sauce. If you really want to have an insider view, you have to become an investor 😉

It took SRTX seven years from inception to begin evolving into a fully verticalized behemoth. Through innovations in advanced materials, hardware, and software, they deliver traceability, sustainability, durability, and cost advantages, which now give them an “unbreakable” advantage – pun fully intended!

Today, millions of pairs of Sheertex unbreakable pantyhose have been sold, and they have become THE best-selling pantyhose, unbreakable or otherwise, in North America. Not bad for a company that, when Two Small Fish Ventures invested, was a 15-person operation based in Bracebridge, Ontario, a town of 15,000 people a two-hour drive north of Toronto!

Now, they are ready to license the IP behind their rip-resistant technology to other textile companies. That’s their second act. Watertex, one of the world’s most hydrophobic polymers, engineered for unparalleled water resistance in applications such as swimwear, is their third act. Other IP is in the works; I would call that their fourth act.

But please don’t use the word pivot here. Pivot implies ‘nothing works, let’s try something else.’ From the early days, Katherine was very clear that selling pantyhose online was the necessary first act to give her the economies of scale she needed before she could begin her second, third, and fourth acts. What we see today is exactly how she articulated her bold vision when we invested in the seed round five years ago. We bought into her vision, joined the journey, and now what she told us is becoming a reality. We wouldn’t have invested in a company that was merely selling pantyhose online, even if millions were being sold.

The power couple, Katherine Homuth and Zak Homuth, are not your typical founders. SRTX is rewriting the rules of textiles through innovations. I can’t wait to watch the second, third, and fourth acts unfold right before our eyes from my front-row seat.


Founding CEOs vs. Professional CEOs

Silicon Valley’s founder-CEO worship definitely has its merits. As a CEO backed by many Valley VCs, I immersed myself in that view for decades (e.g., Ben Horowitz’s Why We Prefer Founding CEOs). I get it, I understand where it comes from, and I mostly agree. That’s why TSFV backs founding CEOs almost exclusively.

Great founding CEOs tend to have all three traits: 1) comprehensive knowledge of the entire company (including every employee, product, technology decision, and customer data point, plus the strengths and weaknesses of both the code base and the organization), 2) moral authority, and 3) total commitment to the long term. Professional CEOs often don’t.

On the other hand, being a great CEO is more than just starting a company. It’s a super stressful job that nobody can learn overnight, and running a company with hundreds or thousands of employees is definitely a different ball game than being a founding CEO of a five-person company. However, founders who can’t scale with the company can’t stay in the captain’s chair forever.

If the two jobs are so different, why do we still prefer founding CEOs, even though many are learning on the job? Because it gives the company the best chance to become ultra-successful.

Typically, a company goes through four stages of growth. I call them the “4S’s”:

  • Start: where everything begins, with just the co-founders and a tiny team.
  • Sprout: achieving product-market fit, with the CEO calling most of the shots in a mostly informal setting.
  • Scale: rapid growth, hiring functional leaders, building depth, and starting to establish business processes. This is often where founder CEOs, especially first-time founder CEOs, stumble as they might lack experience in hiring and leading large teams.
  • Success: achieving a major milestone like an IPO or a massive liquidity event.

But the growth of a company isn’t a waterfall. An innovation company can’t stop innovating once its (first!) product has achieved product-market fit and cannot simply switch gears overnight to focus on business optimization. The most successful companies aren’t one-trick ponies; they need second and third acts long after their first product takes off.

Based on my own experience and my observation of hundreds of CEOs’ personal growth, I can confidently say that it’s far easier for a founding CEO to learn leadership than for a professional hire to become innovative and visionary. When the company hits scale-up mode, a founding CEO’s leadership needs to be solid, but any gaps can be filled by hiring strong leaders. Most founders can successfully make this jump.

On the flip side, pushing someone to be innovative and visionary is much harder, as is finding a team of leaders who can fill that gap for a professional CEO. That’s why it’s tougher for professional CEOs to succeed, though it’s not impossible. It is also possible to hire an “entrepreneurial” professional CEO, although they are rare gems.

However, this is all fairly generalized. Generalization tends to default to pattern recognition without thoughtful consideration of the specifics of the company’s situation. The ideal scenario is a founding CEO leading all the way, but sometimes, if a professional CEO is the only option, that’s what we have to work with.

The good news for TSFV’s portfolio CEOs is that you’ve got a founding CEO who’s been through it all – me! These days, I spend a lot of time helping founding CEOs fast-track their learning to operate more effectively on the job. For our professional CEOs, I offer guidance to help them think and act more like founders. Helping our portfolio CEOs is the best use of my time to ensure our portfolio companies’ success. It is also extremely high-leverage, because sometimes even a 30-minute conversation with me can help change the trajectory of a company. After all, if our CEOs aren’t successful, it’s nearly impossible for our portfolio companies to be successful, isn’t it?


A New Year Begins: Chasing More Olympic Gold Medals

It has been three years this month since Wattpad was at the centre of one of the largest tech acquisitions in Canadian history. At that time, as team captain, I celebrated an Olympic gold medal win along with the amazing Wattpad team.

Today, a year and a half has passed since I stepped aside from my CEO role, a position I held for 15 years after founding the company. Even a few years before the acquisition, I had already decided it would be my last stint as a CEO. As much as I loved the role, the idea of starting another company from scratch did not appeal to me; I didn’t want to repeat the same journey over and over again. That’s why I said it was the final curtain call of my career as a CEO. There was no ‘never say never’ in my decision.

But if you think I would simply sail into the sunset, you are mistaken. That is simply not who I am.

I am naturally a very curious person, always eager to understand how things work. My interests span a wide range of science and technology, from software to semiconductors, quantum to telecom, and everything in between. That’s my obsession.

To me, being ‘the coach’ of a winning team is far more fulfilling than being ‘the captain’ one more time. It is a different challenge, yet it fully utilizes my knowledge, skills, and experience in scaling from 0 to 100. Moreover, the timing couldn’t be better, as we are experiencing a once-in-a-decade ‘platform shift’ amid global AI disruption across all industries. Having pioneered AI-driven storytelling at Wattpad, I consider AI one of my superpowers!

But why limit myself to just one team? Supporting multiple amazing teams simultaneously in building world-class, iconic tech giants and category creators is even better!

It’s a long-winded way of saying that after a year and a half in my post-CEO life, I can 110% confirm that being a venture capitalist is my dream vocation. I can do this forever!

The beast is now fully awakened. My burning desire for more wins has never been stronger. I feel like I am going to the Olympics again, only this time as an investor. Look forward to an amazing 2024, when TSFV and our portfolio companies bring home more gold medals.

Happy New Year, everyone!


It is Time to Convince Canadians Canada is Great!

I was reading a report from the investment firm LetkoBrosseau, which highlights how little the Canadian pension system invests in Canada. Their headline caught my attention:

“Canada Has Cut Back On Investing In Its Greatest Asset – Itself.”

Canadian pension funds largely invest our money outside of Canada. Given Canada’s population size, it’s not unreasonable for our pension funds to look abroad, but the pendulum may have swung too far. That’s a topic for another day, however.

One particular slide, slide 4, jumped out at me, presenting several not-too-fun facts:

  • Canada’s GDP per capita has steadily declined to 75% of that of the United States, down from near parity 40 years ago. One of the main reasons is that Canada invests substantially less in its own startups, R&D, and workers.
  • In 2023, American investment per worker was 2.25x that in Canada. It was near parity 40 years ago.
  • In R&D intensity (the ratio of a country’s R&D expenditure to its GDP), the US is at 3.5%, Japan at 3.3%, Germany at 3.1%, the G7 average at 2.6%, and France at 2.4%. Canada lags at 1.9%.
  • Canada is underinvesting in its own startups: for every dollar Canada invests in venture capital, Israel invests $2 (despite its economy being about a quarter the size of Canada’s) and the US invests $39. On a per capita basis, that means Israel invests roughly 8 times more than Canada and the US roughly 4 times more (see the quick back-of-the-envelope check after this list).
  • Moreover, Canadians provide only about one-third of the funding for their own startups, with the remaining two-thirds coming from other countries. At Wattpad, we observed a similar ratio. Our largest investors were Union Square Ventures (NYC), Khosla Ventures (Silicon Valley), OMERS (Canada), August Capital (Silicon Valley), and Tencent (Asia). As you can see, most of them are not Canadian, highlighting a limited appetite for investing in our own innovative ventures.
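
To sanity-check those per-capita multiples, here is a minimal back-of-the-envelope sketch. The population figures are rough assumptions of mine (about 40 million for Canada, 9.7 million for Israel, and 335 million for the US), not numbers taken from the LetkoBrosseau report:

```python
# Rough per-capita check of the VC investment multiples cited above.
# Population figures below are approximate assumptions, not from the report.
population = {"Canada": 40_000_000, "Israel": 9_700_000, "US": 335_000_000}

# Dollars invested in VC for every $1 that Canada invests (as cited above).
vc_per_canadian_dollar = {"Israel": 2, "US": 39}

for country, dollars in vc_per_canadian_dollar.items():
    # Divide by relative population to get a per-capita multiple vs. Canada.
    per_capita_multiple = dollars / (population[country] / population["Canada"])
    print(f"{country}: ~{per_capita_multiple:.1f}x Canada on a per-capita basis")
```

With these assumptions, the script prints roughly 8x for Israel and between 4x and 5x for the US, consistent with the figures above.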

But it’s not just about pension funds. The awareness and appetite to invest in venture capital as an asset class are significantly lower among family offices and endowments in Canada. For example, in the US, it’s not uncommon for university endowments to allocate over 20% to VC. In Canada, many are at zero or in the low single digits.

But it all depends on whether you’re a glass-half-full or glass-half-empty person.

I’m a glass-half-full person. This is clearly a market gap, and market gaps create opportunities.

A decade ago, when Wattpad began raising capital from Silicon Valley, Valley VCs didn’t ask me ‘if’ I would move the company there; they asked ‘when.’ I told them, ‘I won’t move.’ They were all surprised when I explained that building the company in Canada would be far better: less competition for talent, paradoxically allowing us to hire and retain top talent more easily. Wattpad was one of the first to commit to scaling in Canada, proving to them what was obvious to me, that a world-class tech company could be built here. The Wattpad team played a part in reshaping the narrative of Canada’s innovation ecosystem.

I am very committed to doing it again. This time, I’m not convincing people in the Valley that Canada is great; I’m convincing Canadians that Canada is great! My goal is to encourage more attention towards VC as an asset class. As a VC myself, I’m putting my money where my mouth is, and I will let our results speak for themselves. For many decades, Americans and Israelis have known that investing in top-tier VCs can help create world-class, iconic companies, benefiting their local economies significantly while also generating consistent, outsized returns. Canada can undoubtedly do the same.

This is my last post of the year. I’ll be “off the grid” until the new year, recharging for what promises to be a super busy 2024. Happy holidays!


Allen’s Thoughts 2.0

One of the most unusual practices I had as a CEO was writing a daily internal blog at Wattpad called “Allen’s Thoughts.” My preferred form of communication is the written word, a key reason behind co-founding Wattpad.

Although it might sound time-consuming – and it is – blogging helped me tremendously in clarifying my thinking. More importantly, context matters. The 30 to 60 minutes I spent writing each day helped me align and interact with hundreds of employees, arguably making it my highest-leverage use of time. Here is how I explained on Allen’s Thoughts why I needed to do this:

“Wattpad is an incredibly complex company. We are a tech company, a media company, a book publisher, an advertising company, an influencer network, an AI company, a movie studio, a social network, a community, and also an entertainment company that makes people happy.

What links us together is our common vision, mission, values, and culture. Allen’s Thoughts is less about the numbers and company updates, which you can get on Slack, email, Google Docs, or other channels. This blog is more about sharing the context, the whys, and the intangibles in a narrative that helps you navigate that complexity so that you can make the best possible decisions and do your best job.

This blog is one of my unique superpowers that connects everyone.”

I started Allen’s Thoughts in 2013 and stopped daily blogging after stepping down in May 2022. My final post, “IT’S THE FINAL CURTAIN CALL. A NEW STORY BEGINS,” was shared publicly on allensthoughts.com.

Do I miss it? Absolutely, yes. However, after writing half a million words, I was mentally exhausted.

After a long break, I am fully recharged and ready to reactivate my public blog. Although the Wattpad story is well-documented, many challenges and triumphs weren’t shared externally. These backstories are valuable case studies in business, leadership, entrepreneurship, venture capital and even time management. Re-reading my old posts, I realized they are a startup treasure trove, offering insights from scaling from two co-founders to a scaleup with hundreds of employees and 100 million users. I plan to share these lessons, along with many new topics.

Of course, I will also share my perspective on the startup investment landscape, our investment thesis, and our areas of focus – i.e., AI, protocols, and sustainable computing – among other topics.

This material will be part of our “School of Fish” Masterclass Series; more on this later.

I don’t plan to write daily. Frequency is not the most important aspect; it’s more about when inspiration strikes. My goal is to share high-quality, high-leverage, and impactful content. I will use Allen’s Thoughts to think things through “in public,” writing for my own enjoyment and hoping it benefits many others. After a hiatus, I’m eager, hungry, and excited to do it again!

P.S. This blog is licensed under a Creative Commons Attribution 4.0 International License. You are free to copy, redistribute, remix, transform, and build upon the material for any purpose, even commercially, as long as appropriate credit is given.