About

We invest in people we respect, trust, and admire, and who have set out to build iconic organizations in the most fundamental categories.

Core Theme

Fundamental Consumer Trends

Consumer technology and products are a meaningful part of any individual’s experience of the world. We invest in companies that enable consumers to have fundamentally better experiences and empower them to make better-informed decisions.

News

A.I. Needs Copper. It Just Helped to Find Millions of Tons of It

07/12

The deposit in Zambia could generate billions, provide minerals for the energy transition, and help the United States secure critical supply.


News

KoBold Metals discovers vast Zambian copper deposit

02/05

KoBold Metals has found Zambia’s largest copper deposit in a century. The company estimates that the Mingomba site in the northern Copperbelt province will become one of the world’s top three high-grade copper mines.

Portfolio Founder

Kurt House

Kurt House is CEO & Co-Founder at KoBold Metals. Kurt is based in San Francisco, USA.


Core Theme

Infrastructure Tech

We know that there are a few areas where fundamental and enduring infrastructure technology will be created and widely adopted in the coming years. We invest in companies that are building significant pieces of this infrastructure.

News

KoBold Metals valued at USD 2.96bn

01/01

KoBold Metals is leading the race for critical minerals needed for energy transition.

Portfolio Founder

David Ha

David Ha is CEO & Co-Founder at Sakana AI. David is based in Tokyo, Japan.

Research Overview Cards
World View

Hardware Development Tooling

Enabling the physical technology supercycle

February 2025

Infrastructure Tech

Our ongoing exploration of the hardware development stack, from AI-driven CAE to PCB automation, has consistently pointed us toward a fundamental challenge: the immense complexity of coordinating diverse tools, stakeholders, and workflows across the hardware development lifecycle. While individual design tools have evolved, the job of orchestrating these pieces – managing requirements, test data, manufacturing handoffs, and team collaboration – remains a major bottleneck.

As Western economies pour unprecedented capital into hardware innovation across aerospace, energy, and defense, an entirely new class of hardware companies is emerging. And they are building with fundamentally different expectations around tooling and development speed. The incumbent hardware solution stack fails to meet these heightened expectations – it is fragmented across systems, heavily manual, and lacks real-time visibility. 

As a result, we have seen many emerging hardware companies rolling their own solutions to solve internal and external orchestration across the development lifecycle. Stoke Space’s Fusion, an internal tool that they externalized, is one such effort. This trend, which we have seen inside several other companies, both signals the severity of existing tooling gaps and validates demand for better solutions.

As such, we see a Stripe-like opportunity to enable and capture a portion of the value created by this new class of companies through the type of critical, but boring, infrastructure that we have deemed “plumbers for reindustrialization” in other research.

We see three primary areas of opportunity for new companies at the orchestration layer:

Test Data & Observability: The proliferation of sensors and testing equipment has created data noise that existing tools can't handle effectively. Real-time analysis of test data, coupled with AI for anomaly detection and optimization – DevOps-like telemetry and monitoring – could transform validation processes that historically relied on manual review and tribal knowledge.

Unified Data & Collaboration Hubs (Next-Gen PLM): The shift toward distributed engineering teams and expansive supply chains (e.g. via outsourcing) has exposed the limitations of current tools. Engineers spend a material amount of their time on non-value-added work like converting files, updating documents, or searching for the latest designs. Modern cloud-based hubs that unify product data (requirements, CAD, tests) could dramatically improve productivity.

Manufacturing Orchestration: The gap between design and manufacturing is a major bottleneck. Tools that automate the translation of designs into manufacturing instructions and provide real-time feedback on manufacturability could significantly reduce iteration cycles and costs.
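To make the test data & observability opportunity above concrete, here is a minimal sketch of the kind of DevOps-style telemetry monitoring described: a streaming anomaly detector that flags sensor readings far outside a trailing window. All names and values are invented for illustration; real validation pipelines would be far richer.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    """Flag samples more than `threshold` standard deviations from the
    trailing window's mean -- a toy stand-in for real-time test-data
    monitoring with AI-assisted anomaly detection."""
    recent = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(samples):
        if len(recent) >= 2:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append((i, value))
        recent.append(value)
    return flagged

# Simulated pressure-sensor trace with one spike at index 30
trace = [100.0 + 0.5 * (i % 3) for i in range(60)]
trace[30] = 140.0
print(detect_anomalies(trace))  # -> [(30, 140.0)]
```

A production system would replace the rolling z-score with learned models and route flagged events into review workflows, but the shape of the pipeline (stream in, score, surface) is the same.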

New platforms built specifically for these emerging workflows – distributed by default, data-intensive by design, and automation-ready from the start – are naturally advantaged.

From a go-to-market perspective, focusing on emerging hardware companies helps orchestration companies avoid legacy processes and tooling and instead focus on shaping modern development workflows. These companies are building complex hardware under intense time (and capital) pressure – they need tools that can keep pace. As these tools prove their value to early adopters, they can expand both vertically (serving larger enterprises) and horizontally (connecting more parts of the development process). 

However, this means our thesis relies on this new class of hardware companies being a durable and scalable customer base. If the end game is dozens of sub-scale acquisitions and a select few successes – leaving today’s incumbent hardware OEMs as the primary market – the entrenchment of existing tooling and orchestration companies (from Siemens to Jama to PTC) will be harder to break.

Similar to what we have concluded in our research into AI-driven CAE, success doesn’t require displacing incumbent tools outright. Rather than competing head-on with entrenched CAD/CAE/PLM systems, new platforms can focus on making these tools work better together – becoming the connective tissue that coordinates modern hardware development. Once established as coordination layers, these platforms position themselves to expand their footprint over time.

The PLM and hardware development tooling market can already be measured in the tens of billions, but we believe the truly transformative companies will win by expanding the market and helping hardware companies iterate and build at the speed of software. This acceleration creates a powerful flywheel: faster development cycles enable more products, which drives increased tool usage and data generation, further improving development speed. Just as software development tools expanded their market by enabling faster iteration cycles, we believe the winners in hardware orchestration will grow the market by unlocking new levels of development velocity.

The risks are real – long sales cycles, integration complexity, and regulatory requirements in sectors like aerospace and defense. But we believe the confluence of market demand (driven by reindustrialization), technological convergence, and incumbent blindspots create a unique opportunity for new platforms to emerge.

July

Thesis

LLM Application Deployment: Resilience and Optionality

Today, the deployment of generative AI solutions into the enterprise, particularly at large companies, is well underway and often exceeds expectations.

January 2025

Infrastructure Tech

The generative AI era presents an interesting paradox – strong confidence in the directional arrow of technological progress (the ever-expanding and evolving LLM blast radius) coupled with significant uncertainty around the economic implications, both macro and micro. Today, the deployment of generative AI solutions into the enterprise, particularly at large companies, is well underway and often exceeds expectations.

At the same time, there is wide agreement that while early applications are driving positive ROI, most organizations face a significant change management problem in fully incorporating them into existing operational frameworks - “there are no AI-shaped holes lying around.” For many enterprises and their executives, this has led to a “flight to trust,” with large consulting firms benefitting from enterprise appetite to utilize generative AI. This uncertainty around future enterprise workflows is also reflected in the observation that most AI startups that have found traction have done so with an anthropomorphized approach, “selling the work” in areas like legal support and accounting – essentially building an end-to-end AI replica of what customers have come to expect from human + software.

While we think great businesses can be built there, this can’t be all. We believe that as organizations and society develop a better understanding of AI, we will build core workflows around this new paradigm, constantly adapting existing organizational structures and coming up with entirely new ones. We broadly differentiate between resilience and optionality and believe that both areas provide opportunities for interesting models and platforms to emerge.

Resilience focuses on enabling existing companies forced to adopt AI in secure and effective ways to stay competitive. As described above, these companies already have processes and employees; both might have a hard time adapting.

As with any complex system, we believe there is a unique opportunity by looking at the smallest unit in organizations - employees. While executives and consultants try to implement AI policies top-down, high-agency individuals (armed with ChatGPT, Claude, Cursor, and the latest tools) are constantly discovering productivity enhancements built around their idiosyncratic workflows, often utilizing these tools without explicit permission. We see an opportunity to push much of the work of identifying enterprise-specific best practices to these forward-thinking individuals and for a novel platform focused on this bottom-up approach to AI resilience to emerge.

In the process, such a platform could kill two birds with one stone. It provides a starting point for better data controls and security processes to manage risk while helping companies understand the financial implications (productivity improvements, cost structure changes, etc.) of continued AI deployment. 

Furthermore, monitoring and visibility in AI use by employees help enterprises gain insight into best practices (making AI fit into existing holes) that can be rolled out across the organization. The big opportunity that emerges from this wedge and model for enterprise trust building is that such a platform positions itself as we move toward a world of “spontaneous software” and possibly, “AI as the last employee” – similar to how Workday came to define ERP for the “digital transformation” era. 

Optionality focuses on building companies around novel organizational structures with a view to the upside, native to AI workflows and not possible before. 

This is an extension of what we previously wrote on “spontaneous software and SaaS on-demand”. In line with a recent post from Nustom that draws parallels from the autonomous vehicle market to propose the idea of a self-driving startup, we believe that there is a massive opportunity here for companies that operate like mobile game studios, making use of the reality that software is increasingly cheaper to write and startups cheaper to run as AI gets better and more capable at both. We expect these companies will excel at rapid experimentation and iteration, consistently positioning themselves ahead of the capability curve to try to catch lightning in a bottle (hits-driven), or, in combination, being long-tail driven with a large number of small cashflow-generating businesses under one roof.


Thesis

Digital Olfaction

Some of AI's greatest opportunities lie in its application to understanding and transforming the physical world.

January 2025

Infrastructure Tech

Some of AI's greatest opportunities lie in its application to understanding and transforming the physical world. We believe in the potential of convolutional neural networks, GNNs, and transformers to help us deal with this complexity and make sense of the world in ways that we have not been able to (we internally call these "expansion" use cases). This theme runs through several of our investments, most notably KoBold Metals. 

We believe that digital olfaction and a better understanding of how molecules make us smell are among those areas. Scent, due to its complexity, is our least understood sense. Novel approaches to machine learning, such as GNNs, have been proven to cut through this complexity and beat the human nose at scent profile classification based on molecule structures. Osmo, the company at the forefront of this research, has proven that it can utilize this understanding to develop novel scents. It is reasonable to assume that this technology will enable the faster development of novel molecules at lower cost and at scale. 
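The GNN idea referenced above can be illustrated with a toy message-passing pass over a molecular graph: atoms exchange feature information along bonds, and the node features are then pooled into a molecule-level embedding that a classifier head could map to scent labels. Everything here (atoms, features, weights) is invented for illustration; production systems like Osmo’s are vastly more sophisticated.

```python
# Toy message passing over a molecular graph. Each atom (node) mixes its own
# features with the average of its neighbors' features for a few rounds, then
# node features are mean-pooled into one molecule-level vector.
def message_passing(features, edges, rounds=2):
    for _ in range(rounds):
        updated = {}
        for node, feat in features.items():
            # Gather messages from all bonded neighbors (edges are undirected)
            neighbors = [features[m] for n, m in edges if n == node]
            neighbors += [features[n] for n, m in edges if m == node]
            if neighbors:
                agg = [sum(vals) / len(vals) for vals in zip(*neighbors)]
                updated[node] = [(f + a) / 2 for f, a in zip(feat, agg)]
            else:
                updated[node] = feat
        features = updated
    # Readout: mean-pool node features into a single molecule embedding
    n = len(features)
    dim = len(next(iter(features.values())))
    return [sum(f[i] for f in features.values()) / n for i in range(dim)]

# A made-up 3-atom fragment with 2 bonds and 2-dimensional atom features
feats = {"C1": [1.0, 0.0], "C2": [0.5, 0.5], "O1": [0.0, 1.0]}
bonds = [("C1", "C2"), ("C2", "O1")]
print(message_passing(feats, bonds))  # -> [0.5, 0.5]
```

Real scent-prediction models add learned weight matrices, nonlinearities, and a trained classifier on top of the pooled embedding, but the graph-in, embedding-out structure is the core of the approach.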

In 2023, the global aroma chemicals market was valued at approximately USD 5.53 billion (unclear if this also includes produced chemicals vs. only IP). This market is essentially dominated by a few large players: Givaudan, Firmenich, IFF, and Symrise. All these players are fully integrated, meaning they both develop scent molecules (IP) and manufacture them. It is unclear how much value is in the pure IP, but some tailwinds could favor the emergence of a novel AI-enabled player focused on novel IP. In 2023, a class action lawsuit was filed against major fragrance companies Givaudan, Firmenich, IFF, and Symrise. This followed multinational (Switzerland, Europe) antitrust investigations into suspected price-fixing by these companies initiated earlier the same year. Moreover, there is a marked shift in the industry to focus on sustainable molecules that don’t require scarce resources and have no negative health effects.

AI’s ability to generate molecules that either have novel scent profiles or mimic existing ones without negative externalities (e.g., health effects associated with production) is likely a unique fit for these models. We expect that to create maximum value, such a model (or suite of models) would need to be capable of 1) modeling molecule interactions to create a whole scent, 2) understanding constraints (e.g., toxicity, costs), and 3) assessing the producibility of these molecule sets at scale.

Moreover, we see substantial potential for market expansion. If these AI systems are capable of identifying, mapping, and predicting the behavior of scent molecules, and certain hardware advancements (essentially a chip capable of detecting, analyzing, and recreating scent) are made, several new application areas emerge. These span environmental monitoring, medical diagnostics (where AI can detect disease biomarkers through molecular analysis), and consumer applications such as capturing, reproducing, and sharing scent online. While this is hard to quantify, it is reasonable to assume that there is substantial option value.


Thesis

Space Debris

The number of objects we are sending to space is growing exponentially.

January 2025

Infrastructure Tech

The number of objects we are sending to space is growing exponentially. Thanks to SpaceX, launch costs have fallen 80-90%. While it took nearly 60 years to put 2,000 satellites in orbit, we launched 3,000 in 2023 alone. 100k satellites are expected to launch by 2030, marking a further increase in the complexity of space operations. 

As old space assets deteriorate and more are launched, collisions are inevitable, particularly as a result of space debris. Satellites already make 100,000+ maneuvers per year for collision avoidance. Losses due to collisions in low Earth orbit were estimated at ~$100m in 2020. Since then, we have at least tripled the satellite population.

While responsiveness is improving (e.g. edge GPUs to enable on-board autonomy), hundreds of billions of dollars in assets are and will be exposed without i) precise monitoring, ii) proactive defense systems (beyond trying to move out of the way), and iii) adequate financial risk management (i.e. insurance models). 
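The collision-avoidance maneuvers mentioned above start from conjunction screening. A heavily simplified version, assuming straight-line relative motion over the screening window (real screening propagates full orbits and uncertainty), computes the time and distance of closest approach from relative state vectors:

```python
def miss_distance(rel_pos, rel_vel):
    """Closest-approach distance between two objects, assuming linear
    relative motion -- a toy simplification of conjunction screening.
    rel_pos in meters, rel_vel in m/s."""
    dot = sum(p * v for p, v in zip(rel_pos, rel_vel))
    v2 = sum(v * v for v in rel_vel)
    # Time of closest approach; clamp to "now" if objects are separating
    t_star = max(0.0, -dot / v2) if v2 > 0 else 0.0
    closest = [p + t_star * v for p, v in zip(rel_pos, rel_vel)]
    return sum(c * c for c in closest) ** 0.5

# Debris 10 km ahead, closing at 100 m/s, with a 200 m lateral offset
print(round(miss_distance((10_000.0, 200.0, 0.0), (-100.0, 0.0, 0.0))))  # -> 200
```

When the predicted miss distance falls below an operator’s threshold (often on the order of hundreds of meters to kilometers, depending on uncertainty), a maneuver is planned; the sheer volume of such screenings is what drives the 100,000+ annual maneuvers cited above.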

While it is easy to forget amid the current space market momentum, the industry is still walking a fine line – something that seems to have motivated Elon’s all-in effort behind Donald Trump’s election. As the nuclear industry has demonstrated, the public perception pendulum in highly sensitive industries can swing toward the negative (for decades) with only a small number of high-profile failures. Space debris is the type of problem that, left unchecked, poses a Three Mile Island-style risk for the industry. And, like most “waste”-related problems, there is often not a strong enough incentive for any single actor to solve it until it is too late.

The fragmented incentives and control mechanisms in addressing space debris are evident in the current regulatory frameworks. 

The United States is a patchwork of policies, with agencies like the FAA, FCC, and NASA each taking different approaches to limiting or removing space waste. Europe’s approach has been more comprehensive with the European Space Agency (ESA) developing the "Zero Debris Charter," aiming to prevent the generation of new orbital debris by 2030. As of October 2024, 110 countries or entities have joined the charter, and discussions are ongoing with major satellite operators for participation. 

Despite these initiatives, the absence of a binding international legal framework leads to a "tragedy of the commons" scenario, where individual actors may lack sufficient incentives to invest in debris mitigation (opting instead to accelerate commercial advances amid increasing competition), resulting in increased collective risk.

International cooperation around debris is also threatened by geopolitical posturing. Without better visibility and defense mechanisms, nation-states will always have plausible deniability around the destruction of important satellites and space infrastructure (“it wasn’t us, it was debris”). Since even 1mm fragments of debris can disable a satellite entirely, this is not too much of a logical leap.

We believe that solving the problem of space debris creates an interesting wedge for companies to eventually become critical infrastructure for space security and risk management.


Thesis

AI-enabled Business Planning

Giving organizations, on steroids, what Cisco called its ‘virtual close’ advantage more than 20 years ago.

January 2025

Industry Transformation

The generative AI era presents an interesting paradox: strong confidence in the directional arrow of technological progress (the ever-expanding and evolving LLM blast radius) coupled with significant uncertainty around the macro- and microeconomic implications. Acknowledging this uncertainty, we expect three things to happen as we move toward a world of increased AI and agent usage by organizations and, possibly, a trend towards “AI as the last employee.”

  1. Data and information will be processed much faster, leading to real-time insights and decision support. 

  2. The metabolic rate of organizations is set to go up as feedback loops from planning to action become faster.

  3. Organizations will face substantially different resource and capital allocation decisions.

All of the above requires an orchestration, planning, and decision layer purpose-built for and enabling these changing dynamics. As a result, we see an opportunity to build an AI-enabled business planning platform with substantial optionality to become an integral part of the roll-out and management of increasingly powerful AI systems – giving organizations, on steroids, what Cisco called its ‘virtual close’ advantage more than 20 years ago.



Thesis

Data-Driven Infrastructure Management

There is an opportunity for new players to emerge at the intersection of two historically distinct types of businesses: infrastructure inspection and architecture, engineering, and construction (AEC).

January 2025

Industry Transformation

One of our core theses around AI in the physical world is that novel data generation can drive substantial value creation. Robotics, drones, and sensors used for inspection fit right in: providing customers with high-value (and revenue-generating) inspection services enables unique data collection at scale. As a result, we believe there is an opportunity for new players to emerge at the intersection of two historically distinct types of businesses: infrastructure inspection and architecture, engineering, and construction (AEC). The inspection business generates the data that enables high-value AI-enabled services in the design, construction, and maintenance phases of a project.

We are interested in investing in companies that have found a unique wedge into the market to build large sets of novel and proprietary data that enable a flywheel of higher-quality services. We believe that the category leader in this space can create an agnostic platform compatible with different robot types from various manufacturers to deliver an increasing range of such services without needing hardware development. 

More effectively managing critical infrastructure assets through technology-enabled inspection, dynamic monitoring, and proactive intervention represents a crucial lever in stabilizing risks presented by emerging security, climate, and energy challenges, promoting public health and safety, and driving more effective capital allocation across the public and private sectors. 

Every four years, the American Society of Civil Engineers (ASCE) releases the Report Card for America’s Infrastructure. It details the condition and performance of the nation’s infrastructure. Its most recent report, released in 2021, gave the United States a C- grade, highlighting a widening investment gap that the ASCE estimates will cost each American $3,300 per year by 2039 (USD 90B+ total). In the years since the report, pressure has increased thanks to challenges imposed by extreme weather events, substantial changes in the global energy mix, and an increasingly tenuous national security situation.

Private infrastructure, from energy plants to commercial shipping, is fighting against the challenges and economic losses associated with system outages. For example, a study by the Lawrence Berkeley National Laboratory and the U.S. Department of Energy estimated that power interruptions cost the U.S. economy about $44 billion annually.

Solving these problems at scale requires moving away from manual inspection and towards more scalable technology-enabled approaches. These are substantially safer and generate dramatically more data that can serve as the foundation for appreciably higher-quality decisions.

At the same time, public and private asset owners are starting to realize that inspection and data collection ideally begin at the outset of large projects and during construction. That way, decisions can be optimized, mistakes can be identified, and owners have a digital foundation for future inspections.



Thesis

Unlocking Tacit Knowledge Through Constellations of Experts

The relationship between individual and organizational performance has historically been governed by management frameworks – from Alfred Sloan's GM to Andy Grove's creation of modern OKRs at Intel.

December 2024

The relationship between individual and organizational performance has historically been governed by management frameworks – from Alfred Sloan's GM to Andy Grove's creation of modern OKRs at Intel. These systems attempted to solve the challenge of measuring, improving, and scaling human potential across an enterprise. Yet they remained constrained by the limits of human observation and the difficulty of capturing tacit knowledge – the intuitive expertise that defines mastery of a task but has, thus far, mostly resisted codification.

Over the last 20 years, "game tape" and statistical sophistication have revolutionized athletics (and other highly quantifiable professions like enterprise software sales) by enabling precise feedback loops, accountability, and recognition. AI is now driving a similar transformation of the broader professional universe where the relationship between inputs and outputs is often harder to grasp. Professionals have always valued mentorship and coaching. But access has historically been limited by cost and scale (hence “executive” rather than “employee” coaching). AI promises to democratize this type of performance enhancement (and an organization’s ability to measure it) in the same way that companies like Synthesis address Bloom's Two Sigma problem in education.

Our hypothesis is that “constellations of (AI) experts” – deployed across every facet of professional development and organizational performance – will become as fundamental to career success as mentors and coaches are to elite athletes today. Several converging catalysts are making this possible. 

  • The mass market deployment of co-pilots and proto-agents has rapidly normalized AI-human collaboration. More than 60% of physicians now use LLMs to check drug interactions and support diagnosis – early evidence of adoption for high-leverage decision support. 47% of Gen Z employees say ChatGPT gives better career advice than their boss – signaling dissatisfaction among young workers with the status quo.

  • The proliferation of audio/video capture in professional settings generates rich data to help these systems better understand and improve performance. People increasingly operate with the assumption that every call is recorded, while younger employees regularly go viral for sharing layoff videos online. 

  • The economics of AI are reshaping both organizational structures and individual incentives. Companies are shifting from fixed to variable cost models, flexing labor (human and agent) up and down based on demand.  This, in turn, is shifting how workers are measured and compensated. As a result, professionals must proactively adapt to succeed in this new paradigm where human judgment and AI capabilities become increasingly intertwined.

We see several areas where the “constellations of AI experts” will be professionally valuable. In each of these categories, we expect the most successful platforms will combine automated interactions, human experts in the loop, and content/validation that come together to create holistic systems of improvement. 

  • Organization-wide solutions that integrate deeply with company context to provide AI-powered coaching and feedback loops. While employees have shown a willingness to trade privacy for better tools, trust and security guardrails are essential. 

  • Individual-focused platforms that grow with professionals throughout their careers, combining performance enhancement with credential creation in an increasingly fluid labor market. 

  • Solutions for high-turnover industries that capture and distribute best practices to improve training efficiency and retention (e.g. frontline audio-first interfaces).

  • SMB owner enablement systems in areas like the skilled trades and family medicine, to make it possible to i) capture and transmit tacit knowledge (streamlining hiring/training while increasing terminal value) and ii) help operators compete without needing to bring in expensive consultants or PE expertise

These are, to be clear, highly divergent use cases that necessitate different product philosophies, business models, and competencies from the companies building solutions. However, they share important characteristics, namely that they all represent opportunities to use AI and better data to make professional tacit knowledge, action, and context visible and measurable, unlocking precise intervention to help individuals (and by extension teams and companies) grow into their potential. 


Thesis

AI-Enabled Asset Ownership

When to sell to incumbents vs. when to compete.

November 2024

Industry Transformation

For companies deploying AI in legacy industries, a key question is whether to enable incumbents by selling them solutions or compete with them by taking a more full-stack approach. The trade-offs between these two models are something we have started to explore through our AI-enabled services analysis and this piece on when to compete with and when to sell to incumbents in an industry.

Recently, several firms have shared public theses on the opportunity for emerging AI companies (or vertical market software companies) to capture additional value in a given value chain by fully integrating via the acquisition of assets – as opposed to selling solutions to incumbents or taking a more organic (build instead of buy) approach to going full stack.

Slow, which helped finance the $1.6b acquisition of parking operator SP+ by software company Metropolis, calls the model “Growth Buyouts”. Equal Ventures, which recently opined on the opportunity for such a model in insurance, calls it “tech-enabled consolidation”. Vertical market software investor Tidemark calls the approach “tech-enabled vertical roll-ups”. Re:Build Manufacturing calls its technology-driven manufacturing roll-up model an “American Keiretsu”.

Our current hypothesis is that while the AI-enabled acquisition of services businesses (with venture dollars) may not be wise, there is a significant opportunity for emerging AI, software, and robotics companies to capture more value and develop value chain control by acquiring legacy assets in physical industries. 

For decades, venture capitalists have been engaged in what seems like a Sisyphean task: digitizing businesses that operate assets in the real world. These efforts have stalled for many reasons, from software shortcomings to incentive problems, structural challenges in GTM, and a lack of skilled labor on the customer side. We see a trend of novel ML models solving the first of these problems by being able to operate assets end-to-end without much human input. Yet the latter challenges remain. AI-native companies addressing these problems are therefore prone to leave value on the table and, due to slow adoption, are likely slower at training and developing their models, forgoing a lot of additional value. AI-enabled asset ownership represents one path to capture it.

Sequence matters for companies that go down this path. Companies should prove they can build technology and deliver ROI (for early customers or via a smaller-scale organic full-stack approach) before embarking on buying distribution via M&A. The only cases where early M&A can be attractive are those where smaller targets that are structurally very similar to large-scale targets in the market can be acquired for less than the cost of traditional GTM. Initially, these businesses have venture risk profiles; only after the second or third large acquisition should they be derisked and become predictable/repeatable enough for investors with a lower cost of capital – infra, PE, etc. – to consider participating. By the time companies reach this point, venture investors will have seen highly attractive returns.

Initial Hypothesis on Key Conditions

Below is an initial hypothesis for when it makes sense for a company to vertically integrate via acquisition as opposed to doing so organically or remaining a software/technology vendor to a given industry:

  • The company must have a demonstrated “production advantage”; i.e., a clear technological or product edge that creates compounding value in an industry. Companies leveraging exclusively off-the-shelf technology likely lack the differentiation to deliver venture-scale outcomes even with strong operational execution and financial engineering. If a PE fund working with Accenture can go after an opportunity, or if human labor is cheaper on an efficiency-adjusted basis, it is unlikely to be a VC case. If solving the problems requires a combination of world-class technologists AND operators, this becomes an interesting opportunity for venture-style risks and outcomes.

  • Customers have proven structurally unable to adopt and deploy a company’s solution to its most productive extent, or seem unwilling to pay for its full value. This can happen for various reasons, from lack of scale to incentives that lead to stasis in a given market (“if my competitor doesn’t innovate, I don’t have to”). We should be able to pinpoint a structural issue with a given market/customer base – and generally point to evidence from a company’s own experience – to be certain the ineffectiveness is not a product issue. 

  • Building on the previous criterion, companies spearheading the buy-out strategy should be building technology that changes the core way an asset is operated, transforming the economics of the business/industry. These are most likely the areas where existing operators are (somewhat paradoxically) least incentivized to adopt technological disruption. This is what makes the Metropolis acquisition of SP+ such a compelling application of this approach. SP+ has 3,000+ parking locations around the world where the core customer experience (paying for parking) can be largely automated. While the “work around the work” (maintenance, security, etc.) still requires people, the ROI around the primary transaction is much easier to understand than in situations where the AI solution is helping people deliver the primary solution more efficiently (e.g., home services, legal services, etc.). 

  • There is likely a sweet spot in the level of complexity involved in operating an asset that makes it a fit for AI-enabled acquisition. Complexity can stem from the core value proposition itself, from several offerings being performed at the same asset (compounding complexity), or from the “work around the work” being significant (e.g., for regulatory reasons). Too little complexity at the core value proposition makes it a PE case; too much, and the operational overhead reduces the leverage associated with improving the margins of the core asset. Ideally, the complexity/problems across holdings within the same space should be the same (e.g., parking lots), with skills easily transferable. We should be able to pinpoint these levels of complexity and identify assets/problems that meet the sweet spot. 

  • The category a company operates in needs to have acquisition targets operating at scale (ideally businesses worth USD 1B+ with additional value creation in the several hundred millions – further analysis needed). Buying assets operating at scale that can be fully optimized and automated via AI is substantially more attractive than rolling up locally-delivered services businesses. Again, this is what makes the SP+ acquisition so attractive: SP+ has 3,000+ parking locations around the world that are likely all run very similarly. Ideally, solutions deliver not only cost savings but also growth opportunities. We are also interested in companies with a view on how the integration of software and legacy assets will unlock increasing ecosystem control and turn the business into an industry operating system. 

  • Companies must have advantaged access to talent across functions. It is rare for a founder or founding team to understand “what great looks like” in areas where they have not had direct experience. A team of software engineers is likely unfamiliar with what makes a great industrial CFO or service-business COO. As a result, we may expect the pool of founders well-equipped to build such a business to be small. We have seen this play out at companies like KoBold Metals, which combine highly scientific founding teams with business acumen. 

These criteria still don’t fully answer why/when it is better to grow a full-stack solution via acquisition rather than through a more organic approach. One primary reason a company would choose to grow via acquisition is if the geographic footprint and surrounding “infrastructure” of an industry will look largely the same in the future as it does today. In such cases, the speed of distribution created by acquisition is enough of an advantage to overcome the accompanying cultural and organizational complexity that a more organic strategy could mitigate.

To use the Metropolis example, should we expect the footprint of the parking industry to be more or less the same in 10 years as it is today? While autonomous vehicles may make some impact on the margin during that time period, the inertia of the built environment probably means we should expect the flow of traffic and parking to remain relatively the same (airports, stadiums, commercial centers, etc.). 

A counter-example is H2 Green Steel, which has raised multiple billions of dollars to produce steel with 95% lower emissions than traditional steelmaking. Because the company’s steel production depends on access to ample clean energy, it couldn’t simply acquire and transform underperforming steel facilities, despite the similarity in equipment, permitting, and human capital needs. To transform the industry around its vision, the company was forced to pursue a more organic approach. 

Companies also might pursue a buy instead of build strategy when the technology can be easily integrated with existing assets and infrastructure, substantially reducing time to value for a given solution. 

There are likely several other criteria in support of (and against) the strategy of vertically integrating via acquisition which need to be explored in further depth. 

July

Thesis

European Public Safety Primes

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century.

November 2024

Industry Transformation

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. But the threat to Europe’s way of life and future prosperity goes beyond front-line kinetic conflict. 

As the threat environment converges, the “why now” case for companies building for public safety and security in Europe, for Europe, gets stronger by the day. Migration pressures, cyber threats, and cross-border crime require capabilities that existing systems simply cannot deliver. Europe must invest more proactively in innovation across the other public safety and security pillars: law enforcement, fire and disaster response, and intelligence.

Across markets, AI is driving a step change in our ability to understand and act upon the physical world. The convergence of AI with real-world data – cameras, drones, satellite imagery, and other sensory inputs – makes it possible to build an intelligence layer that processes complexity at an unprecedented scale. This is opening new frontiers in public safety and security. Companies that can harness, integrate, and analyze this explosion of data to drive faster, safer, and more accurate decisions stand to become category champions and play a key role in forming the foundations for Europe’s long-term growth and strategic autonomy. 

Across the world, advanced policing systems are delivering for forward-thinking law enforcement and border control agencies. Solutions like Flock help solve over 700,000 crimes annually, making departments more efficient, while drones drive faster and safer responses. As resistance to US technology persists, expanding EU internal security budgets and mounting evidence that these systems work will push Europe to seek out homegrown solutions.  

Fire and disaster response – helping to mitigate the €77b in annual losses from natural disasters and protect human livelihoods – represents another critical opportunity area. New entrants combining predictive modeling of weather and the built environment with proactive intervention capabilities will capture value by closing Europe's critical response gaps.

Finally, intelligence agencies are approaching a breaking point: drowning in multi-modal data (from video to financial transactions) while inter-agency coordination fails. Companies that bridge European fragmentation while navigating privacy mandates will become essential infrastructure, enabling real-time response to physical and digital threats.

We see an opportunity for a new class of European "Public Safety Primes" to establish themselves in the market. The Axon playbook – now a $45b+ company built through methodical expansion from tasers to body cameras to a comprehensive digital evidence platform – shows what's possible. The company has effectively zero penetration in Europe, and local players like Reveal Media and Zepcam remain subscale. Winners will start narrow with a must-have product, earn trust through measurable impact, and expand across the public safety landscape as system-wide advantages compound.

July

Thesis

Composable Companies

Composable companies fuse enduring core expertise with agile, mission-focused teams to rapidly capture diverse market opportunities and redefine traditional business models.

November 2024

Infrastructure Tech

A new organizational model is emerging: the composable company – organizations that blend permanent infrastructure with fluid product capabilities. At their core, they maintain:

  • Capital and resource allocation expertise

  • Shared technical infrastructure

  • World-class talent

  • Possibly, strategic customer and distribution relationships

By centralizing these unique capabilities, composable companies can swiftly identify, validate, and scale opportunities across their chosen markets. Around this foundation, teams can be rapidly assembled and reconfigured to pursue specific missions/product opportunities on various time scales.

This model excels in markets where opportunity spaces are in flux and an organization needs the flexibility and bandwidth to build out ideas that compound together around a coherent view of the future but might find their manifestation in distinct products for distinct customers.

Recent developments in AI further enhance this model's viability by enabling more cost-effective creation of software and supporting customization for specific use cases:

  • Reducing software development costs

  • Streamlining maintenance requirements

  • Improving customer support efficiency

  • Enabling more cost-effective creation of AI tools

The Resulting Structure

The end product could be a holding company-style enterprise that combines:

  • The above-described core infrastructure

  • Multiple AI products and tools with varying scale and durability

This structure enables the efficient pursuit of numerous opportunities while maintaining the potential for further asymmetric returns from breakthrough successes among them or in aggregate.


July

Thesis

Marketplaces for AI-Enabled Services

AI-powered, asset-light platforms now empower creators and knowledge workers to build profitable one-person companies that disrupt traditional firms and democratize professional services.

October 2024

Infrastructure Tech

The Rise of One-Person Companies

The unbundling of the firm has been in flight for decades. As the internet enabled increased access to global labor markets, outsourcing to lower-cost countries exploded. The proliferation of cloud computing and mobile took this a step further, making it possible to externalize an increasing number of key operational functions and allowing for more asset-light business models. This prompted a thesis several years ago that the rise of “One-Person Companies” remained an underrated opportunity. 

The next step in the evolution of the firm will build on this but will come at the problem from a different direction. It will be defined by the rise of One-Person Companies. Creators and knowledge workers will access external services that provide the capabilities to start and scale a firm and then re-bundle them in unique ways around their core skill set. They will monetize by selling products, services, and expertise to an owned audience that their core skill set has helped them build.

New platforms and infrastructure providers will emerge to support the tens of millions of individuals capable of building successful One-Person Companies, along with the billions of consumers and businesses that will support them. More generally, the rise of One-Person Companies will inject dynamism into the broader economy and will play a role in driving more inclusive innovation.

AI – particularly agentic solutions capable of proactively understanding and executing end-to-end business workflows – represents the next leap in this evolution. As several investors and operators have observed, AI is empowering small groups more than ever before, and new businesses across the economy (i.e., not just tech startups) are building from inception with AI literacy as a core competency. According to Gusto, roughly a fifth of businesses created last year said they were using generative AI to more efficiently carry out tasks like market research, contract reviews, bookkeeping, and job postings.

Current and Future Market Structure

In complex, non-commodity service categories like strategy consulting, law, investment banking, and wealth management – where key individuals inside of large companies often already “run their own book” – we believe these forces create the opportunity for further fragmentation; i.e., the “creator economy-ization” of professional services.

A study cited in a 2015 Forbes article about million-dollar solo consulting businesses indicates this opportunity is not new. 

The U.S. Census Bureau found that of 3.2 million "nonemployer" professional services businesses in the U.S., there were 7,276 firms that brought in $1 million to $2.49 million in revenue in 2013, the most recent year for which statistics were available. And 321 superstars brought in $2.5 million to $4.99 million.

For the sake of simplicity throughout the document, we will refer to these companies as Service OPCs, though there is of course no reason why it must be a single person.

In practical terms, we believe we are entering a period where an even larger number of individuals or small teams with a differentiated skill set or value creation mechanism (product) will increasingly be able to leverage the marketplace (instead of “the firm”) for distribution and operational capacity to build profitable and durable OPCs.

This thesis rests largely on the idea that some elements of human judgment are inherently non-scalable/automatable (similar to our thesis around where value is captured in AI-driven content creation), and thus that the dynamics of the market will tend more towards specialization – thousands of small, profitable “winners” – rather than winner-take-all. 

A services Mittelstand rather than Silicon Valley concentration.

We are interested in what the agentic technologies that drive end-to-end workflow execution will look like, and what the coordination mechanism across those autonomous services will be for Service OPCs. Unless both become a reality in parallel, the overhead of identifying and managing dozens of AI agents (some inherently more end-to-end than others) – while growing a client base and playing the most important role of delivering the differentiated service (even if some elements are made more efficient through automation) – is likely enough to push promising OPCs back into the arms of McKinsey or Kirkland & Ellis.

Effectively, we believe there is a Shopify-like opportunity to “arm the rebels” and build an ecosystem-level operating system for the AI-driven services transformation – combatting empire-building incumbents who view AI as a solidifier of their market positioning and what are sure to be dozens of overfunded venture-backed businesses promising to be “the next Goldman Sachs”.

Product and Technical Hypothesis

By engaging at the aggregation and coordination level, we are interested in answering the question of how a platform might “package” a business around an OPC’s core skill set to help it grow beyond its pre-AI agent potential. 

While we want to avoid being overly prescriptive in our analysis at such an early stage, we believe that for such a platform to represent a comprehensive – and attractive – alternative to the firm for Professional Service OPCs, it would possess some or all of the following characteristics (features), listed roughly in order of how they might be sequenced from a product perspective:

1. Functional Automation (Operational Capacity) – This pillar would serve as an "Agent Store," featuring both proprietary workflows and third-party end-to-end AI agent solutions. It would offer OPCs end-to-end functional agents for various business operations, such as:

  • Contract management

  • Financial management and forecasting

  • Compliance and risk assessment

  • Resource allocation and project management

  • Continuous learning and skill development

  • Marketing and public relations

  • Legal execution

It is also interesting to consider how such a store could provide a distribution channel for third-party developers of specialized AI solutions like Devin (for software development) or Harvey (for legal services), or the seemingly dozens of AI agent companies launching each week (a quick scan of the most recent YC class highlights how prevalent this model has become for early-stage companies). 

These developers would be incentivized to use the platform because it goes beyond simply offering access to agents to helping OPCs “package” a specific set of agents around the skills and ambitions of the company – which brings us to the next pillar of the platform. 

2. Organizational Coordination (The AI COO) – The AI COO acts as the central nervous system of the OPC, ensuring all parts of the business work together seamlessly. Key functionalities include:

  • Automated integration between functional agents (the Bezos API Mandate on overdrive)

  • Workflow optimization across all business functions

  • Stakeholder communication management

  • Strategic decision support

  • Continuous improvement engine for business processes (i.e. vetting and implementing improved solutions or workflows autonomously). 

This pillar is critical in attracting and retaining both OPCs and third-party AI solution providers. For OPCs, it offers unprecedented operational efficiency and is the core enabler of leaving the firm behind for good. For AI solution developers, it ensures their tools are integrated effectively into the OPC's operations, maximizing utility and, long-term revenue potential.

With these pillars working together, such a platform aims to create a robust ecosystem that not only supports individual OPCs but also fosters a network of AI solution providers. This symbiotic relationship between OPCs, the platform, and AI developers has the potential to drive rapid innovation cycles and expand the market in the same way Shopify has done in e-commerce for physical goods. 

Antithesis

While we have a reasonable degree of confidence that the end state of the unbundling of the firm will look something like what we have laid out above (“Shopify for professional services” is likely a primitive analogy for what we will have in 2050), there are several reasons to be wary of the thesis. Much of this hinges on market timing as well as the common question of whether this will enable truly novel business models to emerge that incumbents are structurally unable to compete with.

  • We may be underestimating incumbent entrenchment – particularly around trust and social signaling – and incumbents’ ability to adapt. “Nobody got fired for hiring McKinsey, Goldman, etc.” While not yet apparent at the operational level, incumbent consulting firms have been among the biggest beneficiaries of the generative AI explosion. 

  • Regulatory, compliance, and legal structures may change more slowly than the technology. Sectors like law and finance are heavily regulated. OPCs might face significant hurdles in meeting compliance requirements without the resources and infrastructure of larger firms, potentially limiting their ability to operate in certain (high-value) areas.

  • Integration (i.e., the AI COO) may be substantially more complex than we have described. The reality of seamlessly integrating multiple complex AI systems could be far more challenging and error-prone than expected, leading to inefficiencies or significant mistakes.

July

Thesis

Benchmarking the Physical World

Standards are the hidden force behind market efficiency, capital formation, and global trade.

October 2024

Infrastructure Tech

Standards are the hidden force behind market efficiency, capital formation, and global trade. From the meter to the shipping container, standards create the shared layer of trust that helps markets function and capital flow.

In 1860, at the outset of America’s railroad frenzy, Henry Varnum Poor published “History of Railroads and Canals in the United States”. This work was the first attempt to arm investors with data on the burgeoning industry and laid the foundations for what is now Standard & Poor’s — a $100b+ company with $11b in annual revenue. Alongside its long-lived triopoly counterparts, Moody’s and Fitch, it has persisted thanks to powerful standard-based moats that make these frameworks critical infrastructure for global capital markets.

“We think of ourselves as a benchmark company. I mean, data is in vogue now, and people are really kind of a bit obsessed with data and data companies… I think data is nice, it’s interesting. But if you could turn something into a benchmark, it really transcends data.”
SVP at S&P Global, November 2020

As Marc Rubinstein wrote in “The Business of Benchmarking”, universal standards are usually unassailable. The risk for companies that manufacture them is less that their moat is crossed and more that their castle becomes irrelevant. We believe the current geopolitical, economic, and technological moment is creating a once-in-a-generation opportunity to successfully counterposition and eventually (finally!) displace the global ratings and benchmarking oligopoly.

Several forces are converging to create this opportunity. First, Great Power Competition is fundamentally reshaping global trade and industrial activity. The push for energy independence, secure supply chains, and strategic autonomy is driving massive investments in decarbonization and reindustrialization. Reconfigured trade flows and industrial priorities demand new frameworks for understanding risks and opportunities. Second, the growth of sensor networks, connected devices, and geospatial systems has created unprecedented visibility into physical world operations and trade flows. This proliferation of data – from factory floors to shipping lanes – provides granular, real-time insights that were previously impossible to capture. Finally, advances in AI and machine learning allow us to process and derive meaning from complex, multi-modal data at the scale and speed demanded by modern trade. 

We've seen the fundamental transformation of key commodity markets firsthand through our investment in KoBold Metals. Better collection and analysis of physical world data is revolutionizing resource discovery and development. Meanwhile, geopolitical machinations are accelerating the reconfiguration of global supply chains and trade routes, creating urgent demand for new frameworks to understand and price physical world assets. Traditional frameworks – built for a different era of global trade – are increasingly misaligned with markets that require real-time, granular insights to make decisions.

Success in this market isn't about attacking the incumbent oligopoly directly. Through counterpositioning, the opportunity lies in building for the new industrial economy with a model native to the speed and complexity of modern trade. Winners will start narrow, building density of data and trust in specific verticals, before sequencing alongside their customers' evolving needs to develop new pricing and risk infrastructure for the physical economy.

July

Thesis

Nuclear Supply Chain

Technology deployment cycles – from the railroads to the internet – have long been characterized by dramatic boom and bust cycles. The nuclear cycle is picking up pace.

September 2024

Industry Transformation

Working Thesis

Technology deployment cycles – from the railroads to the internet – have long been characterized by dramatic boom and bust cycles. And while overreliance on analogies is dangerous, these historical precedents provide one lens through which we can contextualize the current moment. 

Today, the AI boom is drawing parallels to the 1990s dot-com and telecom bubbles, with massive capital expenditures, promises of exponential growth, and early productivity gains. On the horizon, the potential of general-purpose robotics even resembles the iPhone-driven mobile revolution that followed almost a decade after the dot-com bust. 

But the differences between the two eras are equally striking. Today, incumbent technology companies possess more structural power over the ecosystem than 30 years ago, suggesting perhaps less overall volatility at the expense of dynamism – i.e., the Magnificent Seven serve as “hysteria dampeners” thanks to massive balance sheets and distribution networks. And while opportunism in AI is certainly present, the technical barriers to entry for building a competitive foundation model (and the pace of customer feedback) are substantially higher than for building an ISP during the dot-com frenzy.

However, the most consequential difference between the two eras may be the central role of energy – namely the re-emergence of nuclear power – in today's AI boom, particularly with the backdrop of rising great power competition and the ever-present specter of climate change.

Unlike the telecom infrastructure of the dot-com period (and data centers in today's cycle), which serve singular purposes, the expansion of nuclear infrastructure addresses multiple critical challenges. First, it promises to play a significant role in solving for the energy intensity and reliability demands of AI data centers. This is a problem we are looking at from several angles – nuclear and other forms of production, efficiency (via our industrial action layer research), and finally via an exploration of better distribution and grid resilience technologies.  

Beyond serving AI data centers, nuclear power (along with the other categories highlighted) meets the vast need for clean baseload power to accelerate decarbonization and supports the push for increased energy security amid heightened geopolitical risk. 

As a result, nuclear’s positioning as a one-size-fits-all solution to many of our most pressing concerns – and thus its (theoretical) resilience to the fortunes of any single macro factor – makes it an attractive “picks and shovels” opportunity perfectly in line with the three major economic supercycles of the next several decades (AI, climate, and national security) – provided the current momentum can overcome generations of cultural and political baggage.

This baggage is complex – equal parts social, economic, and regulatory – with each reinforcing the other in a vicious cycle. High-profile accidents and proliferation risk have dominated the social consciousness for 40+ years. This, in turn, influences regulation, which increases the red tape related to the development and deployment of safer, more effective modern solutions. As process knowledge is lost and talent leaves the industry, costs spiral even higher, and we are left with the current state of affairs. 

Despite nuclear’s history as a safe, clean, and cost-effective technology, its costs have jumped 33% while the cost of new solar generation has plummeted by over 90%. The latter is, to be clear, a positive – we are pro-energy abundance more than we are wedded to a single approach – and showcases the power of technological learning rates when unleashed. 

Current and Future Market Structure 

Today, the narrative – and the fundamentals – around advanced nuclear are finally shifting towards the positive across several dimensions.

  • Russia’s full-scale invasion of Ukraine provided a clear focal point for the importance of energy security and the role nuclear energy can play in decoupling from Russia. Between 2021 and 2022, the percentage of Europeans at least somewhat in favor of nuclear energy jumped substantially – in Germany from 29% to 44%, in France from 45% to 55%, and in the UK from 39% to 53%.

  • Energy has become the core bottleneck to scaling AI. While GPU scarcity dominated the preceding couple of years, everyone from Mark Zuckerberg to Sam Altman believes the next substantial step forward requires energy-related breakthroughs. As a result, big tech companies have become the “urgent buyers” necessary to drive market transformation. Microsoft’s actions signal a clear belief in the long-term need to expand nuclear capacity. Its recent 20-year power purchase agreement with Constellation, which will revive the Three Mile Island facility, is as symbolically important as it is economically.   

  • The capital markets are responding swiftly to this step change in demand, with financial institutions including BofA, Morgan Stanley, and Goldman backing a COP28 goal of tripling nuclear capacity by 2050. The commitment to triple capacity also came with support from the US, UK, Japan, Sweden, and the United Arab Emirates.

  • Regulatory support has not been limited to COP commitments. In the United States, for example, President Biden recently signed into law the ADVANCE Act, which aims to streamline licensing, promote R&D investment, and contribute to workforce development.

  • This follows on the heels of (tepid) progress on the deployment front. In the United States, Vogtle 3 and 4 are finally online, years late and billions over budget. Still, the finalized designs, established supply chains, and trained workforce should contribute to less expensive future deployment. This summer, TerraPower began construction on its advanced reactor facility in Wyoming. Meanwhile, SMR certification is building momentum in France as companies like NewCleo and Jimmy Energy look to deploy first-of-a-kind reactors by 2030.  

  • Finally, the characteristics of SMR and AMRs coupled with the aforementioned demand dynamics have ignited venture interest in the space. Smaller form factors that can be deployed more flexibly and with a broader range of financing options have eased some concerns about the venture scalability of such projects. As a result, dozens of companies have been funded in recent years. Today, over 100 SMR and AMR designs are being developed at various stages and with different timelines across the world. 

Key Early Assumptions / Potential Catalysts 

The improved environment around nuclear power leads us to a few critical questions, based on important assumptions about where scarcity and abundance sit in the nuclear value chain.

  • Assumption 1 → The timelines for most advanced nuclear projects are at least 7+ years out, likely longer if history is a guide. This may not align with our time horizon unless we can identify intermediate value inflection steps that create opportunities for exit, similar to the ramp-up seen in life sciences.

  • Assumption 2 → The crowded nature of SMR/AMR technology development (abundant capital and attention at that part of the nuclear value chain) and the lack of a clear set of winners should push us to areas of relative scarcity where solving key bottlenecks can accelerate the market overall (i.e. turning those hundreds of companies into customers immediately).

So where is there scarcity in the market that aligns with our investment horizon and approach? Three areas stand out initially, with each meriting a separate deeper dive should conversations with experts, operators, and founders push us in a certain direction. 

Fuel → Russian (and increasingly Chinese) dominance of key fuels and processing steps risks putting the West in a vulnerable position, echoing the overreliance on cheap gas that contributed to the Ukraine invasion and the European energy crisis. Significant efforts are underway to de-risk the nuclear fuel supply chain. The US Congress passed a bill to limit the import of Russian uranium, while the “Sapporo 5” (Canada, Japan, France, the UK, and the US) announced plans to jointly invest over $4b to boost enrichment and conversion capacity over the next three years. 

The biggest risk in decoupling from Russia has been HALEU (high-assay low-enriched uranium), the fuel many advanced reactors are being designed to run on. Until the Russian invasion of Ukraine, Russia was the only country with a plant licensed to produce this material. Companies like Centrus Energy and a still-under-the-radar Founders Fund-backed startup are targeting this bottleneck, which could be an important enabler of the broader advanced nuclear space. 

Project Development → Over the last two years, much of my work has centered on how best to help “emerging industrial” companies scale more effectively. While my early assumptions largely centered on the need for new financing models, the critical bottleneck to deploying new projects across energy, materials, and manufacturing often turned out to be the capacity for capable on-the-ground project development and execution. Given the deterioration of the nuclear talent base across most Western countries, this problem is even more severe. 

A key problem with effective (i.e. on-time, on-budget) project development is the fragmentation of the subcontractors needed to build a project end-to-end. Some companies are aiming to solve this through a reactor-agnostic platform for nuclear project execution. Through a comprehensive oversight strategy that includes taking direct control over supply chain management, sourcing, workforce coordination, and the financing required to construct power plant fleets, these companies hope to do for nuclear what SpaceX did for launch. Others are building fully modular, factory-made systems, innovating on the delivery and development model rather than at the reactor level. 

Waste → Waste remains perhaps the most politically and socially charged aspect of nuclear energy, leading to decades of information warfare despite the relatively boring nature of the problem. Historically, the commercial incentives to store waste weren’t particularly attractive, making it a largely political endeavor. 

Today, countries and companies around the world are starting to see opportunities to turn waste from a burden into an opportunity through recycling.


Thesis

AI-enabled Services

We see an interesting opportunity in the context of end-to-end AI service providers. 

September 2024

Industry Transformation

We see an interesting opportunity in the context of end-to-end AI service providers. 

We believe that in certain cases, AI sold as a SaaS product can neither unlock its full potential nor allow developers to capture the value they are creating. There are a few reasons for this:

  • The limited reliability and lack of guaranteed perfect performance of AI models have led to their positioning as co-pilots rather than end-to-end task performers. A few use cases aside (e.g., coding), we don’t see such end-to-end task performers coming to market anytime soon. This means that successful deployment depends on adoption by a customer’s workforce. Naturally, this makes the ROI of these systems hard to measure and is paired with a sentiment shift that the productivity increases associated with those systems might have been overhyped. The fact that an intangible ROI is running against a very tangible cost of inference for developers does not make this any easier.

  • In a co-pilot world, breaking out becomes even harder for emerging companies. They have a structural disadvantage over established companies that can easily develop and distribute a co-pilot to their existing customers as part of their platforms. This is especially tragic for emerging companies because they require feedback loops and data to improve their algorithms. Without this, they inevitably fall behind the competition in terms of technical capabilities.

  • Pricing models that work in the context of software (e.g., seat-based) don't work in the context of AI, as the focus is often on productivity gains (i.e., getting more done with fewer seats). Therefore, there is a need for value-based pricing.

As a result, we see an interesting opportunity in the context of end-to-end AI service providers. These companies focus on one specific job and guarantee the successful execution and delivery of that job. Internally, these businesses will utilize AI as much as possible, with high-quality domain experts who can jump in when issues arise to ensure successful job delivery. Over time, these companies accumulate substantial proprietary data from “end-to-end” feedback loops of delivering jobs. This holistic approach puts them in a unique position to develop best-in-class models for a specific use case, leading to increased automation. In the long term, the COGS of these businesses will converge toward the cost of computing.

In a lot of industries, professionals are either already using extremely sticky software or the software as it is offered doesn’t fit in the specific workflows (it is reasonable to assume that capitalism has led to every software niche being explored in the past 15 or so years). As mentioned above, many of the companies that have successfully acquired users at scale are already planning to roll out co-pilots as features of their products. For an AI-native company to build out a software suite and spend substantial amounts on customer acquisition is likely not the best use of resources.

This is circumvented by offering the delivery of an entire job and delivering results that are compatible with the legacy software customers might be using. Over time, these companies might decide to build out or buy software platforms on top of their AI-enabled service, interlocking themselves with their customer's processes and generating meaningful lock-in (Nvidia software stack, Marvell IP portfolio).

Where the conditions for AI-enabled services exist (see criteria below), we see three potential effects on market structure:

  1. Consolidation: Some industries may see a consolidation where AI enables a few large players to dominate by integrating and scaling more effectively than before.

  2. Maintained Concentration: In other industries, concentration may remain the same, but new AI-enabled companies could compete head-to-head with existing large players, potentially reaching similar sizes and profitability.

  3. Fragmentation: Certain industries might experience fragmentation, where AI enables many smaller players to operate independently. This could lead to a need for new platforms or aggregators to organize and leverage these fragmented services.

We think the most interesting venture cases will emerge in the context of 1. consolidation and 2. maintained concentration. In the context of 3, it is interesting to explore the next-order effects (see Marketplace for AI-enabled Services and Products).

Independent of the structural market outcomes, AI-enabled service businesses require specific conditions to emerge and thrive. We differentiate between two types of requirements.

First, necessary conditions must hold for any of these businesses to pursue opportunities. Many of these opportunities, however, may become commoditized, leading to numerous profitable but modestly sized businesses, typically in the $10 million revenue range (i.e., fragmentation).

Therefore, for market outcomes 1. and 2. and venture-scale outcomes to occur, opportunities must have the potential for significant competitive advantages, or "moats." These moat conditions are likely present in only a small subset of AI-enabled opportunities.

Our primary objective is to identify and focus on the most promising categories where both the necessary and moat conditions are met. These categories represent the most attractive opportunities for substantial growth and success in the AI-enabled service sector.

Necessary Conditions

  • Objectifiable Outcomes: Objectifiability in outcomes is crucial to 1) training the models and 2) lowering transaction costs with customers. 

  • Easy to Hand Off from/to customer: Easy hand-off is critical to lower transaction costs and make sure the business can scale without integration work, etc. 

  • Technology Maturity: The technology used to deliver the service needs to be sufficiently mature, or there needs to be a clear path to maturity. In the beginning, human labor supporting service delivery is fine, but there needs to be a clear path to attractive unit economics at close to 90% automation. 

  • Value-based Pricing Possible: The thesis depends on a company's participation in the upside of what is 1) generated at a lower cost or 2) made possible due to a new service. It is critical for the economics that the service provider can sufficiently capture the value generated to ensure top-notch margins that are improving as the technology matures. 

Moat Conditions (at least one or two of them need to be true)

  • Enabling Technology Moats: Using off-the-shelf LLMs will not lead to sustained technology moats. Technology moats will emerge where the quality of the service offered relies on developing and integrating with other types of ML / AI, which will lead to some initial barriers to entry from a technical perspective.

  • Improvements in Technology from Feedback Loops: Building on the previous point, another source of a possible moat is technology that improves through feedback loops as services are delivered to customers and lessons are learned. This means the market leader can improve its technology fastest, leading to winner-takes-most situations. 

  • Sustained Unique Customer Access: Efficient customer acquisition will be key for many of these businesses to deliver top-notch profitability margins in the long term. Categories where certain types of companies have unique access to customers (e.g., due to industry pedigree) are likely to be attractive, especially when paired with the technology feedback-loop condition outlined above. 

  • Liability Underwriting: The assumption is that many of these service businesses will have to take liability for correctly delivering their services. If liability is a crucial aspect of the offering, the amount of risk a company can take is a function of the cash on its balance sheet that can serve as a buffer for potential failures; better-capitalized players can therefore be more aggressive. 

  • Regulatory Moat: Benefit from licensure requirements and other regulatory hurdles, which act as a natural barrier to entry and provide a stamp of trust and credibility. However, it is unclear whether this is actually the case. Lawyers require licenses, but barriers to entry are fairly low and based on academic performance. If the underlying models are similar, won’t everybody or nobody get approved? 

  • Brand / Trust: Services businesses are inherently built on a high degree of trust and brand recognition. This trust and brand let customers know that somebody can be held accountable and that their choice won't be questioned by their bosses or investors (e.g., hiring McK, BCG, or top 4). It is likely that the same dynamics play out here and that this can be a source of sustained competitive advantage.


Thesis

AEC Design Tooling

When will we see Figma for the built world?

February 2025

Industry Transformation

Autodesk is a $65B company with 90% gross margins and earnings growth of 10%+ annually over the past decade. It is, in the views of many practitioners in the ecosystem, a monopoly in the worst sense of the word – extractive price increases paired with degrading product quality, closed and proprietary standards that lock in customers, and a lack of feature-level evolution to meet the needs of architects, engineers, designers, and contractors.

But Autodesk is just a symptom of a deeper problem. The entire AEC technology stack has evolved to reinforce silos rather than break them down. Each specialty has its own tools, workflows, and data formats, creating artificial barriers between disciplines that naturally need to collaborate. The result is an industry that remains extremely inefficient – construction productivity has historically grown at under 1% annually despite billions spent on software.

Perhaps counterintuitively, given the stranglehold that Autodesk (and other deeply ingrained products) holds, our early hypothesis is that focusing on design is the perfect wedge to transform this massive industry. Every project's design phase naturally brings together architects, engineers, contractors, and owners – each making decisions that cascade through the entire project lifecycle. This, in turn, creates the possibility of developing network effects (of the type enjoyed by Autodesk) once at scale.

The question, then, is what creates the wedge for companies to build distribution in the first place and kick off the network effects flywheel – something that has been a challenge for new entrants, as evidenced by the lack of massive venture-backed outcomes to date. We believe several converging technologies are coming together to massively reduce the costs of experimentation, lower the barriers to real-time design collaboration between parties (minimizing the cascading delays that begin at the design phase), and expand the creative canvas of design possibilities.

  • WebGL and cloud-native architectures finally enable true browser-based 3D modeling at scale. Just as Figma used these technologies to make design collaborative, new platforms are rebuilding BIM from first principles for seamless multi-user collaboration.

  • Advances in physics-based simulation and generative AI allow instant validation of design decisions – emerging tools can compress structural engineering workflows from weeks to minutes and automatically optimize building systems for performance.

  • New platforms are bridging design and construction by translating BIM models directly into fabrication instructions, creating the potential to significantly reduce MEP installation costs.

We see three approaches emerging to leverage these technologies and begin embedding them into multi-stakeholder workflows:

  1. Next-gen cloud BIM platforms (e.g., Motif, Arcol): Browser-first collaborative design tools – i.e. "Figma for buildings". Here, we believe companies can build momentum through counter-positioning – API-first models that prioritize open document and data standards.

  2. AI-powered point solutions (e.g., Genia, Qbiq): Focused tools that dramatically accelerate specific workflows. Genia automates structural engineering analysis and optimization, while Qbiq uses AI to generate space planning options for real estate teams.

  3. Design-to-fabrication platforms (e.g., Stratus): Bridging the gap between design and construction by automatically translating BIM models into fabrication-ready instructions. Stratus has shown particular success in MEP, where better coordination can significantly reduce installation costs.

The path to end-to-end orchestration will follow a clear sequence: Start by connecting architects and engineers through real-time design collaboration. Then extend to contractors, automatically translating designs into construction planning. As the platform becomes the system of record for design and planning decisions, it can naturally expand into procurement, payments, and project financing – using its unique data position to reduce risk and unlock better financial products. Eventually, these platforms could have a shot at orchestrating the entire building lifecycle – from initial concept through operations and maintenance.

Most importantly, these platforms will enable fundamental shifts in business models and incentives. Today's hourly billing and fixed-fee structures actually discourage efficiency – architects and engineers are paid for time and deliverables, not outcomes. But platforms that can measure and validate impact could enable new performance-based pricing models. Early adopters might start with simple metrics like design iteration speed or coordination time saved. Over time, as platforms gather more data across the building lifecycle, they could facilitate true outcome-based contracts where designers and engineers share in the value they create through better, faster, more efficient projects.



Thesis

Geospatial Intelligence

The complexity of understanding and managing our physical world is increasing exponentially.

January 2025

Infrastructure Tech

Why is this important?

The complexity of understanding and managing our physical world is increasing exponentially. Climate change creates both acute (e.g. wildfires) and chronic stress on our (aging) physical infrastructure. Supply chains are becoming more intricate and, in the face of geopolitical tensions and the energy transition, are reconfiguring on a global basis in real time. 

Geospatial intelligence – novel physical-world data captured via optical, multi-spectral, hyperspectral, and other advanced sensor systems deployed on satellites, ground stations, and other platforms – represents a critical substrate for building the advanced action layers (e.g. truly comprehensive world models) that will power fundamental industry transformation in areas like mining, energy, agriculture, and defense. 

However, the trajectory of the geospatial intelligence market has largely been a story of massive perceived potential and disappointing results for builders, investors, and customers. While the use cases have been evident for decades, commercial value at scale has been slow to materialize, and the net ROI of most earth observation companies has likely been negative. Adoption has been broad but shallow – few commercial customers spend more than $1m per year on data and value-added services related to geospatial intelligence. Leaders on the upstream data collection part of the value chain (like Airbus and Maxar) still rely on government customers for a majority of their business, while companies like Planet Labs still struggle to project commercial demand from quarter to quarter, indicating a lack of urgency attached to the data and analysis being offered. 

Solving the bottlenecks around geospatial intelligence that have kept deep commercial adoption out of reach – namely expensive data acquisition (for high-fidelity data), fragmented data accessibility, and a lack of connectivity from data to core enterprise/industrial workflows – has substantial implications for economic growth and human flourishing. The World Economic Forum projects that geospatial intelligence, as a platform technology, has the potential to drive more than $700 billion in economic value annually by 2030. A vast majority of this value will be created in critical physical industries – transforming land use, mitigating natural disasters, transforming how we build and maintain infrastructure, reducing greenhouse gasses, and addressing security and safety issues more proactively. 

Why is this interesting? 

We believe these bottlenecks are finally beginning to fall thanks to two converging factors – technological step-changes and the emergence of urgent buyers for the key technological building blocks that will make cheap, precise, and actionable geospatial data possible. 

  • Launch costs have fallen 80-90%, enabling massive sensor deployment. While it took nearly 60 years to put 2,000 satellites in orbit, we launched 3,000 in 2023 alone.

  • Next-generation sensors are achieving unprecedented coverage and precision. Today's systems can detect not just the presence of objects but their composition and behavior from hundreds of kilometers away, at sub-meter resolution.

  • AI and compute advances have collapsed processing times and made it possible for non-specialists to make sense of multi-modal data – what took human analysts years now often takes minutes.

The demand-side pull, while still not fully materialized, is equally important and developing quickly:

  • Insurance companies – and the entire insurance model – face existential pressure from climate-driven catastrophic losses (and regulatory intervention). Beyond risk assessment capabilities, improved, more transparent/accessible tooling can help to rebuild trust in this important segment of the financial system. 

  • Autonomous systems (and with it shorter decision-making windows) are increasingly factoring into defense and intelligence operations, putting a premium on breaking down the current data silos to develop advantaged (precise and real-time) sensemaking capabilities.

  • As we have observed through KoBold, the energy transition is creating entirely new customer segments (and forcing agility from large incumbents) focused on critical mineral discovery, methane detection, and other resource categories like water or forestry. 

  • Infrastructure operators, utilities, and construction firms are scrambling to maintain the trillions of dollars of assets needed to reindustrialize, electrify, and – more critically – simply keep the modern way of life (e.g. clean water) running. Proposed initiatives like The Stargate Project create another major tailwind for the geospatial intelligence market.

These are just a handful of the use cases we have been most exposed to through our investments and research. Like most great platform technologies, though, we believe many of the most valuable applications will be emergent. Thus, as we look at investments in the category, we are most interested in companies positioned to surf, rather than compete with, the dual blast radii of LLMs and Space Launch.

Which areas are most investible? 

Sensor Advantage / Infrastructure → While much of the sensor stack is being commoditized, competition at the world-model level (e.g. Niantic’s LGM) will drive demand for truly differentiated imaging and sensor suites. High-precision, platform-agnostic, high-bandwidth, real-time hyperspectral imaging stands out.

Data Fusion → As launch (and other sub-orbital geospatial sensor deployment) grows exponentially, data generation will scale along with it. If the status quo holds, silos and the need for bespoke solutions will only worsen. There is a Snowflake-scale opportunity to build data warehousing and piping for multi-modal geospatial data.

Geospatial Data as an Industry Transformation Wedge → Similar to Gecko in robotics, we believe the most valuable geospatial companies won’t be thought of as geospatial companies when all is said and done. Instead, we see major opportunities to use geospatial data as a wedge to build the workflows and intelligence engines that transform physical industries.


Thesis

Industrial Energy Efficiency

Energy demand is rising for the first time in over a decade thanks to rapid electrification, reshoring of manufacturing, and perhaps most notably, AI.

January 2025

Industry Transformation

Energy demand is rising for the first time in over a decade thanks to rapid electrification, reshoring of manufacturing, and perhaps most notably, AI. This demand is being driven top-down via policymakers and bottom-up from the private sector. Legislation like the IRA and the CHIPS Act has driven significant growth in new manufacturing construction. Meanwhile, energy constraints have overtaken GPU availability as the core bottleneck to scaling AI for companies like Microsoft, Google, Meta, and OpenAI. 

The willingness of big tech companies to spend whatever is necessary to access energy in the name of AI has led to amusing re-estimations of future data center energy demand every few months.

“[Our expectation of] 83GW is up from ~56GW from the prior September 2023 modeling. Overall McKinsey now forecasts US data center energy consumption in terawatt hours (TWh), rising to 606TWh in 2030, representing 12% of total US power demand. Critically, this is up from ~400TWh in the September 2023 modeling refresh. This is relative to 147TWh in 2023 and 4% of overall US power demand.”

Meeting this energy demand, whether in service of climate objectives or of geopolitical, energy, and technological sovereignty priorities, is of existential concern to economies around the world. As the saying goes, there is no such thing as an energy-poor rich country. Europe, in a trend that has continued since Russia invaded Ukraine, continues to struggle to meet the energy-related needs of its industrial champions. This has pushed them in droves to the US and other geographies, putting the continent’s competitiveness, productivity, and growth at risk. 

Energy abundance generally and response to data center demand specifically hinges on three important pillars: new power production, better transmission and distribution, and more efficient utilization.

As highlighted in other research, owning and operating physical assets can provide companies with a tremendous moat and allow them to capture more of the value they create. For this reason, companies focused on new power generation or physical infrastructure related to better transmission and distribution are interesting. However, such opportunities are often held back by factors like permitting that are outside their immediate control. 

Efficiency, on the other hand, is a problem best addressed by software and AI. This is particularly true for commercial and industrial buildings, which account for ~20% of final energy use (and rising, thanks to the growth factors highlighted above). In some places, like Ireland, data center use alone is projected to consume nearly one-third of grid capacity in the near future. As energy costs become a more substantial profitability factor and increased consumption puts pressure on sustainability objectives, better solutions for commercial and industrial energy efficiency represent one of the biggest opportunities of the next several decades.

Many of these operations have concrete optimization functions with goals and constraints. However, in many cases the complexity involved is too great for humans to grasp, so we fail to set up appropriate optimization functions and the systems around them, leaving operations far from their global optima. That is where we see massive opportunities for reinforcement learning. Advanced RL makes it possible to optimize systems that were previously infeasible to address due to their complexity. 

Managing the energy usage of highly energy-intensive operations (e.g., data centers, cooling facilities, etc.) fits these criteria. RL models are capable of driving significant performance improvements autonomously, saving substantial energy and cost. Phaidra, one company applying these models, was started by former Google employees who deployed these methodologies in Google's data centers and saw up to 40% energy savings. The company recently announced it could drive energy savings of 16% at Pfizer’s data centers. Meta has published similar efforts. 
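As a toy illustration of why formalizing such an optimization objective matters, the sketch below uses an epsilon-greedy bandit – one of the simplest reinforcement-learning methods – to learn which cooling setpoint minimizes a simulated energy-cost curve. The candidate setpoints, the cost function, and every number here are invented for illustration; real systems like those described above operate on far richer state and control spaces.

```python
import random

# Toy illustration, not a production controller: an epsilon-greedy bandit
# that learns which cooling setpoint minimizes energy cost for a simulated
# facility. All setpoints and cost figures are made up for this sketch.

SETPOINTS = [18.0, 20.0, 22.0, 24.0]  # hypothetical setpoints in degrees C

def energy_cost(setpoint: float) -> float:
    """Hypothetical noisy cost curve with its minimum near 22 degrees."""
    return (setpoint - 22.0) ** 2 + random.gauss(0, 0.5)

def learn_best_setpoint(steps: int = 5000, epsilon: float = 0.1) -> float:
    totals = {s: 0.0 for s in SETPOINTS}  # cumulative observed cost per arm
    counts = {s: 0 for s in SETPOINTS}    # times each setpoint was tried
    for _ in range(steps):
        if random.random() < epsilon or not all(counts.values()):
            s = random.choice(SETPOINTS)  # explore (or warm up untried arms)
        else:
            # exploit: pick the setpoint with the lowest average cost so far
            s = min(SETPOINTS, key=lambda x: totals[x] / counts[x])
        totals[s] += energy_cost(s)
        counts[s] += 1
    return min(SETPOINTS, key=lambda s: totals[s] / counts[s])

if __name__ == "__main__":
    random.seed(0)
    print(learn_best_setpoint())
```

The point of the sketch is the structure, not the method: once the objective (cost) and action space (setpoints) are made explicit, even a trivial learner converges on the low-cost regime, and richer RL methods extend the same loop to high-dimensional state.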

One of the key questions is whether there is enough quality data from sensors to support these plans and whether the physical world (and its controls) is digitized enough for models to drive actions in it. It is likely reasonable to assume that digitization has penetrated far enough to provide reasonable granularity and actionability, though the working assumption is: the more data and the more actionability, the better. 

This field sits at the intersection of two areas that are core to our broader AI theses: 

  1. Massive economic value creation will happen in the physical world.

  2. The most interesting AI use cases are in areas where complexity is so high that we previously could not develop an understanding of the world, and where AI now helps us do so. In previous writing, we have referred to these as “expansion” use cases. 

Moreover, similar to KoBold, we expect that building a company in this space will require hiring world-class people across various fields: 1) AI/ML, 2) software, and 3) optimization of niche systems. We believe that companies able to combine these three talent sources will build substantial talent moats.


Thesis

Spontaneous Software

As LLMs can create software cheaply and agents become skilled at connecting user experiences in novel ways, companies are starting to push ideas around self-assembling/spontaneous software.

January 2025

Fundamental Consumer

As LLMs can create software cheaply and agents become skilled at connecting user experiences in novel ways, companies are starting to push ideas around self-assembling/spontaneous software. We believe that, enabled by LLMs, a new paradigm could be on the horizon that increasingly merges the creation and consumption of software and makes a longstanding vision a reality.

We have previously written about this in the context of business software (see here), but we see an equally interesting opportunity in pro/consumer software and applications. It is important to stress that this is an incredibly nascent area with more questions than answers. 

A few of the questions we have: 

  1. Where does this happen? For these experiences to feel genuinely magical and for software to feel spontaneous, LLMs must have maximum context on a user's digital expertise, data, and usage patterns across applications. The most likely place for this to live is within a user’s operating system. Assuming operating systems are too slow to adopt it, apps will likely emerge. However, it is unclear how long their staying power will be, and how useful they will be if the tools and experiences they create/enable are removed from, and not interconnected with, default operating systems. In that case, the default place for these things to live could be the interfaces of the large LLM providers. Claude has taken steps in that direction. 

  2. How do these systems' privacy mechanisms work? As described above, they require a lot of context to feel magical. The question is how this context is handled privately. Some approaches, such as private cloud enclaves, mitigate risk, but there could be a world where these kinds of applications only take off once models 1) have memory and 2) can run on consumer devices (e.g., phones and PCs).

  3. What do monetization and business models look like here? It is unclear how much users will pay for custom software tools, especially if this requires work to create them. Only 30% of Android users customize their OS, and the current app paradigm has not trained people to pay for utility-type services (the result of a combination of tools being used for lock-in and of ad-supported models). In a world where apps become cheaper to produce and likely more abundant (due to the same dynamics discussed here), it is unclear whether most users will simply keep using the apps increasingly available for niche use cases, at least until software becomes completely self-assembling and anticipates a user's every intent ahead of time. 

If we find good answers to these questions, we will be excited about this space and its potential.  


World View

Digital Antidote

Touching grass.

January 2025

Fundamental Consumer

As culture becomes more homogenous and consumption more solitary (a conundrum we wrote about in Airspace and Bubbles), consumers increasingly crave ways to identify with 1) physical brands, 2) physical/ephemeral experiences, and 3) their local/smaller communities and their local brands. 

While this can take many shapes, we see the potential to build significant businesses around it and keep our eyes open for them. To give a few examples: 

  • Next-generation sport leagues

  • Strong local restaurant brands and emerging subscriptions, events, etc.

  • Increased inflow into mega-churches that offer smaller group gatherings 

  • Local Fashion Brands (e.g., Bandit)

  • Athlete/chef retreats (e.g., Adam Ondra clinic; Mads Mikkelsen Japan Trip) 

  • Running clubs for dating

  • ...

That being said, there are some structural challenges around how scalable these businesses are and to what extent they represent venture cases.



Thesis

LLM-enabled Toys (Care Companions)

LLMs are enabling novel embodied AI use cases.

December 2024

Fundamental Consumer

LLMs are enabling novel embodied AI use cases. We believe it is highly probable that within five years, most toys, from stuffed animals to action figures to Barbies, will have some kind of LLM-enabled voice capability. We see a few benefits associated with these LLM-enabled toys:

Naturally, we believe that data privacy and safety are crucial to these toys being beneficial and successful. Therefore, we believe they should have the following properties:

We see an interesting opportunity for a commercial player to emerge here. Specifically, we see an opportunity to build an operating system that meets the standards above and enables owners of IP and distribution to build on top. In addition, we see significant opportunities to extend this platform in other areas, such as elderly care.



Thesis

Outcome-Based Pricing

A dominant narrative around how economic models will shift in response to AI is that companies can now “sell the work”

December 2024

Infrastructure Tech

A dominant narrative around how economic models will shift in response to AI is that companies can now “sell the work” – replacing seat-based pricing or subscription fees with models more directly tied to the value they create. This trend mirrors the evolution of digital advertising, where sophisticated attribution and optimization layers emerged to maximize and capture value.

Early evidence of this transformation is showing up in software verticals with well-defined (and highly measurable) workflows. In October, Intercom reported that 17% of recent software purchases included outcome-based pricing for their AI capabilities, up from 10% in the previous six months. One customer using Intercom’s “Fin” chatbot, RB2B, said the system autonomously resolved 60% of customer support tickets in August, saving 142 hours of human work. At $0.99 per resolution versus $10 for human handling, this represents both dramatic cost savings and a new pricing paradigm tied directly to outcomes.
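To make the unit economics in this example explicit, here is a minimal sketch using only the figures quoted above ($0.99 per AI resolution, roughly $10 per human-handled ticket, a 60% autonomous resolution rate). It illustrates the blended-cost math, not Intercom’s actual pricing model:

```python
# Hedged illustration of outcome-based pricing economics.
# Figures from the Intercom/RB2B example above: $0.99 per AI
# resolution, ~$10 per human-handled ticket, 60% automation rate.
AI_COST = 0.99      # cost per AI-resolved ticket
HUMAN_COST = 10.00  # cost per human-handled ticket

def blended_cost(automation_rate):
    """Expected cost per ticket at a given autonomous-resolution rate."""
    return automation_rate * AI_COST + (1 - automation_rate) * HUMAN_COST

cost = blended_cost(0.60)
savings = 1 - cost / HUMAN_COST
print(f"blended cost per ticket: ${cost:.2f} ({savings:.0%} below all-human)")
```

At a 60% automation rate, the blended cost per ticket drops to roughly $4.59, a reduction of about 54% versus all-human handling, which is exactly the kind of measurable outcome this pricing model attaches itself to.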

As AI capabilities accelerate, we expect a rapid build-out of supporting infrastructure focused on enabling and capturing this value creation and cementing this new economic paradigm. The demand side is already primed – companies face increasing pressure to deploy AI in high-ROI use cases, knowing their competitors (or new AI-native entrants) will if they don't. 

This dynamic is driving the emergence of several distinct outcome-based business models:

  1. Full-stack players aiming to fundamentally reshape the economics of critical industries (particularly those resistant to new technology adoption) represent the purest path to AI-driven outcome-based pricing. Companies like KoBold in mining aren't simply delivering a point solution to an existing value chain – they are using AI to transform how value is created and captured across the entire workflow. In doing so, they take on the full risk/reward that comes with attempting to reorient the economic structure of a deeply entrenched system. Similar opportunities exist in healthcare, where AI-driven approaches could dramatically reduce cost structures while improving patient outcomes, and in commercial real estate, where end-to-end platforms can reshape everything from building operations to tenant experience to energy management.

  2. End-to-end workflow solutions in well-defined/quantitative areas like sales (Salesforce) or customer service (Intercom, Zendesk). Here, we believe emerging AI-native players face a significant uphill battle. Incumbents that cover multiple steps of a company’s workflows have data, distribution, and value attribution advantages, while more companies are pursuing internal builds through "spontaneous software" tooling or by leveraging commodity infrastructure (LLMs) to develop custom solutions – as Klarna recently did to great fanfare and apparent success. The company’s OpenAI-powered chatbot is “doing the work of 700 people” as it handles two-thirds of the company’s service interactions.

  3. Infrastructure players are emerging to accelerate the adoption of outcome-based business models for AI services. We see opportunities for new solutions to handle attribution (measuring AI's impact across complex workflows), market-making (matching AI capabilities to business problems while optimizing for ROI), and financial infrastructure (enabling novel pricing structures). The parallel to mobile advertising is particularly instructive – companies like AppLovin didn't just facilitate transactions, they fundamentally transformed how value was created and measured in their market. These infrastructure players won't just serve existing markets – similar to Stripe in software, they'll expand the opportunity by making it possible for new types of AI services to emerge and scale.

  4. We also expect to see the emergence of teams that develop superior "process power" in AI implementation. Similar to how some organizations mastered lean manufacturing or agile development, these teams will systematically identify industries where AI can collapse cost structures (while maintaining value delivered), rapidly prototype and deploy AI solutions that replace expensive, manual workflows, and build durable institutional knowledge about which AI approaches work best for specific business problems.

    One way of thinking about this opportunity is as a modern version of Rocket Internet or Thrasio, but instead of geographic arbitrage or aggregation plays, they'd specialize in AI-driven transformation of stagnant sectors via an integrated product and go-to-market engine that allows them to capture a commensurate share of the value they create in an ecosystem. Perhaps a more ambitious framing is that a new class of private equity giants will emerge from this paradigm of buying and improving software and service businesses with AI (i.e. modern Constellation Software). 

Unsurprisingly, we believe the most attractive opportunity lies not in incrementally improving existing services with AI, but in fundamentally reimagining how industries operate. This leads us to two areas specifically that we are most intrigued by:

  1. Infrastructure providers enabling more precise outcome measurement, verification, optimization, and value capture across the AI services economy.

  2. Full-stack players who combine AI capabilities with deep domain expertise to fundamentally transform industry economics.


Thesis

“Scale as a Service” for the Bio-Industrial Economy

Over the past 25 years, the emergence of "scale-as-a-service" has powered multiple "invisible innovation" waves in the software world.

November 2024

Infrastructure Tech

Over the past 25 years, the emergence of "scale-as-a-service" has powered multiple "invisible innovation" waves in the software world. Infrastructure begets applications begets the need for more infrastructure, and so on. Platforms like AWS, Shopify, Stripe, and Twilio build scale on behalf of their customers in important but non-core functions and enable access via API. Over time, emerging companies bundle infrastructure from various scale-as-a-service providers, making it possible to go bigger faster or target previously unaddressable niches. Thanks to the programmatic nature of interactions, scale-as-a-service solutions minimize coordination costs and maximize control, enabling precision execution that aligns with a company’s desired speed, scale, and budget.

As scientific breakthroughs make biology more programmable, the “why now” for Scale-as-a-Service models is approaching a tipping point – but with important nuance. While AI represents a powerful enabler of new product and process design, the reality of biological complexity means we first need better tools and data to model and manipulate processes. As Niko McCarty notes, even the most significant AI breakthrough, AlphaFold, reveals the gap between static prediction and biological reality. Scale-as-a-Service providers can help bridge this gap by industrializing high-quality, standardized data collection across the design and production process. A 2023 study of biomanufacturing bottlenecks found that companies consistently struggle with purification, continuous processing, analytics, and expertise gaps – all areas where specialized infrastructure providers can play a Shopify-like role.

Meanwhile, dominant macro trends like the energy transition and US-China competition are pushing companies and countries towards more sustainable and domestic production models. Half of the world’s largest companies are committed to net zero, “reshoring” continues to grow in popularity on earnings calls, and the Biden Administration has set targets like producing 30%+ of the US chemical demand via biomanufacturing pathways within 20 years.

While first-generation companies like Ginkgo and Zymergen have struggled massively, select second-generation companies like Solugen show signs of staying power. If (still a big if) these next-gen companies prove the economic viability of bioproduction, we expect several successful scale-as-a-service providers to emerge. These companies will become foundational platforms for trillion-dollar industries like chemicals, materials, energy, CPG, and food & agriculture, where bioproduction remains pre-commercial scale. Like the internet, the invisible innovation waves created by this infrastructure-application cycle may show that the market for bio-enabled solutions is larger and more diverse than we could have imagined a priori.

We expect most successful scale-as-a-service providers to start with asset-lite approaches. Expanding upon Chris Dixon's "come for the tool, stay for the network" insight, these companies will initially aggregate supply, demand, and attention through useful data and coordination tools. From there, they will evolve into market orchestrators, connecting buyers with sellers and unlocking new capital flows. Eventually, many will build out physical infrastructure at scale, becoming the operating systems of the bio-industrial economy.


Thesis

Prediction Markets

Prediction markets represent a fundamental transformation in how society aggregates and values information.

November 2024

Infrastructure Tech

Prediction markets represent a fundamental transformation in how society aggregates and values information. As trust in traditional institutions continues to erode, prediction markets will emerge as an efficient mechanism for pricing risk, surfacing truth, and reshaping decision-making across the economy.

Throughout the history of technology – particularly the internet – important platforms often begin in legal grey areas, where user demand outpaces regulatory frameworks. From eBay to Uber, Airbnb, and Spotify, the most impactful companies solve problems that existing systems cannot address – or, more precisely, where prevailing incentive structures baked into law by incumbents actively resist progress. 

While incumbent resistance will be significant, we believe there is an opening for new mechanisms of collective intelligence that align incentives toward accuracy and accountability.

This transformation aligns with our broader theses around the role of better data (often physical data) driving a shift toward more dynamic and precise information-centric business models. In the same way that pricing for digital tools is evolving from static per-seat licenses to value-based outcomes, prediction markets represent a step-function improvement in how we price and distribute information. Once people experience the power of real-time, market-driven signals – whether in election forecasting or project management – we see no going back to traditional polling or planning systems. Thus, we believe the long-term opportunity extends far beyond gambling or speculation – it's about fundamentally improving how societies and organizations make decisions and allocate resources. 

Amidst the “moment” prediction markets are having in the wake of the US presidential election, critics rightly point to fundamental challenges: the subsidies required to bootstrap liquidity in niche markets may prove prohibitively expensive, and many use cases beyond elections and sports could struggle to attract meaningful participation. While these are serious concerns, we believe they echo historical skepticism of other transformative markets. For example, at the outset of equity markets, stock trading was seen as gambling and was dominated by "bucket shops" where people placed bets on price movements without owning shares. Such activity was seen as zero-sum, manipulated, and socially destructive. Yet over time, infrastructure emerged to make securities trading safer and more accessible: mutual funds, for example, transformed speculation into investment, regulations built trust, and exchanges standardized trading.

A similar story played out in e-commerce. In the mid-1990s, conventional wisdom held that consumers would never trust online platforms with their credit card information. Amazon launched in 1995 to widespread skepticism, yet by creating infrastructure that built trust and reduced friction, e-commerce became not just accepted but essential. 

Our hypothesis is that we are in the 1995 - 2000 period for prediction markets – mass-market cultural awareness is growing and momentum is clear – but market penetration is little more than a blip in the overall picture. In the same way that mobile devices and social networks (among other things) provided the technological catalyst for deeper e-commerce penetration, we see AI (and AI agents) as a critical enabler of the next wave of prediction market growth. For example, by creating liquidity in thousands of micro-markets, AI has the potential to help users take more sophisticated portfolio approaches and contribute to a “utilization cascade” that shifts prediction markets from perceived gambling into new “standard” tooling for information discovery.

Success in this category will likely follow e-commerce's growth trajectory. While early adopters drove initial growth, widespread adoption required infrastructure that created trust and reduced friction. Today's prediction market leaders will similarly need to build both consumer-facing brands and backend capabilities. We expect to see an "Amazon of prediction markets" emerge – potentially Kalshi – that combines direct consumer reach with infrastructure that powers other platforms. This will enable an ecosystem of niche players targeting specific verticals or user segments.

A key question remains around where value ultimately gets captured. Just as e-commerce value accrued to new platforms, internet-native brands, and incumbents who successfully adapted (e.g. Walmart), prediction market infrastructure will create several winning archetypes. Beyond pure-play platforms like Polymarket, existing media and financial platforms that already own distribution – from ESPN to Bloomberg – could emerge as powerful players.

The opportunity extends far beyond any single vertical. By expanding the surface area of possible transactions, prediction markets could enable new types of information exchange that are hard to imagine a priori. Winners will start by dominating specific verticals where information asymmetries create clear value propositions (Books:Amazon::Elections:Kalshi), then expand into adjacent use cases as users become comfortable with the model. Those who can navigate the regulatory environment while building trusted brands will become essential infrastructure for the information economy.


Thesis

AI-driven SaaS Replacement

LLMs have started and will continue to bring down the costs of writing software.

November 2024

Infrastructure Tech

As we discussed in many of our other category theses, we believe that in the AI era, many of the laws of physics around technology and business models are changing, and much has been written about the proclaimed ‘End of Software.’ The argument goes something like this.

LLMs have started and will continue to bring down the costs of writing software. This leads to a world where software is increasingly created for N-of-1 customers and will be easily modifiable over time. Ideating and building (or prompting) this software will increasingly shift from developers to non-technical users.

As software creation becomes cheap, traditional software companies face a challenge. Their core value proposition (development, maintenance, and hosting of software, customization, and customer support), business model, and moats are rooted in the ability to leverage initial investments in brands, standards, and free cash flow into features and app ecosystems that cater to a heterogeneous customer base with little incentive to go elsewhere due to switching costs. Switching becomes substantially more attractive if the ‘perfect,’ highly personalized software is the alternative. This fundamentally challenges these companies’ business model.

With that established, the key question is what the new paradigm might look like. 

There is a vision that if LLMs and agents have access to all our data, software and interfaces will be generated in real-time, on demand, emerging only when they are needed. Fully in the control of users (or their LLMs/agents), this software costs only as much as the compute required to build it. While this vision is undoubtedly appealing, there are a few missing pieces:

For one, we assume that it will take some time for models to generate end-to-end software applications. Until this is possible, someone needs to be responsible for ensuring the software works, not only from a technical perspective but also from a usability perspective. Just because a feature can be easily built doesn’t mean it should be built. Until models can fully understand context (at which point it is questionable why there would even be a need for human-readable software), domain-specific product expertise will be required to build useful products for specific use cases. Moreover, customer support, and enterprise customers’ desire for somebody to lean on when things go wrong, will likely remain.

As a result, we believe there is an opportunity to build a company here. This company will have certain features: 

  • Provide a platform that offers guidelines to non-technical users to create applications for their specific needs 

  • Have an in-house team of developers to guarantee that software is functional when LLMs fail 

  • Create a platform / App Store that enables developers to publish their applications and others to use them

  • A data platform and SDKs that match needs to good features, whether already developed or novel, and enable easy integration of those features

  • Business Model: 

    • Initial Development - one-off 

    • (Data) Platform - ongoing 

    • Developer Platform / App Store - marketplace take rate


Thesis

AI-Driven CAE

Modern CAE's transformation combines AI and deep learning, drastically improving physical design efficiency, creating opportunities for new hardware-makers, and challenging established OEMs.

October 2024

Industry Transformation

Computer-aided engineering (CAE) tools are the backbone of modern product development. As a result, they underpin much of our modern physical environment. Today, several key catalysts are shifting the playing field and creating long-awaited opportunities for new physical design companies to emerge and scale. 

  1. There is immense potential to commercialize the growing body of relevant research. Computer-aided engineering (CAE) traditionally utilizes numerical solvers to simulate and analyze complex engineering problems (e.g., Finite Element Analysis (FEA), Computational Fluid Dynamics (CFD), and Multibody Dynamics (MBD)). Depending on the complexity of the problem and the computational resources available, CAE simulations can take anywhere from a few minutes to several days or weeks.

    In recent years, there has been increasing research into training deep-learning models on simulation data to create so-called deep-learning surrogates. These surrogate models leverage deep neural networks to approximate complex physical systems or simulations, providing fast and efficient predictions while maintaining reasonable accuracy (i.e., running complex simulations in seconds). Methods include data-driven (e.g., GNNs, NOs, RNNs) and physics-driven (e.g., PINNs) deep learning and generative models. Technology maturation makes the opportunity ripe for the right team: one with access to data and the ability to learn from a constant feedback loop of testing these models against a verifiable physical reality to push the boundaries of these methods. Combining them into easy-to-use workflows can fundamentally change how engineering (simulation, in particular) is done at scale.

    An example from McKinsey research on this: "One company in the power-generation sector, for example, used the approach to optimize the design of large turbines for hydroelectric plants [...] the company reduced the engineering hours required to create a new turbine design by 50 percent and cut the end-to-end design process by a quarter. Better still, the approach generated turbines that were up to 0.4 percentage points more efficient than conventional designs". This is the type of upside necessary to get the attention of potential customers in the space.

  2. We are in the early days of a hardware supercycle. The top-down push by Western economies to reindustrialize and redevelop production capacity in the name of national security, climate change mitigation, and economic growth has driven capital and talent toward physical world innovation. With role models like Tesla, SpaceX, and Anduril leading the charge, hundreds of well-funded aerospace, energy, manufacturing, robotics, and transportation companies are emerging with demands for a modern physical design and engineering stack. This increased competition is pushing incumbent OEMs to experiment with new solutions for the first time.  

  3. AI-native business models create a wedge for new players. A shift in business models has thus far marked the deployment of advanced AI into foundational industries – KoBold owns its own exploration assets while Isomorphic Labs shares in the economic upside of its IP. Similar value-based, consultative, and/or “forward deployed” approaches – a partner rather than software vendor relationship – could create an opening for new players to gain footing with large customers and expand over time, avoiding the all-or-nothing sales cycles that have long protected existing leaders.

The combination of evolving customer demands, novel research, and new business models has formed the basis for an entirely new paradigm of computer-aided, AI-driven design and engineering tools. These tools are unlocking faster, cheaper feedback loops, shifting workflows from linear to parallel, and enabling emergent use cases. This increases both speed and quality in a way incumbents struggle to defend against.
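To make the surrogate concept described above concrete, the sketch below fits a cheap stand-in model to the outputs of a hypothetical, toy "solver" and then evaluates it near-instantly. A simple polynomial fit stands in for the deep-learning surrogates (GNNs, PINNs, etc.) the research actually uses; every name and function here is illustrative, not a real CAE tool:

```python
import numpy as np

def run_simulation(x):
    """Stand-in for an expensive numerical solver (toy function);
    a real FEA/CFD run might take minutes to days per evaluation."""
    return np.sin(3 * x) * np.exp(-x)

# Generate "simulation data": sampled design parameters + solver outputs.
rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, 200)
y_train = run_simulation(x_train)

# Fit a cheap surrogate (least-squares polynomial) to the solver outputs.
degree = 7
A = np.vander(x_train, degree + 1)
coeffs, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def surrogate(x):
    """Evaluates in microseconds instead of re-running the solver."""
    return np.polyval(coeffs, x)

# Check surrogate accuracy against the "ground truth" solver.
x_test = np.linspace(0, 1, 50)
err = np.max(np.abs(surrogate(x_test) - run_simulation(x_test)))
print(f"max surrogate error on [0, 1]: {err:.1e}")
```

The pattern is the same at industrial scale: run the expensive solver to generate training data, fit a fast approximator, and reserve full simulations for verifying the surrogate’s proposed designs – the verifiable-physical-reality feedback loop described above.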


Thesis

The Manufacturing Action Layer

As the cost of adding sensors to anything and everything in the manufacturing process has decreased significantly, the amount of data produced in the factory environment has exploded.

October 2024

Industry Transformation

The anticipated (and much-needed) manufacturing renaissance in the US and Europe – sparked by rising competition with China and a movement to invest in expanding domestic production capacity in the wake of pandemic-induced supply chain breakdowns – is hampered by several deficiencies. Among these limiting factors is the challenge of turning vast amounts of disparate industrial data into actionable insights and execution that drive true productivity gains.

As the cost of adding sensors to anything and everything in the manufacturing process has decreased significantly, the amount of data produced in the factory environment has exploded. However, conversations with people across the manufacturing landscape make it clear that the impact of production digitization continues to underperform expectations. 

More than a decade into the Industrial IoT wave, most data from the manufacturing process ends up – at best – brought together into static Excel files. And while platforms like Palantir’s AIP promise rapid transformation, the ground reality is that data from different systems continues to live only in the heads of individual operators – a critical risk in an industry with massive turnover and an aging workforce. The VP of Industry 4.0 at a ~ $5b market cap automotive supplier recently remarked that they still lack the visibility to know whether a machine is even running in a factory without calling someone on the ground.

Incumbent software offerings in manufacturing are often stitched together over years (even decades) of acquisitions and integrations, resulting in a mess of fragmentation, technical debt, information silos, and process bottlenecks.

Given this backdrop – and the macro tailwinds that will continue to drive demand for next-gen manufacturing solutions – our initial hypothesis is that there are two interesting angles of attack for new companies: 

  1. Modern industrial control and execution systems capable of aggregating data across modalities and business systems, automating mission-critical operation and production activities, and assuming responsibility (via outcome-based pricing models) for driving efficiencies.

  2. Software-defined manufacturers aiming to displace incumbent manufacturers entirely through more efficient end-to-end approaches in specific verticals/use cases. 

Both models face challenges. The “base rates” for selling impactful digital solutions to manufacturers are mediocre at best, and the companies that have reached scale – platforms like Cognite, Palantir, and Samsara – have significant distribution advantages that more narrow emerging entrants must overcome. For the “full stack” players, the scale potential is clear, but it remains to be seen whether venture capital is the right financing tool (“CNC machines with startup branding” is how one person described one of these companies to us).


Thesis

AI-enabled PCB Automation

It is a recurring meta-theme that we think AI will have a great impact on the physical world.

September 2024

Industry Transformation

Working Thesis

It is a recurring meta-theme that we think AI will have a great impact on the physical world. At the same time, we are convinced that companies that innovate around business models and take ownership of certain processes will unlock a lot of value, maximizing the value capture associated with their technology. 

One area that has caught our attention in this context is AI-enabled PCB layout. Printed Circuit Boards (PCBs) are the backbone of modern electronics, enabling a wide range of devices across various industries. In consumer electronics, PCBs power smartphones and smart home devices, enhancing our daily lives. The medical field relies on PCBs for critical equipment like MRI scanners and pacemakers, improving patient care. Automotive applications include engine control units and advanced driver assistance systems, making vehicles safer and more efficient. In aerospace and defense, PCBs are crucial for avionics and satellite communication. Industrial settings benefit from PCBs in robotics and automation systems, while telecommunications infrastructure depends on them for routers and cell towers. From the devices in our pockets to the satellites orbiting Earth, PCBs play an indispensable role in connecting and powering our technological world. As the complexity of end devices increases, so does the complexity of PCBs.

The increasing complexity of PCB layouts makes design more challenging due to higher component density and miniaturization, which require intricate placement strategies and precision routing. Managing multiple layers and implementing high-speed interfaces demands careful signal integrity analysis and tighter manufacturing tolerances. Integrating mixed technologies further complicates the design process, requiring effective partitioning and thermal management. These factors necessitate advanced skills and sophisticated tools to ensure that designs meet performance and manufacturability requirements. That said, as shown in the table below (Source: Claude), the processes associated with correctly laying out a PCB already take around 50%+ of total PCB development time today. We expect this share to increase as PCB complexity grows to keep pace with the novel applications we need them for.

It is our current assumption that increasing complexity will have a disproportionate impact on the effort and time it takes to create these layouts. Unlike schematics, layout seems to be a fairly straightforward task requiring little strategic context: a PCB layout either works or does not against certain benchmarks, whereas schematics can be more ambiguous. We have seen significant progress in AI model development (especially reinforcement learning) that can automate and significantly accelerate parts of the PCB layout process.

The total number of PCB designers in the United States is 72,971, with an average salary of around $74k per year. This gives a total of USD 5.4B in PCB designer salaries. Automating a significant part (70+%) of their jobs offers considerable cost savings. Of course, this does not include the economic benefits of AI models accelerating the process and substantially reducing design hours. This is especially valuable at the higher end (e.g., aerospace, defense), where PCBs are highly complex and take orders of magnitude more time to design. Accelerating parts on the critical path into production is likely extremely valuable and hard to quantify with cost-saving numbers alone.
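As a sanity check, the market-sizing arithmetic above is straightforward; in this sketch, the 70% automatable share is the assumption stated in the text, not a measured figure:

```python
# Back-of-the-envelope check of the market sizing above:
# 72,971 US PCB designers at an average salary of ~$74k per year.
DESIGNERS = 72_971
AVG_SALARY = 74_000

total_payroll = DESIGNERS * AVG_SALARY        # ~= $5.4B, as cited
automatable_share = 0.70                      # "70+%" thesis assumption
potential_savings = total_payroll * automatable_share

print(f"total payroll: ${total_payroll / 1e9:.1f}B")
print(f"savings at 70% automation: ${potential_savings / 1e9:.2f}B")
```

This confirms the ~$5.4B total and implies roughly $3.8B in annual labor-cost savings at the assumed automation share, before any of the harder-to-quantify acceleration benefits.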

We have spent significant time thinking about the opportunities for AI-enabled outsourcing and services businesses and believe that PCB layout provides the structural environment for such a model to emerge:

  1. Objective benchmark assessments 

  2. Clear benefits to assuming responsibility for working output

We believe that a company capable of driving significant improvements here can build a large business, with a wedge into a market that is otherwise hard for software companies to penetrate due to the dominance of Altium and others.


World View

European Defense

A new era of strategic autonomy and societal resilience

August 2024

Industry Transformation

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. While Russia’s invasion of Ukraine exposed Europe’s lack of preparedness and home-grown capabilities, the conflict has shifted the perception of European builders, investors, and policymakers on the importance (and ethics) of developing and deploying critical technology to foster sovereignty.

The result has been a groundswell of momentum aimed at transforming Europe’s defense-industrial base; protecting European values by deterring Russian aggression in the near term and building the long-term foundations to project strength amid rising Great Power Conflict. 

In recent years, change has occurred at all levels – from the EIB’s updated views on defense technology and the European Commission’s first-ever Defence Industrial Strategy to the rise of an actual defense technology ecosystem in Europe for the first time, catalyzed by the momentum of lighthouse companies like Helsing, ambitious efforts like the €1b NATO Innovation Fund, and grassroots organizations like the European Defense Investor Network.

But expanding markets, increased capital flows, and narrative momentum don’t always imply attractive forward-looking returns. 

Despite the market’s growth, inertia, fragmentation, and protectionism still rule European defense. While European defense spending has returned to Cold War levels, the continent still lacks urgency relative to geopolitical allies and rivals. The conflict in Ukraine has done little to unite European perspectives on the what, how, and who of building next-generation defense capabilities. The EU’s two largest economic and military powers – Germany and France – remain fundamentally split on the role of Europe in its own defense. This philosophical gap threatens to worsen the severe physical fragmentation of European defense forces – Europe operates 5x as many vehicle platforms as the US. At the same time, the UK has increasingly shifted attention away from the continent towards the AUKUS coalition.

The US defense technology ecosystem, far more developed than Europe’s, inspires little confidence in what lies ahead. Through September of 2023, venture-backed companies were awarded less than 1% of the $411 billion in Defense Department contracts awarded in the government’s fiscal year – only a slightly larger share than in 2010, when few startups were building military technology. And while companies like Anduril have shown that the path to scale is possible, the company’s success may end up making it the new technology distribution chokepoint instead of a bellwether for a thriving new defense ecosystem.

These factors present significant obstacles to building and scaling European defense technology companies. They may also present unique opportunities for a highly targeted investment approach in the space, aimed at turning the market’s weaknesses (e.g. fragmentation) into strengths and riding key catalysts that may help emerging companies overcome the inertia and sub-optimal industry structure.

Catalysts Creating Opportunity

To believe in the opportunity to invest in emerging European defense technology companies in the face of the incumbent market structure, we need to see significant technological, economic, social, and policy-related shifts that are, critically, available to emerging entrants and not incumbents.

Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. This has restructured the continent's procurement regimes, capital markets, and attitudes. The importance of the simple availability of early-stage capital for emerging defense and security companies in Europe cannot be overstated. With dozens of early-stage funds now focused exclusively or significantly on the space and later-stage investors slowly showing up, financing a defense technology company in Europe to the point of scale is now possible. As the EIB and other capital markets institutions continue to evolve their view, we expect many of the capital markets barriers to financing defense and security companies across the company life cycle will begin to fall away.

Procurement remains a significant challenge, but the fragmentation of Europe creates opportunities for emerging companies to get to market faster by targeting smaller, potentially more agile countries more inclined to adopt new solutions. Greece, for example, now spends 4% of GDP to support a tech-forward defense strategy while countries close to the front in Ukraine have been forced to move quickly to adopt new solutions. 

The “primitives” for rapid, capital-efficient defense technology development have come online, making it possible for companies to ride multiple technological tailwinds to build solutions that meet the complex needs of government customers. Decreasing costs of hardware, enabled by advanced manufacturing, better (software) tooling, and the acceleration of physical-world foundation models make it possible for companies to develop complex defense technology systems at a fraction of the cost of incumbents. AI systems are already operating successfully in significant tests (like dogfighting with fighter pilots) and on the battlefield in Ukraine, which should drive more receptiveness from (risk-averse) buyers and users. 

Lighthouse companies and talent ecosystems are emerging around defense and national security-focused technology for the first time in Europe. The US defense technology ecosystem built momentum on the back of breakthrough companies like SpaceX and Palantir. The same pattern is playing out in Europe, with companies like Helsing and The Exploration Company forming the foundation for a thriving defense-industrial ecosystem in Munich. While less developed in terms of defense and space-focused technology, places like Stockholm (energy) and Paris (AI) have become beacons for talent in areas adjacent to national security. Meanwhile, London has captured much of the early-stage energy and likely represents a strong ecosystem to launch a defense technology company thanks to its physical proximity to Europe and cultural proximity to the US.

The Ukraine conflict has presented a unique opportunity for companies to develop proof points and revenue, creating a “backdoor” for future contracts with Western European governments. It has also highlighted the future of warfare. Rapid acquisition and deployment processes in Ukraine have helped companies generate real revenue and test systems in live situations. While larger Western European governments have been slower to respond, and more likely to simply drive business to existing primes, the proof points being developed by emerging companies should help their cases in (eventually) accessing larger, longer-term programs. Technologically, the predominance of electronic warfare has given a fundamental advantage to agile companies that can iterate rapidly to stay ahead of Russian competition. 

Key Insights

The following factors are the most significant in driving success for emerging European defense technology companies. These lessons are drawn from our company and customer/expert interviews.

New defense companies are software-first, R&D-centric, and mission-driven. Incumbent defense contractors operate on a cost-plus business model, essentially building to the specifications laid out by government buyers and layering a “reasonable” margin on top (5–15%). As a result, large primes spend less than 3% of their budget on R&D and lack the incentive to innovate. On the other hand, companies like Anduril and Shield AI take on product development risk themselves and spend massively on R&D.

And while the hardware these companies build tends to garner the most attention, the software and autonomy systems underlying the hardware make everything work. Anduril’s Lattice platform ties together all of the company’s hardware products, fusing sensor data and creating an autonomous operating picture. This software-defined operating model drives better margin structures (Anduril targets 40% gross margins vs. under 20% for Lockheed and other primes), allowing them to continue fueling an R&D flywheel.

Fragmentation remains the most challenging aspect of European defense. It may also present the largest opportunity. Europe’s fragmentation challenge needs little additional explanation. There is not one unified military-industrial complex on the continent; there are 27. Each has a different view on supporting its own national champions, different relationships with EU member countries, and divergent views on buying from outside (usually the US). This has resulted in a complex web of disparate capabilities (weapons systems, vehicle platforms, and communication models) that limit rapid response and collaboration.

Understanding this, and realizing that it is likely beyond the reach of one company to solve from a hardware (let alone cultural) perspective, is key to uncovering where opportunities sit. Helsing, for example, has leveraged its positioning as a multi-domain AI backbone to build early leadership around this concept. As cheap drones, cameras, and other sensors proliferate, the opportunity to coordinate the complex data and operational picture, closing capability and collaboration gaps through greater modularity and interoperability, grows larger.

Technology differentiation is table stakes. The most successful companies will possess a “secret” to navigating procurement. Despite the shroud of complexity surrounding defense procurement, success remains largely “champion-driven”, as Anduril CRO Matt Steckman recently remarked. Companies don’t win through better technology; they win by solving specific problems for people with influence in the buying process. Companies must simultaneously engage in long-term relationship building (including lobbying) to build trust with procurement influencers while developing relevant proof points in the field. One way of doing this, as Anduril demonstrated and emerging European players like Lambda Automata are attempting to replicate, is by viewing defense and security as a “conglomeration of micro markets” – which includes adjacent opportunity areas like public safety and border control.

Narrative momentum is highly rated but likely remains underrated by European founders. The traditional stereotypes of European vs. American founders seem to have played out in the early part of this new defense tech wave – from Anduril’s narrative mastery to the momentum of ecosystems like El Segundo to the sophisticated way some companies and investors have integrated themselves into the Washington decision-making systems. As in all markets, there is a reflexive nature to success in defense – the companies that win figure out how to craft a better story and more social proof to attract capital and talent in advance of fundamental traction. 

Distribution bottlenecks inherent in government and defense contracting are already contributing to market consolidation for emerging defense technology companies. Competing against defense primes means eventually competing in every domain they operate in. As software-first companies break through, the returns to scale and breadth might become even greater – platforms like Anduril’s Lattice get stronger as they consume more data and control more hardware assets in the field. Combined with the defense market’s natural bent towards consolidation, companies that can be “first to distribution” in a given area will be very hard to displace and will be strongly positioned to roll up interesting technology and talent, as Anduril has already started to do aggressively. (The sheer number of Anduril references in this document reflects its outsize and rapidly compounding success in this space!)

Emerging Investment Areas

There are several valuable defense market maps and landscapes worth evaluating to understand different ways of breaking up the market; perhaps the most comprehensive is this one from Quiet Capital’s Michael Bloch: National Security and Defense Market.

To avoid rehashing those efforts, our focus has been on identifying emerging themes that span multiple segments of such maps, supported by converging market, technology, and geopolitical tailwinds. While not comprehensive, these themes align well with the catalysts and insights above and are where we have seen several of the most interesting companies in our review; indeed, the strongest companies tend to touch multiple themes.

Modularity and Interoperability → Leaning into the fragmented nature of European defense through solutions that aim to unite disparate operating systems and coordinate complex environments. While software capabilities will be the core connective tissue, hardware plays a big role as well. Cheaper, smaller, interoperable systems built to be easily adopted (both budget and technology-wise) can help accelerate initial deployment and provide companies with a platform from which to expand. 

Rapid Response → Building a more dynamic defense-industrial base by shortening time and cost to intervention across domains and operating areas. This ranges from faster kinetic capabilities (e.g. hypersonics and electronic warfare) to rapid manufacturing capabilities (e.g. Replicator) to faster deployment of machines and people (e.g. counter UAS swarms, labor coordination platforms) to systems that can be deployed (and as importantly, replaced) quickly. 

Multimodal Physical World Data and Intelligence → Wayve’s recent autonomous driving demonstrations showcased the speed at which multi-modal models are making their way into the physical domain. Along with the rapid decline of hardware costs, models that can reason more dynamically create interesting opportunities in defense, where operating environments are extremely fluid (i.e. not repetitive like pick and place, etc.) and thus pose problems for more rigid AI systems. Better simulation data will also continue to play an important role in preparing autonomous systems for live action. This represents a more horizontal theme and is thus something we might pursue a deeper dive into beyond defense.

Software for Hardware → The declining cost of hardware also creates room for better tooling, both at a collaboration/workflow level (i.e. Atlassian and GitHub for hardware builders) and at a design level (i.e. better CAD/EAD, “Figma for chips”, etc.). Fusion, a software platform developed and externalized by space launch company Stoke, highlights the need for better tooling to serve the hardware revolution. Enhanced IP and data security along with high levels of required precision for certain use cases may create specific opportunities in defense.  

Maritime Production, Security, and Infrastructure → Control over maritime infrastructure represents a significant geopolitical and economic advantage. Over the past decade, China has invested heavily in shipbuilding capacity. Today, a single shipyard in China has more production capacity than the entire US shipbuilding industry. However, the importance of maritime control goes beyond just shipbuilding. Undersea cables, for example, are the backbone of the global financial and communications systems: over 95% of the world's communications are carried by a network of roughly 500 cables laid across the oceans. These represent critical vulnerabilities that need to be proactively protected through better surveillance, kinetic deterrence, and cybersecurity technologies.

Combatting Digital Authoritarianism → Control of the digital economy is highly centralized, with cheaper data processing and engagement-centric business models (i.e. advertising) feeding the strength of a small number of powerful companies and institutions. This has led to democratic deterioration and a loss of trust in key institutions. It also creates a more straightforward surface area for attack and manipulation by adversaries – spanning consumer-focused influence campaigns to corporate IP theft. Technology that empowers sovereignty over assets and information, increases privacy, and enhances secure communication and collaboration represents a somewhat orthogonal, bottom-up approach to investing in defense and security, as the go-to-market model may not depend on large-scale government procurement.


Thesis

“Plumbers” of the Reindustrial Revolution

Like traditional plumbers, these companies are focused on high-stakes problems where failure carries outsized consequences.

January 2025

Industry Transformation

While neo-Primes and OEMs capture headlines and venture capital flows, specialized players solving critical service, infrastructure, and component-level problems will be fundamental to transforming the physical economy. 

We call these businesses the "Plumbers" of the Reindustrial Revolution because, like their namesakes, they occupy an unglamorous but essential (and hard to dislodge) position in their value chains. These companies are modernizing playbooks pioneered by industrial giants: Westinghouse in critical components, Bureau Veritas in trust and data, Schlumberger in technical services, and Grainger in supply chain orchestration.

Like traditional plumbers, these companies are focused on high-stakes problems where failure carries outsized consequences. Their businesses are built first on technical mastery and reliable execution, which fosters deep customer trust and loyalty. Competition remains limited not just because of technical complexity, but through the “niche” nature of their markets – rational actors won't deploy massive capital to displace established players in constrained categories like they might in unbounded markets. This creates a foundation for expansion into adjacent opportunity areas – deepening existing customer relationships or extending technical capabilities to expand TAM over time. 

A key theme across much of our research is how geopolitical competition is redrawing supply lines and catalyzing efforts to rebuild industrial capacity in Western markets. The existential threat motivating this has been a potential conflict with China. But even in a positive scenario where kinetic conflict is avoided – and even as “expected unexpected” events like DeepSeek’s R1 impact the Capex and energy equations – we (and others) believe the trend towards spending on reindustrialization will continue. 

Thus far, the narrative surrounding the Reindustrialization tailwind has primarily benefited companies at the "front end" – next-gen OEMs, new Primes, and companies building brands easily understood by generalist investors that control most of the growth capital in the ecosystem. This is reflected in valuations – the early champions have access to near-unlimited pools of cheap growth capital while earlier-stage players are priced at levels that assume near-perfect execution. While we share the market-level excitement about the new levels of scale the best companies in this market can achieve, we have been more circumspect in our approach to this category.

As competition continues to rise on the front end of the market, our hypothesis is that the most attractive risk-return opportunities will increasingly be found with the "plumbers”, which we see emerging across four primary categories:

Critical Components

Then → Westinghouse's air brake system, invented in 1869 as railway networks reached continental scale, transformed railroad safety and became an industry standard, which created the foundation for one of the largest industrial conglomerates of the 20th century.

Now → The new material, form factor, and communication requirements of modern aerospace and defense systems create opportunities for specialized component makers to become standards in critical subsystems, from wire harnesses to thermal management to energy storage. 

Trust & Data Engines

Then → Bureau Veritas built a global franchise by becoming the trusted verifier of maritime safety standards as international trade expanded rapidly in the 19th century

Now → The confluence of aging existing infrastructure and the need for new development creates opportunity at the intersection of novel inspection technology and data analytics to become the system of record intelligence for asset health, compliance, and built world capital allocation. 

Superdevelopers

Then → Schlumberger became indispensable by mastering the technical complexity of oil exploration and production when the petroleum industry was rapidly expanding into new geographies

Now → The energy transition as well as the emergence of “new Prime frontiers” (e.g. the Arctic and space) creates opportunities for companies that can i) develop proprietary technology suited for challenging environments, ii) develop project execution capabilities to integrate other solutions, and iii) master the regulatory complexity of operating in new areas. 

Supply Chain Orchestration

Then → Grainger was founded in the 1920’s to provide customers with consistent access to motors as both the consumer and industrial markets for automotive and other powered machinery exploded.

Now → Electrification and UAV growth are driving demand for components like batteries, which are largely controlled by China and at increasing risk of tariffs and blockades. This creates new opportunities to build marketplace infrastructure for “democratic supply chains” and better supply chain coordination

Across these different pathways, we think successful companies will share several characteristics:

  1. Natural capital efficiency and organic growth – Sharper focus avoids growth-at-all-costs capital strategies and expansion plans, fostering a more sustainable model for sequencing market opportunities.

  2. Rational competitive landscape – Perceived (initial) market sizes typically don't justify massive capital deployment by new entrants or existing players, while technical expertise and regulatory requirements create genuine barriers and, in some cases, help companies aggregate a portfolio of “sub-scale monopolies”.

  3. Value accrues to expertise (i.e. Process Power) – Deep knowledge of specific systems, regulations, or technical requirements becomes more valuable as complexity increases and companies either work across a broader segment of the overall value chain or integrate deeper into customer operations. 


1. The EDA market is one of the best examples of this. Companies like Cadence and Synopsys are both worth ~$80b and relatively insulated from competition because their TAM (as a % of the overall semiconductor market) and their cost (as a % of the overall semiconductor design and fabrication process) are small. From NZS Capital:

“As they're successful, they're able to layer on these new businesses that are really additive to the overall business. So they may not even be increasing in price, in a lot of cases, just selling more functionality, because chip designers need it. And it's a really important point to underscore that we're talking about this 550 billion TAM of semiconductors, and the TAM of devices on top of that is another step function. It's being enabled by this sort of 10 billion EDA TAM. It's really small, when you think about what they're delivering.”

“But the idea that more EDA could come in-house over time, it just seems really unlikely to me, in part, because it's just not a huge pain point for the customer. It's 2% of their sales, and they just get so much value for what they're giving, versus the effort to re-engineer all this stuff that's been created over the last few decades.”

2. Much like last decade, when being the Uber or Airbnb for X was an unlock for high-priced early financing, the same is true today of companies promising to become the Anduril or Palantir for X.

3. This relates to our thinking on AI-enabled asset ownership/buyout opportunities.


Thesis

Transforming Clinical Trials

How can we massively speed up the timeline – and reduce the cost – of bringing new drugs to market?

January 2025

While the interplay of AI and better data is (finally) beginning to deliver on the potential of dramatically expanding the therapeutic opportunity space, these breakthroughs risk being stranded or significantly delayed without a transformation of the clinical trial process.

We believe several factors have converged to create an exciting ‘why now’ for companies building new clinical trial infrastructure.

  1. The post-COVID regulatory environment and evolved operating procedures have created a unique window for reimagining clinical trials. 

  2. Remote monitoring, decentralized trials, and real-world evidence have moved from fringe concepts to validated approaches.

  3. The explosion in AI-discovered therapeutic candidates is creating pressure to modernize trial infrastructure for both human health and economic reasons – it is estimated that the cost of clinical trial delays can be on the order of millions of dollars per day.

Our initial hypothesis is that winning companies will possess the following characteristics.  

  1. Vertically integrated, building parallel infrastructure instead of patching the existing system. The complexity and interconnectedness of clinical trials mean that point solutions will struggle to drive meaningful change. For n-of-1 companies to exist in this space they need control over the full stack – from patient recruitment through data collection and analysis. This approach is about more than technological self-determination. It also positions companies to innovate on the financial model of clinical trials towards better alignment among all of the key stakeholders (i.e. risk/upside sharing).

  2. AI (and post-COVID) native, designing their processes around modern capabilities rather than retrofitting them onto legacy approaches. This means leveraging AI for everything from protocol design to real-time monitoring while embracing decentralized/hybrid trials and remote data collection as first principles rather than accommodations.

  3. Built to capture the growth of AI-driven drug discovery (i.e. new companies) rather than competing for share in the traditional clinical trial market. This allows them to sidestep entrenched competitors to work with customers operating with the same true north of speed and technical advancement.


Thesis

Off-Road Autonomy

Reversing this physical world stagnation represents one of the largest economic opportunities of the coming decades.

January 2025

Infrastructure Tech

The Western infrastructure crisis is about more than aging bridges and roads (and elevators) – it's about our capacity to build, maintain, and modernize the physical systems that underpin productivity, economic growth, and strategic sovereignty. From critical mineral extraction for the energy transition to military logistics modernization to the massive manufacturing capacity needed to achieve reshoring objectives, we face unprecedented demands on systems that have seen little innovation in decades.

Reversing this physical world stagnation represents one of the largest economic opportunities of the coming decades. This is reflected in our work from several angles – most notably our investments in KoBold and Gecko, and through category research into energy infrastructure, sustainable construction, and defense.

It is easy to blame this stagnation on a lack of investment or an absence of vision among industrial (and bureaucratic) operators. But these are symptoms of the fact that physical world modernization – both digitization and automation – is not a monolith, and the vast majority of the work that needs to be done is a fundamentally harder problem than commonly understood.

The environments where we have most significantly slowed down and thus where we most need automation – sectors like construction and defense as well as settings like logistics yards – are characterized by high situational diversity: dynamic conditions, variable tasks, and diverse equipment fleets that often stay in service for decades. While continuous process industries like chemicals and manufacturing have made significant strides in automation, these high-diversity environments have remained stubbornly resistant to transformation.

Automating heavy industrial vehicles – earthmovers, mining equipment, military Humvees – represents an important step to mastering these environments and fundamentally transforming the productivity equation in these industries. While much of the discussion around physical world automation has centered on robotics or on-road consumer autonomy (Waymo, Tesla, etc.), these vehicles sit at the intersection by unlocking both autonomous mobility and task execution/manipulation capabilities. They are the workhorses of our industrial system, will continue to be for a long time, and are just now starting to become equipped for autonomous operation. 

"Today you have a few thousand [autonomous] vehicles in mining, you have a few hundred vehicles in ag, you have dozens of vehicles in other verticals. I think we're really at the starting line now. Ag, for example, is nearly 3 million tractors. Obviously only a small percentage of those are big enough or productive enough to be automated. In construction equipment there's a million plus units. You look at something like mining, there's something like 60,000 dump trucks. So those are your upper bounds. But today the biggest successes are in mining where you've got north of a thousand units deployed, which, when you compare to on-road, is in a similar realm." – Sam Abidi, Apex Advisors

Technology Tipping Points → Our robotics research leads us to believe that the category is approaching (or reaching) technological tipping points on several fronts. While on-road autonomy has focused on well-marked roads and predictable conditions, industrial autonomy faces fundamentally different challenges. These environments demand systems that can handle unstructured terrain, weather variations, and complex interactions between vehicles, machines, and humans.

Several technological advances are converging to finally make this possible: 

  • Vision-language-action models (VLAMs) and advanced perception systems that can understand both geometric and semantic elements of complex environments

  • Mapless localization capabilities that enable adaptation to rapidly changing conditions without relying on pre-existing maps

  • Improved sensor fusion that can differentiate between traversable elements (like foliage) and true obstacles while understanding surface characteristics

  • Edge computing architectures designed specifically for ruggedized, industrial deployment

  • Robotic hardware improvements (e.g. dexterity) that can be incorporated into autonomous systems to unlock end-to-end operational capacity.

Talent and Capital Momentum → Along with the technological building blocks for this category, the talent seeds were planted over the last decade as capital and big visions fueled the first wave of autonomous vehicle company building. Frustrated by autonomy regulation and other bottlenecks, founders and engineers started to look for opportunity areas where product roadmaps – and commercial models – could be realized in 2-3 years rather than a decade. This led many to off-road autonomy – despite the much smaller TAM – and has led to a flurry of company formation and funding in the space. 

Investibility – From Action Layer to Systems of Collaborative Intelligence → Building on our thesis in vertical robotics, we see retrofit industrial vehicle autonomy as a powerful near-term lever for modernizing infrastructure. The economics are compelling: retrofit solutions can deliver substantial cost savings versus new autonomous vehicle purchases while allowing customers to preserve their existing fleet investments, which often have 15-20+ year lifespans.

We see a clear sequence for how companies build defensible positions in this category:

1. Action layer as a go-to-market wedge:

  • Target 80-90% automation of common tasks while preserving human oversight

  • Lead with collaborative service model combining autonomy systems with expert support

  • Focus on high-ROI use cases where service model can support end-to-end execution

2. Systems of Record

  • Proprietary datasets around vehicle performance, environmental conditions, and task completion

  • Fleet management and analytics capabilities that span multiple vehicle types/brands

  • Data-driven maintenance and operations optimization

3. Systems of Collaborative Intelligence

  • Coordination and resource planning across operators, vehicles, and robotic systems

  • Serve as integration layer for next-generation capabilities, whether built internally or via partners

  • Consider deeper integration (going beyond retrofitting) to increase system-level advantages


This follows the progression we expect to see Gecko take in the data-driven (and increasingly automated) inspection market and is being proven out now by off-road autonomy companies like Outrider, which has expanded from electric yard trucks using a patented robotic arm to a full suite of site infrastructure and logistics operations management systems. It is worth noting that we believe this same sequencing may not hold when selling to militaries, which tend to be more concerned about vendor lock-in and are thus less receptive to “operating system” style offerings.

Still, we believe companies operating purely at the "action layer" will have limited long-term defensibility and will need to uplevel their capabilities over time. The path forward also likely includes hybrid models – as evidenced by Caterpillar and Teleo's approach of using remote operation as a bridge to full autonomy, allowing skilled operators to work from anywhere while systematically identifying repetitive tasks suitable for automation.

This progression allows companies to build trust through immediate value delivery while laying the foundation for deeper workflow transformation. The key is maintaining the flexibility to evolve alongside customer needs and technological capabilities rather than forcing premature standardization.

We are particularly interested in companies targeting:

  • Heavy industrial operations (construction, mining, and agriculture, depending on use case) where environmental variability is high but equipment standardization is low.

  • Military and defense logistics, which require operation across diverse terrain with mixed vehicle fleets.

  • Port and industrial yard operations, where dynamic routing and complex interactions between machines and humans are the norm.

This thesis faces two primary risks. First, a breakthrough in robotics foundation models could make the retrofit/incremental approach less compelling, though our discussions with leading robotics companies suggest they are not underwriting dramatic commercial-level breakthroughs on even a ~5-year horizon. Second, growing concerns about AI's impact on employment could spark regulatory pushback, though acute labor shortages in these industries create powerful countervailing forces.

Overall, we believe the combination of sensing, decision-making, and physical execution in high-variability environments represents an attractive wedge to become industrial operating systems in several categories.

July

Thesis

Personal Security

The traditional concept of security, once firmly rooted in the domain of the state, is undergoing a significant transformation.

January 2025

Fundamental Consumer

The traditional concept of security, once firmly rooted in the domain of the state, is undergoing a significant transformation. Individuals are increasingly taking responsibility for their own safety and well-being, driven by a confluence of factors, including rising crime rates, the proliferation of cyber threats, and a growing awareness of the limitations of state-provided security in the digital domain. This shift is particularly evident in the digital realm, where the rise of sophisticated AI-powered scams and the increased abundance of personal data online (both shared knowingly and unknowingly) and its value have created a new era of individual responsibility. We believe that as individuals become more proactive in managing their own security, the personal security market is poised for significant growth, offering a wide range of opportunities for companies that can provide innovative and effective solutions.

This shift manifests in the proliferation of data breaches and spam calls, which have become a major concern for individuals and businesses alike. In 2023, approximately 56 million Americans lost money to phone scams, with total losses reaching an estimated $25.4 billion annually. These scams often involve impersonating loved ones or authority figures, leveraging highly personal information to solicit urgent financial assistance or sensitive information.

This is exacerbated by the fact that scams and misinformation campaigns will only become more sophisticated from here on as they leverage AI-powered voice cloning and deepfake technology. This sets off what we often refer to as an evolutionary arms race between the deceiver and the detector. In this environment of heightened risk and uncertainty, individuals are taking a more proactive approach to their security.

Moreover, as societies become more polarized, personal information becomes easily accessible, and doxing becomes more prevalent, we see this sense of perceived risk also spilling over into the real world.

We believe that the opportunity can take various forms. From cutting-edge digital identity protection and deepfake detection solutions to physical home security platforms, personal security companies are leveraging technology to empower individuals and provide a sense of control over their safety and well-being.

July

Thesis

The Robotics Smiling Curve

Embodied AI reallocates value from hardware to intelligent foundation models and specialized vertical solutions, fueling leaps in productivity across complex tasks.

January 2025

Infrastructure Tech

Where will value flow as embodied AI takes off?

We are convinced that AI, deployed in robotics systems with the unconstrained ability to navigate and interact in the physical world, will be one of the biggest unlocks of productivity and abundance in our lifetime. The convergence of tumbling hardware costs, breakthroughs in AI, and mounting pressure for automation across critical industries has created an unprecedented opportunity for transformation in how physical tasks are performed.

What started 50+ years ago with the optimization of rote industrial tasks has evolved through distinct phases: first, the automation of controlled, repetitive workflows like warehouse pick-and-place operations, and now, the potential to handle end-to-end responsibilities in complex, multi-dimensional environments—from factory floors to healthcare facilities to homes.

This evolution comes at a critical juncture. Labor shortages in key industries, aging populations, and shifting supply chains in response to climate change and geopolitical pressures have created an urgent imperative for modernization. In industrial settings, where ROI drives decision-making, robotics solutions are already catalyzing larger automation budgets. In consumer settings, where emotional factors play a larger role, mounting evidence (e.g. Waymo adoption) suggests growing readiness for automation in everyday tasks.

As with any market opportunity, we are interested in understanding which technological and commercial capabilities are most scarce (and thus most valuable) and along with that, which parts of the value chain emerging companies are best positioned to win. 

Technological Tailwinds

The massive talent and capital flows into robotics over the past few years have been catalyzed by an unprecedented convergence of technological breakthroughs. This convergence is moving robotics from a hardware-centric paradigm (led by companies like ABB and FANUC) to one where intelligence and deep workflow integration capabilities drive market power.

At the core of this shift is the emergence of multi-modal foundation models that sit at the intersection of language understanding, vision perception, and spatial awareness. As DeepMind's Ted Xiao observed in his survey of 2023's breakthroughs, we're witnessing not just technological advancement but a philosophical transformation: "a fervent belief in the power of scaling up, of large diverse data sources, of the importance of generalization, of positive transfer and emergent capabilities."

This shift is backed by technological progress across several dimensions:

  1. Transformer architectures have opened entirely new possibilities for how robots process and act upon information from the physical world. Projects like Google's RT-X and RT-2 and TRI's work on General Navigation Models demonstrate the potential for end-to-end, general-purpose automation of dynamic physical interactions. These advances are particularly powerful in their ability to turn abstract concepts ("verbs") into context-specific actions – understanding, for instance, the crucial differences between opening a door and opening a phone.

  2. The hardware equation is rapidly shifting in favor of commoditization and widespread deployment. The emergence of cheaper, modular components across perception (cameras, radar, lidar), control (motors, actuators), and power systems is making the economics of cognitive robotics increasingly viable. Companies like Unitree are demonstrating how quickly hardware capabilities can advance when paired with improving intelligence layers. Perhaps more importantly, as these intelligence layers improve, robots can achieve more with simpler hardware configurations – a virtuous cycle that further improves deployment economics.

  3. Advances in computing infrastructure, both in cloud environments for heavy workloads and at the edge for real-world autonomy, have expanded the frontier of possible applications. This is complemented by breakthroughs in simulation, synthetic data generation, and cross-embodiment learning that promise to help robotics overcome its historical data scarcity challenges.

However, these tailwinds – and the ability for companies to defend technological advantages – are not evenly distributed across the value chain. For this reason, we believe the Smiling Curve is a useful framework for understanding where and how value will accrue in embodied AI.

In short, we see the most value flowing to i) foundation/world models that can generalize across tasks and embodiments and ii) specialized applications that can leverage these capabilities to solve high-value problems in complex domains. The traditional middle of the value chain – hardware manufacturing and systems integration – faces increasing pressure as intelligence becomes more important than mechanical sophistication. Similarly, data generation, labeling, and processing will also face downward pressure as big tech companies with ample access to data seek to drive commoditization to benefit other parts of their business (in robotics and beyond).

This creates two paths through which we believe emerging companies have the biggest advantage in sustainably creating value.

Robotics Foundation Models

Robotics foundation models have the potential to be the operating systems and action layer for the physical environment, transforming commodity hardware into real-world agents.

For RFM companies, we see “data gravity” as a key to success – the ability to create self-reinforcing loops where model improvements drive adoption, which in turn generates more valuable training data. Unlike language models, which could draw on the vast corpus of human-generated text on the internet, robotics models face a fundamental data scarcity challenge. Outside of self-driving vehicles, no one has accumulated the volume of real-world interaction data needed to train truly general models.

This scarcity creates a unique strategic opportunity. A company that can solve the data acquisition challenges through strategic partnerships and deployment models will build powerful network effects. As their models improve, they become more valuable to hardware partners and application developers, generating more deployment opportunities and thus more data – a virtuous cycle that becomes increasingly difficult to replicate.

Vertical Robotics: Deep Integration and Domain Expertise

At the other end of the curve, we see compelling opportunities for companies that can deeply embed robotics capabilities into important workflows in critical industries. These companies succeed not through general-purpose intelligence, but through their ability to solve complex, high-value problems. 

We believe vertical robotics approaches are most valuable where:

  • The workflows governing interactions between robotics and operational systems are highly complex

  • Social dynamics and regulatory requirements favor trusted brands with deep domain expertise

  • The cost of failure is high, creating strong incentives to work with specialists

  • Domain-specific data creates compounding advantages that are difficult for generalists to replicate

Companies like Gecko Robotics (July portfolio company) in industrial inspection exemplify this approach. Their competitive advantage stems not from robotics capabilities alone, but from the domain-specific meaning they extract from collected data. This creates a different kind of data moat – one built around understanding the nuances and edge cases of specific applications rather than general-purpose interaction. It also creates a wedge to expand deeper into a customer’s operations, both via increasingly intelligent workflow tools and more advanced robotics solutions. In addition to inspection, categories like defense & security and construction represent prime areas for vertical solutions to create value. 

Vertical robotics opportunities also force us to consider whether emerging companies or incumbents are best placed to succeed. Despite the massive amounts of capital invested in logistics and warehouse robotics in recent years, outcompeting Amazon, which has famously externalized many of its cost centers into massive businesses to the detriment of venture-backed competitors, is a tall order. Likewise, consumer distribution and brand advantages held by companies like Amazon and Meta place most new companies at a significant disadvantage.

The Interplay Between RFMs and Vertical Solutions

We also believe there is significant potential for interaction between companies at the two ends of the curve; e.g. Gecko integrating a model from Physical Intelligence. Vertical solution providers can become valuable data partners for foundation model platforms, providing real-world interaction data from high-value use cases. Foundation model platforms, in turn, can help vertical solutions expand their capabilities without massive R&D investment in core robotics intelligence.

July

Thesis

Frontline Audio and Video

Next-generation platforms that combine AI-powered language understanding with advanced audio-video capture are set to revolutionize frontline work by transforming raw field data into trusted, industry-wide operating systems.

December 2024

Industry Transformation

Only a few years ago, during a tour of a publicly traded aerospace company's new maintenance facility, an executive pointed out several innovations: automated tool checkout, more advanced safety equipment, and a (physical) file room located closer to the operating floor than ever before. That this last feature was included is telling. The feedback loops between frontline action and data input are central to the operations of many industries – manufacturing, policing, and trade services of all varieties (from plumbing to solar installation). Key elements like pricing estimates, project timing, and resource requirements are often functions of what workers are observing in the field or on the factory floor.

Despite comprising the majority of the global workforce, frontline workers have been largely left behind by technological transformation. The inefficiencies are stark: law enforcement officers spend up to four hours per shift on documentation, with 96% reporting these demands keep them from core duties. In 2021, nearly three-quarters of frontline workers were still using paper forms. But workers are ready to adopt new solutions: in manufacturing, 93% of workers believe software tools help them perform better, and 96% would be willing to accept increased data monitoring in exchange for benefits like improved training and career development.

The convergence of several forces is creating an unprecedented opportunity to reshape frontline work and fundamentally change how operational knowledge is captured and leveraged. Advances in language understanding mean systems can now adapt to how workers naturally communicate, uncovering deeper context without forcing rigid input structures. Improved video processing and computer vision adds meaning to streaming footage, while ubiquitous mobile devices and sensors enable both active and passive capture (which also contributes to a safer – hands-free, eyes-up – working environment). The maturation of retrieval-augmented generation (RAG) technology makes it possible to connect this unstructured frontline data with existing knowledge bases – from maintenance manuals to captured tribal knowledge – creating powerful feedback loops between observation and action.
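The RAG loop described above can be sketched in a few lines. Everything here is a toy assumption: the knowledge base, the field notes, and the bag-of-words retrieval stand in for the embedding models and LLMs a production system would actually use.

```python
# Sketch of a retrieval-augmented loop for frontline notes (illustrative only).
from collections import Counter
import math

# Hypothetical knowledge base: maintenance manuals and SOPs.
KNOWLEDGE_BASE = {
    "pump-manual": "Check seal wear monthly; replace gasket if leaking.",
    "safety-sop": "Lockout tagout before opening any panel.",
    "hvac-manual": "Clean condenser coils quarterly to avoid overheating.",
}

def _vec(text):
    # Crude bag-of-words vector; a real system would use embeddings.
    cleaned = "".join(c if c.isalnum() else " " for c in text.lower())
    return Counter(cleaned.split())

def _cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(field_note, k=1):
    """Rank knowledge-base documents by similarity to a worker's field note."""
    q = _vec(field_note)
    scored = sorted(KNOWLEDGE_BASE.items(),
                    key=lambda kv: _cosine(q, _vec(kv[1])), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

def build_prompt(field_note):
    """Augment the raw note with retrieved context before it reaches a model."""
    context = "\n".join(KNOWLEDGE_BASE[d] for d in retrieve(field_note))
    return f"Context:\n{context}\n\nField note: {field_note}\nSuggested action:"
```

The point of the sketch is the feedback loop: a spoken or typed observation from the field is matched against institutional knowledge, and the combined context drives the suggested action.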

The winners in this space will build trust by solving acute pain points in documentation and training, then expand to become essential operating infrastructure for their target industries. We see distinct opportunities across market segments. For SMBs – independent trades (“America’s new millionaire class”), farms, medical practices – these solutions can function from day one as a sort of COO and assistant, both improving operations and increasing enterprise value by making tribal knowledge transferable in eventual exits. For larger companies with field forces, manufacturing operations, or driver fleets, these tools accelerate training, surface best practices, and build operational continuity.

In both cases, we believe frontline audio and video capture will serve as the data wedge to become the system of record and intelligence for entire operations. Winners will need vertical focus – the needs of a solar installer differ meaningfully from those of a manufacturer or farmer. Trust and deep industry understanding are critical, as these companies will increasingly look to serve as the automated action layer for their customers, with business models that reflect the value they create (i.e. outcome-based pricing). The platforms that successfully capture and leverage frontline insights won't just become systems of record for individual companies – they'll emerge as the operating systems for entire industries, fundamentally reshaping how skilled frontline work gets done.

July

Thesis

Precision Wellness

Better health outcomes—delivered at lower costs and with greater accessibility—are fundamental to economic growth and human flourishing.

December 2024

Fundamental Consumer

Better health outcomes—delivered at lower costs and with greater accessibility—are fundamental to economic growth and human flourishing. Preventative healthcare represents our largest lever to unlock better outcomes at scale. However, the centralized control, opaque incentives, and high friction that characterize today’s healthcare system hold back progress. It is not built with technological advancement in mind and fails to meet the standard of experiences consumers have elsewhere. 

As the prevailing model fails to evolve, a new paradigm—precision wellness—is emerging. This transformation mirrors the forces that transformed media, finance, and commerce by redistributing the power over the experience to individuals. From top-down institutional mandate to bottom-up iteration, from one-size-fits-all solutions to hyper-personalization, from controlled to in control.

The wellness-driven consumer is at the center of this shift. Motivated by the same “divine discontent” that has continuously sparked consumer innovation across the economy, their demands for scientific rigor and an elevated user experience are accelerating the growth of the precision wellness opportunity. 

  • The next phase of GLP-1 adoption, perhaps the most important catalyst of this overall opportunity, appears increasingly driven by consumer-centric companies; 

  • The vast array of cheap, passive sensors integrated into phones, watches, and headphones creates longitudinal data that was previously unavailable, while clinical-grade modalities on consumer devices build trust in health-focused technology and reorient expectations toward continuous, rather than episodic, monitoring and intervention; 

  • The "mainstreaming of biohacking" is evident in the adoption of CGM among non-diabetics, the growth in advanced biomarker testing, whole genome testing, full-body MRIs, and the increasing demand for personalized, science-driven health optimization protocols.

As more people experience the feedback loops of better health and deeper health understanding – for themselves and those around them – their engagement compounds. This flywheel effect, combined with traditional healthcare's eroding monopoly on trust and access, creates a strong why now for emerging companies capable of integrating science, technology, and brand building. 

We also recognize that precision wellness has a significant blast radius effect, with aggregators, especially Apple, at the center. Data gravity, vast resources, and an incentive to commoditize complementary solutions make it unwise to compete directly. Thus, we are most interested in companies building non-device-centric models for distributed discovery, diagnostics, and delivery. This includes:

  • Next-gen healthcare providers integrating novel diagnostics and data collection into full-service care delivery (going beyond simply digitizing traditional models).

  • Knowledge networks (content + community + coaching) that use personalized insights to help guide users through specific niches of their precision wellness journey, creating a layer of trust in a consumer area that can be overwhelming due to a low signal-to-noise ratio.  

  • Companies using biological insights, often via at-home testing modalities, as a wedge to build up proprietary data sources, trusted brands, and communities.

July

Thesis

Energy Grid Data

I mean, data is in vogue now, and people are really kind of a bit obsessed with data and data companies.

November 2024

Industry Transformation

As we move to more renewable energy production and electrify our consumption, the problems we must solve to modernize the grid are becoming more complex. This need is further amplified by strong demand growth from the build-out of AI data centers. High-quality data is crucial infrastructure for understanding the electric grid and making the most impactful decisions when operating and investing in it. We believe there is substantial value in unlocking access to such data, from avoiding grid outages due to overload to increasing the ROI of maintenance and new investment decisions.

At the same time, there are substantial issues associated with accessing quality data on the U.S. power grid: 

Fragmentation
The grid is divided into regional entities, such as the Eastern, Western, and Texas Interconnections, managed by various utility companies, independent system operators (ISOs), and regional transmission organizations (RTOs). 

Lack of Standardization
This fragmentation leads to diverse data sources and inconsistent reporting practices, making it difficult to compile comprehensive, high-quality data.

Non-centralized energy sources
Additionally, the rise of distributed energy resources (DERs) like solar panels and electric vehicles adds complexity. Data on these resources is often fragmented and incomplete, complicating grid balancing and forecasting efforts.

Privacy and security
Concerns around privacy and security restrict access to detailed grid data, as releasing such information could expose vulnerabilities to potential threats.

While several initiatives by government agencies and NGOs (e.g., NREL, IEA) to address the abovementioned challenges have been underway, none has yet delivered easy, open data access to the market.

Therefore, we see a unique opportunity in a dedicated effort to aggregate the various data sources and make them available in a standardized format via API and application. The availability of such data can underpin a wide array of new applications and use cases that require it (e.g., applying reinforcement learning-based optimization to the grid) or can be substantially improved by it. In short, we see an exciting opportunity for the company that can aggregate and maintain the highest quality grid data to be the nexus of an emerging ecosystem.
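To make the aggregation idea concrete, here is a minimal sketch of the kind of normalization layer such a company might build: per-source adapters that map heterogeneous records onto one common schema. The source names, field names, and schema are purely illustrative assumptions, not the real ISO/RTO feeds.

```python
# Illustrative sketch of a grid-data normalization layer (field names invented).
from dataclasses import dataclass

@dataclass
class GridReading:
    region: str          # e.g. "ERCOT", "PJM"
    timestamp_utc: str   # ISO 8601 string
    load_mw: float       # system demand in megawatts

def from_ercot(raw: dict) -> GridReading:
    # Hypothetical ERCOT-style payload; not the actual feed format.
    return GridReading("ERCOT", raw["deliveryDate"], float(raw["systemLoad"]))

def from_pjm(raw: dict) -> GridReading:
    # Hypothetical PJM-style payload; not the actual feed format.
    return GridReading("PJM", raw["datetime_beginning_utc"],
                       float(raw["instantaneous_load"]))

ADAPTERS = {"ercot": from_ercot, "pjm": from_pjm}

def normalize(source: str, raw: dict) -> GridReading:
    """Route a raw record from any supported source into the common schema."""
    return ADAPTERS[source](raw)
```

The value accrues in the adapter layer: every new source added makes the standardized API more complete, which is what lets downstream applications (forecasting, RL-based optimization) build on one schema instead of dozens.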

July

Thesis

Nature Intelligence

We have been inspired by the field of digital bioacoustics ever since being introduced to this field through Karen Bakker’s work.

November 2024

Infrastructure Tech

We have been inspired by the field of digital bioacoustics ever since being introduced to this field through Karen Bakker’s work. We believe there are a few factors that drive the emergence of this field. For one, sensors are becoming smaller and cheaper while edge processing and memory capabilities increase. The broadened availability of these sensors has led to an increase in domain-specific physical world data – a recurring theme in categories we get excited about – that can be coupled with complementary data sources. Coupled with algorithmic breakthroughs, this data can be used in a host of interesting cases: 

  • Biodiversity monitoring: We believe that biodiversity is a crucial cornerstone of a climate-resilient ecosystem and world. Tracking biodiversity in a cost-effective and accurate way has a clear ROI for a host of different stakeholders. Bioacoustics augmented with different data sources seems to be an attractive way to achieve this. We see an opportunity to create an objective standard around this kind of data that can be critical to unlocking the emerging commercial ecosystem.

  • Optionality in collecting novel Nature Data: As we collect more data about our ecosystems, we will see emergent use cases for this data. 

    • We see a world where enough data on ecosystems is collected so that we can predict the trajectory of an ecosystem and take the measures/actions to maintain it. Potentially, this could enable the fast regeneration or creation of novel and healthy ecosystems from scratch.

    • Building more sophisticated bioacoustic models can allow us to develop a more granular understanding of the natural world (e.g., tracking the healthiness of individual plants or animals vs. entire ecosystems), which will drive novel use cases in agriculture and beyond.

    • We have been excited about human-to-animal communication for a while and have been following the work that organizations like the Earth Species Project are doing. While concrete use cases will likely only emerge as we develop these models and understand their capabilities and limitations, proven applications such as guiding bees and deterring elephants from entering farms already show promising signs of impact and ROI.

    • As followers of the Santa Fe Institute, we are convinced that interdisciplinarity in building complex systems is conducive to human advancement. Developing a deeper understanding of nature’s complex ecosystems to inspire our man-made systems in novel ways holds significant upside. This is the core thesis behind our investment in Sakana AI.

    • We see the potential for bioacoustic data to resonate with consumers. For example, users could listen and interact with ecosystems (e.g., their local forests).

We see an exciting opportunity in an orchestrated commercial effort to bring the research from recent years into the field and deepen our understanding of nature and the positive upside that comes with that.

July

Thesis

AI Movie Workflow Suite

AI video content creation will likely diverge into two paths.

November 2024

Industry Transformation

AI video content creation will likely diverge into two paths: high-quality productions that capture and create wider cultural moments, and lower-quality, personalized content. Consumers are expected to value both types, making tradeoffs between production quality and personalization based on their needs.

High-Quality AI-powered Content – We believe that world-class creative talent is attracted to tools and places that enable them to realize their creative ambitions. Given AI's economics and possibilities in the creative process, it will become an indispensable tool for the best creators. We appreciate that AI models today cannot, on a standalone basis, generate world-class content on par with Hollywood-grade productions. We believe that the foreseeable future will require holistic tools that enable outstanding creative talent to tell great stories with captivating visuals. Therefore, we see a unique opportunity to marry the capabilities of the most advanced AI models (across relevant layers) with an interoperable software and workflow suite.

We believe there is substantial economic value and options associated with successfully building out such a suite:

  • An AI-powered suite can wedge its way into a software market that has seen little innovation. As talent makes the availability of such solutions a key factor in choosing whom to work with (e.g., studios), most major studios will likely have no choice but to adopt the solutions demanded. If played correctly, such an AI-enabled suite can replace existing tools and, over time, set new standards.

  • We see opportunities to selectively go end-to-end and enable the build-out of a full-stack AI-enabled movie studio/production company. 

  • We see substantial opportunities to expand into other mediums (e.g., gaming).

Low-Quality AI-powered Content – On the other side of the spectrum is lower-quality, highly personalized, rapidly produced content that can be generated by small creators and, ultimately, by the user (either actively or passively based on preferences). This will not require dedicated workflows with large aggregators of consumer attention (e.g., Netflix, Meta, YouTube) but instead will be captured by companies uniquely positioned to democratize easy access to video generation models, automated content aggregation, and distribution.

From a venture perspective, we are especially excited about the opportunity associated with the former but believe there will be large companies built in the latter where emerging companies can identify and engage high-value niches that fall outside the core focus of existing platforms (e.g. sports).

July

World View

Consumer AirSpace and Bubbles

There is a palpable sense that we are in for a major disruption of the way we currently spend our time and money.

October 2024

Fundamental Consumer

Working Thesis
There is a palpable sense that we are in for a major disruption of the way we currently spend our time and money. There are a few underlying trends (some of them might appear at odds with each other):

Consumers are increasingly living and consuming in two spaces that are drifting apart: 

Increasingly Homogeneous AirSpace
Globalisation and innovations in mass production and marketing gave rise to global consumer brands and the first wave of a globally flattened culture. The internet put this on steroids - the same memes, music, and clothes are available almost instantly everywhere. The experience economy, initially a backlash against this homogenisation, has been commoditised. Uber from the airport to your similarly designed Airbnb, whether in Miami, Mumbai or Marrakesh. Scale wins, and to achieve that scale you have to work with social media and search engine algorithms, which tend to surface the most mainstream goods and content (because it is the least risky and most profitable), thereby reinforcing that mainstream for consumers. The same is happening in film, where studios are increasingly focusing on mainstream features. We use the term AirSpace coined by Kyle Chayka for this phenomenon of increasing homogeneity.

We expect the emergence of generative AI to further reinforce the unification of mainstream content. By definition, these algorithms probabilistically create the type of content they are expected to develop based on their training data. As the cost of creating generative content comes down, this will create massive amounts of predictable content that fits squarely into AirSpace and lacks the unexpected. 

Increasingly Heterogeneous Personalized Bubble
At the other end of the spectrum, there is a strong trend towards individualised content consumption. Due to the abundance of on-demand content (e.g. Spotify, Netflix), there is a shift towards consuming content on demand and in a highly personalised way. While there are benefits to this type of content consumption, it also makes the content that each of us consumes predictable, as our individual consumption preferences are understood and reinforced by recommendation algorithms. 

As a result, our shared cultural fabric, which is an important medium through which we connect with each other, is being eroded. For example, in its final season in the late 90s, Seinfeld was consistently the number one show on television, averaging 22 million viewers per episode, who watched the episode simultaneously and discussed it in the office the next day. In 2023, the most watched show was Suits, which premiered in 2011 and had its final season in 2019 - we saw it come up in zero conversations in 2023.

We expect this to increase as AI-generated content becomes increasingly viable. We see a not-too-distant future where content across all media and potentially all levels of quality is created for an audience of N of 1, highly tailored to each individual's preferences. 


What we believe to be true about human psychology and sociology
People like trends and the comfort they bring. So AirSpace is not bad and will continue to exist. However, there is likely to be little room for innovation; large aggregators exist (e.g. Meta, Google, Airbnb) and will continue to monetise this in the best possible way.

Humans like to consume the content they enjoy, and that reinforces their bubble. The more personal, the better. Hence, the Personalized Bubble is not bad. We expect this to get much weirder from here as application developers and consumers lean into AI-powered use cases. Character AI was chasing this opportunity, but a team of former Google researchers was unlikely to embrace the weirdness. 

People like to consume authentic, unique things. However, much online content lacks authenticity/quality/care and is predictable. Gen AI is the straw that breaks the camel's back as the cost of content creation trends towards zero (or the cost of computing). 

As a result, there has been a noticeable shift in how large parts of our digital lives are moving either to group chats (which can act as a curation layer for the noise) or back to IRL in the case of dating (e.g. running clubs in NY or supermarkets in Spain). We also see this shift playing out beyond content and relationships. We believe that people have an innate desire to consume goods that others have put care into and that are unique. As this type of content becomes less present/prominent online (e.g., due to Gen AI), we expect to see a big shift towards people consuming physical goods and experiences that have this artisanal aspect, are unique or ephemeral, such as pottery, handmade clothing, leather goods, live concerts, etc. This is great for brands like Hermes, which have kept craft at the heart of their leather business. It's also great for live performing artists (and their ecosystem), local artisans, etc. 

Humans crave shared cultural experiences. These experiences must be unexpected yet rooted in whatever shared cultural fabric is left; they must shatter the confirmatory AirSpace and transcend our personalized Bubbles. Achieving this in a repeatable fashion requires a deep understanding of the Zeitgeist and the ability to turn it inside out in unexpected ways that deeply resonate with a society's (or sub-group's) shared cultural fabric.

Opportunity Areas
Substantial innovation will occur in the context of AI-enabled personalized experiences. We are excited about this opportunity and are looking for companies exploring envelope-pushing form factors and ideas that are borderline fringe today.

As the AirSpace and the Bubbles continue drifting apart – becoming more homogeneous on the one hand and more heterogeneous on the other – there will be substantial value in creating these types of experiences in a repeatable fashion. Studios like MSCHF and A24 have done a great job of this.

July

Thesis

Intelligence-Enabled Marketplace

We see an exciting opportunity for AI-enabled marketplaces to emerge.

October 2024

Infrastructure Tech

Working Thesis

We see an exciting opportunity for AI-enabled marketplaces to emerge. While there are many opportunities for AI to enhance marketplaces (see the good NfX write-up on the topic), we are especially interested in situations where AI-enabled processes are in a reinforcing interplay with data advantages, leading to a sustained higher value proposition (i.e., better matching) in the marketplace (see graph below).

As outlined above, there are two interconnected feedback loops at play: 

  1. Using LLMs and VLMs to collect the right proprietary data at scale (i.e., conduct interviews, ingest large documents, understand client requirements using document upload, etc.).

  2. Using fine-tuned LLMs/VLMs and other ML models to better understand demand and supply, identify actions that reduce uncertainty around matching probability (e.g., follow-up questions), and carry out these actions in service of more cost-effective, higher-value matching.
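A toy simulation can make the compounding nature of these two loops concrete. Everything below is our own illustration, not data: the saturating shape of the matching curve, the 1,000-interaction scale constant, and the demand response are assumptions.

```python
import math

def match_probability(data_points: int, base: float = 0.2, ceiling: float = 0.9) -> float:
    """Toy model: matching quality starts at a base rate and improves with
    the log of proprietary data collected, saturating below a ceiling."""
    gain = (ceiling - base) * (1 - 1 / (1 + math.log1p(data_points / 1000)))
    return base + gain

def simulate_flywheel(periods: int, users_per_period: int = 100) -> list[float]:
    """Loop 1: user interactions (AI interviews, document uploads) generate
    proprietary data. Loop 2: better models -> better matching -> more users."""
    data, users, history = 0, users_per_period, []
    for _ in range(periods):
        p = match_probability(data)
        history.append(p)
        data += users                            # every interaction yields training data
        users = int(users_per_period * (1 + p))  # better matching attracts more demand
    return history

probs = simulate_flywheel(10)
assert probs[-1] > probs[0]  # the loops compound: more data -> better matching
```

The key property the sketch encodes is the interplay described above: data collection improves matching, and improved matching accelerates data collection.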

We expect businesses creating sustained value in this space to meet the following criteria:

  1. LLMs, VLMs, and other models can perform tasks to an acceptable degree (i.e., they meet a bare minimum threshold) – both on the proprietary data collection and matching side.

  2. Large amounts of unstructured data and feedback loops are useful for fine-tuning models that directly unlock economic value.

  3. Nobody has collected data relevant for training/finetuning these models at scale as there has been no economic/technological incentive to do so.   

  4. There are ways to create highly frictionless form factors using the capabilities described in 1. that allow users to interact with these platforms seamlessly and in highly personalized ways, collecting large amounts of data in the process.

  5. Initial data and model advantages can be sustained and turned into lasting moats with little risk of second movers and other market participants (e.g., incumbents with large distribution) being able to catch up. 

We see opportunities in various areas, from HR to travel to healthcare-provider (e.g., psychologist) matching – especially in scenarios where a lack of information leads to low matching rates. A few cases:

Case Study 1: Staffing

Staffing is historically incredibly time-consuming, requiring a deep understanding of both a candidate’s capabilities and the job requirements. This is very hard to scale, as quality assessment usually requires 1) reviewing materials, 2) conducting interviews to dig deeper and reviewing these, and 3) feedback cycles to understand what type of candidates the demand side actually wants (stated vs. revealed preferences). This leads to many staffing marketplaces doing a bad job of vetting demand or being very expensive, with matching rates reflecting this.

Let’s go through the criteria set out above to see whether a hiring marketplace is a good fit to become intelligent:

  1. LLMs can already review and synthesize vast amounts of unstructured data (e.g., CVs, websites). They are capable of doing the same with job requirements. They are also capable of performing job interviews to a somewhat satisfactory level. 

  2. Models and AI interviews can be fine-tuned based on desirable outcomes (e.g., matches between demand and supply), thereby adjusting their reviewing and interview capabilities. This can even happen in a customized way, given that certain parties on the demand side are large enough to guarantee a certain “offtake.” Mercor has written about this on their blog.

  3. This part is not so clear in the context of staffing. For one, there is a plethora of existing and new AI-enabled hiring tools that use AI-supported video (e.g., HireVue), and existing staffing platforms (e.g., Upwork) are rolling out video interviews, too. It is unclear to what extent these platforms have large amounts of unstructured data, combined with hiring matches, that they can use to train better models. Also, by sheer scale and distribution, these platforms should be able to generate plenty of data easily.

  4. In the segments of the economy where jobs are sought after, people are eager for the opportunity to be in the talent pool considered for specific jobs. In these cases, people are willing to share their data (CVs, etc.) and conduct AI interviews – especially if the process is smooth. Given that the demand side (aka the companies looking to hire from the talent pool) is reasonably attractive, the CAC associated with acquiring the supply and data (i.e., video interviews, CVs, etc.) should be fairly low.

    As described above, while we don’t assume AI-based matchmaking to be perfect yet, we believe that AI can be used to support increasingly efficient matching, enabling the development of a cash-flow-generating business model while data is collected and models improve.

  5. Given the dynamics described under 3, it is unclear whether an HR marketplace with an initial data advantage can sustain this advantage. What if existing platforms like Upwork roll out AI-based video interviews and start training their models? With their existing brand and supply, they should be able to generate more data than any startup substantially faster, leading to better models, etc. If not, what is a relevant quantity of data to establish a platform as the winner? Will general LLMs acquire the capabilities of finetuned models as they get better and context windows improve?

July

Thesis

Sustainable Construction

Construction is one of the world’s largest industries.

September 2024

Industry Transformation

Construction is one of the world’s largest industries. Global construction spending in 2023 amounted to some $13 trillion, 7% of global gross output. It is also one of the most unproductive sectors of the economy. Tight labor markets, regulatory complexity, and systemic fragmentation, along with cultural inertia, have contributed to stagnation and a lack of technological penetration.

This ineffectiveness does not discriminate by project size or scope. While nearly everything we touch and consume is produced in mass quantities, factory-produced homes still make up a small percentage of the overall new housing stock. Meanwhile, 98% of mega-projects experience cost overruns of 30% or more, and 77% face delays exceeding 40%. The impacts on broader economic growth are significant. Had construction productivity matched that of manufacturing over the past 20 years, the world would be $1.6 trillion – 2% of GDP – richer each year. Increasing pressure to decarbonize places additional stress on the low-margin, change-resistant industry. Through both operations (28%) and materials/inputs (11%), buildings account for nearly 40% of global emissions.

These supply-side deficiencies come against a backdrop of rapidly expanding demand – by 2040, the industry needs to expand production capacity by 70%+. This is creating a desperate, and long overdue, search for answers that we believe can only be met by a combination of technological innovation and novel production and business system design. 

While prior attempts to transform construction – most notably Katerra – have failed, several factors are converging to create a more compelling “why now” moment. Novel materials like green steel and low-carbon cement are approaching commercial viability, while mass timber innovations make building faster and less wasteful – while delivering significant carbon sequestration. Construction robotics focused on autonomous assembly, logistics, and data capture can address the labor gap. Perhaps most importantly, advances in generative design and AI-powered collaboration tools can help target the small but critical coordination inefficiencies that have historically bottlenecked progress – precisely the type of system-wide improvements that Amdahl's Law suggests are essential for meaningful transformation.
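Amdahl's Law makes the coordination argument concrete: the overall speedup from improving one part of a process is capped by the share of total time that part represents. The 20%/40% time splits below are hypothetical, purely for illustration.

```python
def amdahl_speedup(improved_fraction: float, local_speedup: float) -> float:
    """Amdahl's Law: overall speedup when only `improved_fraction` of total
    effort is accelerated by `local_speedup`; the untouched share caps the gain."""
    return 1 / ((1 - improved_fraction) + improved_fraction / local_speedup)

# Hypothetical split: design tools take 20% of project time, coordination 40%.
print(round(amdahl_speedup(0.20, 10), 2))  # 10x faster design tools -> 1.22x overall
print(round(amdahl_speedup(0.40, 5), 2))   # 5x faster coordination  -> 1.47x overall
```

Even a 10x improvement to a 20% slice moves the whole system less than a 5x improvement to the larger coordination slice – which is why system-wide coordination gains matter most.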

We believe the companies that capitalize on this moment will do so through one of two models. The first is selective vertical integration – controlling critical capabilities in materials, design, and manufacturing, but executed with greater focus and patience than previous attempts. The second is a platform approach that centralizes key material and system design and standardizes interfaces between stakeholders while allowing specialized players to focus on their core competencies – similar to how semiconductor manufacturing evolved.

Both models recognize three essential elements that must work together: First, standardized approaches to next-generation materials that maximize both assembly efficiency and carbon benefits, from green steel to mass timber. Second, digital infrastructure that enables true system-wide optimization and seamless stakeholder coordination. Third, controlled manufacturing environments that bring automotive-style productivity to strategic components, whether owned directly or orchestrated through a network of partners.

July

News

Synthesis School raises USD 12M

04/21

USD 12M to build an education system for collaborators.

Core Theme

Fundamental Industry Transformation

We pay close attention to categories where cutting-edge software (e.g., AI) is fundamentally changing and improving processes - often in the real world. This is a recurring theme across industries, and we invest in companies critical to bringing about fundamental industry transformations.

News

Sakana AI raises USD 30M

01/16

Sakana AI raises USD 30M to develop nature-inspired AI.

Research

Brett Bivens

Brett is Head of Research at July Fund. Brett is based in Annecy, France.

Portfolio Founder

Hélène Huby

Hélène is CEO & Co-Founder at The Exploration Company. Helene is based in Munich, Germany.

News

Mining Company Is Silicon Valley’s Newest Unicorn

06/20

KoBold has raised USD 195M from existing investors to accelerate our AI-backed search for critical minerals vital for preventing the most catastrophic impacts of climate change.

Partner

Jonathan Jasper

Jonathan is a General Partner at July Fund. Jonathan is based in New York City, USA.

Partner

Philipp Schindler

Philipp is Chief Business Officer at Google and a Founding Limited Partner at July Fund. Philipp is based in Mountain View, USA.

Portfolio Founder

Chrisman Frank

Chrisman is CEO & Co-Founder at Synthesis School. Chrisman is based in Los Angeles, USA.


News

The Exploration Company raises USD 160M

11/17

The Series B round will fund the continued development of the Nyx spacecraft, which will be capable of carrying 3,000 kilograms of cargo to and from Earth.

News

EUR 40M For Reusable Space Capsule

02/03

The Exploration Company Raises €40m For Reusable Space Capsule Platform, NYX.

News

Sakana AI raises USD 100M Series A

09/03

Sakana AI raises USD 100M Series A.

Partner

Florian Schindler

Florian is a General Partner at July Fund. Florian is based in Berlin, Germany.

News

The European Space Agency chooses The Exploration Company

05/22

The agency awards contracts to two companies to develop cargo services to the International Space Station.

World View

Hardware Development Tooling

Enabling the physical technology supercycle

February 2025

Infrastructure Tech

Our ongoing exploration of the hardware development stack, from AI-driven CAE to PCB automation, has consistently pointed us toward a fundamental challenge: the immense complexity of coordinating diverse tools, stakeholders, and workflows across the hardware development lifecycle. While individual design tools have evolved, the job of orchestrating these pieces – managing requirements, test data, manufacturing handoffs, and team collaboration – remains a major bottleneck.

As Western economies pour unprecedented capital into hardware innovation across aerospace, energy, and defense, an entirely new class of hardware companies is emerging. And they are building with fundamentally different expectations around tooling and development speed. The incumbent hardware solution stack fails to meet these heightened expectations – it is fragmented across systems, heavily manual, and lacks real-time visibility. 

As a result, we have seen many emerging hardware companies rolling their own solutions to handle internal and external orchestration across the development lifecycle. Stoke Space’s Fusion, an internal tool that the company later externalized, is one such effort. This trend, which we have seen inside several other companies, both signals the severity of existing tooling gaps and validates demand for better solutions.

As such, we see a Stripe-like opportunity to enable and capture a portion of the value created by this new class of companies through the type of critical, but boring, infrastructure that we have deemed “plumbers for reindustrialization” in other research.

We see three primary areas of opportunity for new companies at the orchestration layer:

Test Data & Observability: The proliferation of sensors and testing equipment has created data noise that existing tools can't handle effectively. Real-time analysis of test data, coupled with AI for anomaly detection and optimization – DevOps-like telemetry and monitoring – could transform validation processes that historically relied on manual review and tribal knowledge.
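As a minimal sketch of the DevOps-like telemetry monitoring described above (our own illustration, not any particular product's method): a rolling z-score that flags test readings deviating sharply from the recent baseline, replacing manual review of sensor traces.

```python
import statistics
from collections import deque

def rolling_zscore_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the rolling baseline by
    more than `threshold` standard deviations -- a stand-in for real-time
    anomaly detection on hardware test telemetry."""
    buf = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(buf) >= 5:  # need a short baseline before judging new readings
            mean = statistics.fmean(buf)
            std = statistics.pstdev(buf) or 1e-9  # avoid division by zero
            if abs(x - mean) / std > threshold:
                anomalies.append(i)
        buf.append(x)
    return anomalies

# A stable pressure trace with one transient spike at index 30
trace = [100.0 + 0.1 * (i % 5) for i in range(60)]
trace[30] = 150.0
assert rolling_zscore_anomalies(trace) == [30]
```

Production systems would layer learned models, multi-sensor correlation, and alert routing on top, but the core shift is the same: from after-the-fact manual review to continuous, automated screening of test data.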

Unified Data & Collaboration Hubs (Next-Gen PLM): The shift toward distributed engineering teams and expansive supply chains (e.g. via outsourcing) has exposed the limitations of current tools. Engineers spend a material amount of their time on non-value-added work like converting files, updating documents, or searching for the latest designs. Modern cloud-based hubs that unify product data (requirements, CAD, tests) could dramatically improve productivity.

Manufacturing Orchestration: The gap between design and manufacturing is a major bottleneck. Tools that automate the translation of designs into manufacturing instructions and provide real-time feedback on manufacturability could significantly reduce iteration cycles and costs.

New platforms built specifically for these emerging workflows – distributed by default, data-intensive by design, and automation-ready from the start – are naturally advantaged.

From a go-to-market perspective, focusing on emerging hardware companies helps orchestration companies avoid legacy processes and tooling and instead focus on shaping modern development workflows. These companies are building complex hardware under intense time (and capital) pressure – they need tools that can keep pace. As these tools prove their value to early adopters, they can expand both vertically (serving larger enterprises) and horizontally (connecting more parts of the development process). 

However, this means our thesis relies on this new class of hardware companies being a durable and scalable customer base. If the end game is dozens of sub-scale acquisitions and a select few successes – leaving today’s incumbent hardware OEMs as the primary market – the entrenchment of existing tooling and orchestration companies (from Siemens to Jama to PTC) will be harder to break.

Similar to what we have concluded in our research into AI-driven CAE, success doesn’t require displacing incumbent tools outright. Rather than competing head-on with entrenched CAD/CAE/PLM systems, new platforms can focus on making these tools work better together – becoming the connective tissue that coordinates modern hardware development. Once established as coordination layers, these platforms position themselves to expand their footprint over time.

The PLM and hardware development tooling market can already be measured in the tens of billions, but we believe the truly transformative companies will win by expanding the market and helping hardware companies iterate and build at the speed of software. This acceleration creates a powerful flywheel: faster development cycles enable more products, which drives increased tool usage and data generation, further improving development speed. Just as software development tools expanded their market by enabling faster iteration cycles, we believe the winners in hardware orchestration will grow the market by unlocking new levels of development velocity.

The risks are real – long sales cycles, integration complexity, and regulatory requirements in sectors like aerospace and defense. But we believe the confluence of market demand (driven by reindustrialization), technological convergence, and incumbent blind spots creates a unique opportunity for new platforms to emerge.

July

Thesis

LLM Application Deployment: Resilience and Optionality

Today, the deployment of generative AI solutions into the enterprise, particularly large companies, has started and often exceeds expectations.

January 2025

Infrastructure Tech

The generative AI era presents an interesting paradox – strong confidence in the directional arrow of technological progress (the ever-expanding and evolving LLM blast radius) coupled with significant uncertainty around the economic implications, both macro and micro. Today, the deployment of generative AI solutions into the enterprise, particularly large companies, has started and often exceeds expectations.

At the same time, there is wide agreement that while early applications are driving positive ROI, most organizations face a significant change management problem in fully incorporating them into existing operational frameworks - “there are no AI-shaped holes lying around.” For many enterprises and their executives, this has led to a “flight to trust,” with large consulting firms benefitting from enterprise appetite to utilize generative AI. This uncertainty around future enterprise workflows is also reflected in the observation that most AI startups that have found traction have done so with an anthropomorphized approach, “selling the work” in areas like legal support and accounting – essentially building an end-to-end AI replica of what customers have come to expect from human + software.

While we think great businesses can be built there, this can’t be all. We believe that as organizations and society develop a better understanding of AI, they will build core workflows around this new paradigm, constantly adapting existing organizational structures and coming up with entirely new ones. We broadly differentiate between resilience and optionality and believe that both areas provide opportunities for interesting models and platforms to emerge.

Resilience focuses on enabling existing companies – forced to adopt AI to stay competitive – to do so in secure and effective ways. As described above, these companies already have processes and employees; both might have a hard time adapting.

As with any complex system, we believe there is a unique opportunity in looking at the smallest unit in organizations - employees. While executives and consultants try to implement AI policies top-down, high-agency individuals (armed with ChatGPT, Claude, Cursor, and the latest tools) are constantly discovering productivity enhancements built around their idiosyncratic workflows, often utilizing these tools without explicit permission. We see an opportunity to push much of the work of identifying enterprise-specific best practices to these forward-thinking individuals and for a novel platform focused on this bottom-up approach to AI resilience to emerge.

In the process, such a platform could kill two birds with one stone. It provides a starting point for better data controls and security processes to manage risk while helping companies understand the financial implications (productivity improvements, cost structure changes, etc.) of continued AI deployment. 

Furthermore, monitoring and visibility in AI use by employees help enterprises gain insight into best practices (making AI fit into existing holes) that can be rolled out across the organization. The big opportunity that emerges from this wedge and model for enterprise trust building is that such a platform positions itself as we move toward a world of “spontaneous software” and possibly, “AI as the last employee” – similar to how Workday came to define ERP for the “digital transformation” era. 

Optionality focuses on building companies around novel organizational structures with a view to the upside, native to AI workflows and not possible before. 

This is an extension of what we previously wrote on “spontaneous software and SaaS on-demand”. In line with a recent post from Nustom that draws parallels from the autonomous vehicle market to propose the idea of a self-driving startup, we believe there is a massive opportunity here for companies that operate like mobile game studios, making use of the reality that software is increasingly cheaper to write and startups cheaper to run as AI gets better and more capable at both. We expect these companies will excel at rapid experimentation and iteration, consistently positioning themselves ahead of the capability curve to try to catch lightning in a bottle (hits-driven), or, in combination, to be long-tail driven with a large number of small cashflow-generating businesses under one roof.

July

Thesis

Digital Olfaction

Some of AI's greatest opportunities lie in its application to understanding and transforming the physical world.

January 2025

Infrastructure Tech

Some of AI's greatest opportunities lie in its application to understanding and transforming the physical world. We believe in the potential of convolutional neural networks, GNNs, and transformers to help us deal with this complexity and make sense of the world in ways that we have not been able to (we internally call these "expansion" use cases). This theme runs through several of our investments, most notably KoBold Metals. 

We believe that digital olfaction – a better understanding of how molecules produce the scents we perceive – is one of those areas. Scent, due to its complexity, is our least understood sense. Novel approaches to machine learning, such as GNNs, have proven able to cut through this complexity and beat the human nose at scent-profile classification based on molecule structures. Osmo, the company at the forefront of this research, has proven that it can utilize this understanding to develop novel scents. It is reasonable to assume that this technology will enable the faster development of novel molecules at lower cost and at scale.

In 2023, the global aroma chemicals market was valued at approximately USD 5.53 billion (unclear if this also includes produced chemicals vs. only IP). This market is essentially dominated by a few large players: Givaudan, Firmenich, IFF, and Symrise. All these players are fully integrated, meaning they both develop scent molecules (IP) and manufacture them. It is unclear how much value is in the pure IP, but some tailwinds could favor the emergence of a novel AI-enabled player focused on novel IP. In 2023, a class action lawsuit was filed against major fragrance companies Givaudan, Firmenich, IFF, and Symrise. This followed antitrust investigations in Switzerland and Europe into suspected price-fixing by these companies, initiated earlier the same year. Moreover, there is a marked shift in the industry to focus on sustainable molecules that don’t require scarce resources and have no negative health effects.

The ability of AI to generate molecules that either have novel scent profiles or are similar to existing ones, without negative externalities (e.g., production-related health effects), is likely a unique fit for these models. We expect that, to create maximum value, such a model (or suite of models) would likely need 1) the ability to model molecule interactions to create a whole scent, 2) an understanding of constraints (e.g., toxicity, costs), and 3) the ability to assess the producibility of these molecule sets at scale.

Moreover, we see substantial potential for market expansion. Suppose these AI systems become capable of identifying, mapping, and predicting the behavior of scent molecules, and certain hardware advancements (essentially a chip capable of detecting, analyzing, and recreating scent) are made. In that case, several new application areas emerge: these span from environmental monitoring to medical diagnostics, where AI can detect disease biomarkers through molecular analysis, to consumer applications such as capturing, reproducing, and sharing scent online. While this is hard to quantify, it is reasonable to assume that there is substantial option value.

July

Thesis

Space Debris

The number of objects we are sending to space is growing exponentially.

January 2025

Infrastructure Tech

The number of objects we are sending to space is growing exponentially. Thanks to SpaceX, launch costs have fallen 80-90%. While it took nearly 60 years to put 2,000 satellites in orbit, we launched 3,000 in 2023 alone. 100k satellites are expected to launch by 2030, marking a further increase in the complexity of space operations. 

As old space assets deteriorate and more are launched, collisions are inevitable, particularly as a result of space debris. Satellites already make 100,000+ maneuvers per year for collision avoidance. Losses due to collisions in low earth orbit were estimated at ~ $100m in 2020. Since then, we have at least tripled the satellite population.

While responsiveness is improving (e.g. edge GPUs to enable on-board autonomy), hundreds of billions of dollars in assets are and will be exposed without i) precise monitoring, ii) proactive defense systems (beyond trying to move out of the way), and iii) adequate financial risk management (i.e. insurance models). 

While it is easy to forget amid the current space market momentum, the industry is still walking a fine line – something that seems to have motivated Elon Musk’s all-in effort in support of Donald Trump’s election. As the nuclear industry has demonstrated, the public perception pendulum in highly sensitive industries can swing toward the negative (for decades) with only a small number of high-profile failures. Space debris is the type of problem that, left unchecked, poses a Three Mile Island-style risk for the industry. Also, like most “waste”-related problems, there is often not a strong enough incentive for any single actor to solve it until it is too late.

The fragmented incentives and control mechanisms in addressing space debris are evident in the current regulatory frameworks. 

The United States is a patchwork of policies, with agencies like the FAA, FCC, and NASA each taking different approaches to limiting or removing space waste. Europe’s approach has been more comprehensive with the European Space Agency (ESA) developing the "Zero Debris Charter," aiming to prevent the generation of new orbital debris by 2030. As of October 2024, 110 countries or entities have joined the charter, and discussions are ongoing with major satellite operators for participation. 

Despite these initiatives, the absence of a binding international legal framework leads to a "tragedy of the commons" scenario, where individual actors may lack sufficient incentives to invest in debris mitigation (opting instead to accelerate commercial advances amid increasing competition), resulting in increased collective risk.

International cooperation around debris is also threatened by geopolitical posturing. Without better visibility and defense mechanisms, nation-states will always have plausible deniability around the destruction of important satellites and space infrastructure (“it wasn’t us, it was debris”). Since even 1mm fragments of debris can disable a satellite entirely, this is not too much of a logical leap.

We believe that solving the problem of space debris creates an interesting wedge for companies to eventually become critical infrastructure for space security and risk management.

July

Thesis

AI-enabled Business Planning

Giving organizations what Cisco called its ‘virtual close’ advantage more than 20 years ago – on steroids. 

January 2025

Industry Transformation

The generative AI era presents an interesting paradox: strong confidence in the directional arrow of technological progress (the ever-expanding and evolving LLM blast radius) coupled with significant uncertainty around the macro- and microeconomic implications. Acknowledging this uncertainty, we expect three things to happen as we move toward a world of increased AI and agent usage by organizations and, possibly, a trend towards “AI as the last employee.”

  1. Data and information will be processed much faster, leading to real-time insights and decision support. 

  2. The metabolic rate of organizations is set to go up as feedback loops from planning to action become faster.

  3. Organizations will face substantially different resource and capital allocation decisions. 

All of the above requires an orchestration, planning, and decision layer purpose-built for and enabling these changing dynamics. As a result, we see an opportunity to build an AI-enabled business planning platform with substantial optionality to become an integral part of the roll-out and management of increasingly powerful AI systems – giving organizations what Cisco called its ‘virtual close’ advantage more than 20 years ago, on steroids. 



Thesis

Data-Driven Infrastructure Management

There is an opportunity for new players to emerge at the intersection of two historically distinct types of businesses: infrastructure inspection and architecture, engineering, and construction (AEC).

January 2025

Industry Transformation

One of our core theses around AI in the physical world is that novel data generation can drive substantial value creation. Robotics, drones, and sensors used for inspection fit right in. Providing customers with high-value (and revenue-generating) inspection services enables unique data collection at scale. As a result, we believe there is an opportunity for new players to emerge at the intersection of two historically distinct types of businesses: infrastructure inspection and architecture, engineering, and construction (AEC). The inspection business generates the data that enables high-value AI-enabled services in the design, construction, and maintenance phases of a project. 

We are interested in investing in companies that have found a unique wedge into the market to build large sets of novel and proprietary data that enable a flywheel of higher-quality services. We believe that the category leader in this space can create an agnostic platform compatible with different robot types from various manufacturers to deliver an increasing range of such services without needing hardware development. 

More effectively managing critical infrastructure assets through technology-enabled inspection, dynamic monitoring, and proactive intervention represents a crucial lever in stabilizing risks presented by emerging security, climate, and energy challenges, promoting public health and safety, and driving more effective capital allocation across the public and private sectors. 

Every four years, the American Society of Civil Engineers (ASCE) releases the Report Card for America’s Infrastructure, detailing the condition and performance of the nation’s infrastructure. Its most recent report, released in 2021, gave the United States a C- grade, highlighting a widening investment gap that the ASCE estimates will cost each American household $3,300 per year by 2039 (USD 90B+ total). In the years since the report, pressure has increased thanks to challenges imposed by extreme weather events, substantial changes in the global energy mix, and an increasingly tenuous national security situation.

Private infrastructure, from energy plants to commercial shipping, is fighting against the challenges and economic losses associated with system outages. For example, a study by the Lawrence Berkeley National Laboratory and the U.S. Department of Energy estimated that power interruptions cost the U.S. economy about $44 billion annually.

Solving these problems at scale requires moving away from manual inspection and towards more scalable technology-enabled approaches. These are substantially safer and generate dramatically more data that can serve as the foundation for appreciably higher-quality decisions. 

At the same time, public and private asset owners are starting to realize that inspection and data collection ideally begin at the outset of large projects and during construction. That way, decisions can be optimized, mistakes can be identified, and one has a digital foundation for future inspections.



Thesis

Unlocking Tacit Knowledge Through Constellations of Experts

The relationship between individual and organizational performance has historically been governed by management frameworks – from Alfred Sloan's GM to Andy Grove's creation of modern OKRs at Intel.

December 2024

The relationship between individual and organizational performance has historically been governed by management frameworks – from Alfred Sloan's GM to Andy Grove's creation of modern OKRs at Intel. These systems attempted to solve the challenge of measuring, improving, and scaling human potential across an enterprise. Yet they remained constrained by the limits of human observation and the difficulty of capturing tacit knowledge – the intuitive expertise that defines mastery of a task but has, thus far, mostly resisted codification.

Over the last 20 years, "game tape" and statistical sophistication have revolutionized athletics (and other highly quantifiable professions like enterprise software sales) by enabling precise feedback loops, accountability, and recognition. AI is now driving a similar transformation of the broader professional universe, where the relationship between inputs and outputs is often harder to grasp. Professionals have always valued mentorship and coaching, but access has historically been limited by cost and scale (hence “executive” rather than “employee” coaching). AI promises to democratize this type of performance enhancement (and an organization’s ability to measure it) in the same way that companies like Synthesis address Bloom's Two Sigma problem in education. 

Our hypothesis is that “constellations of (AI) experts” – deployed across every facet of professional development and organizational performance – will become as fundamental to career success as mentors and coaches are to elite athletes today. Several converging catalysts are making this possible. 

  • The mass market deployment of co-pilots and proto-agents has rapidly normalized AI-human collaboration. More than 60% of physicians now use LLMs to check drug interactions and support diagnosis – early evidence of adoption for high-leverage decision support. 47% of Gen Z employees say ChatGPT gives better career advice than their boss – signaling dissatisfaction among young workers with the status quo.

  • The proliferation of audio/video capture in professional settings generates rich data to help these systems better understand and improve performance. People increasingly operate with the assumption that every call is recorded, while younger employees regularly go viral for sharing layoff videos online. 

  • The economics of AI are reshaping both organizational structures and individual incentives. Companies are shifting from fixed to variable cost models, flexing labor (human and agent) up and down based on demand. This, in turn, is shifting how workers are measured and compensated. As a result, professionals must proactively adapt to succeed in this new paradigm where human judgment and AI capabilities become increasingly intertwined.

We see several areas where the “constellations of AI experts” will be professionally valuable. In each of these categories, we expect the most successful platforms will combine automated interactions, human experts in the loop, and content/validation that come together to create holistic systems of improvement. 

  • Organization-wide solutions that integrate deeply with company context to provide AI-powered coaching and feedback loops. While employees have shown a willingness to trade privacy for better tools, trust and security guardrails are essential. 

  • Individual-focused platforms that grow with professionals throughout their careers, combining performance enhancement with credential creation in an increasingly fluid labor market. 

  • Solutions for high-turnover industries that capture and distribute best practices to improve training efficiency and retention (e.g. frontline audio-first interfaces)

  • SMB owner enablement systems in areas like the skilled trades and family medicine, to make it possible to i) capture and transmit tacit knowledge (streamlining hiring/training while increasing terminal value) and ii) help operators compete without needing to bring in expensive consultants or PE expertise

These are, to be clear, highly divergent use cases that necessitate different product philosophies, business models, and competencies from the companies building solutions. However, they share important characteristics, namely that they all represent opportunities to use AI and better data to make professional tacit knowledge, action, and context visible and measurable, unlocking precise intervention to help individuals (and by extension teams and companies) grow into their potential. 


Thesis

AI-Enabled Asset Ownership

When to sell to incumbents vs. when to compete.

November 2024

Industry Transformation

For companies deploying AI in legacy industries, a key question is whether to enable incumbents by selling them solutions or to compete with them by taking a more full-stack approach. The trade-offs between these two models are something we have started to explore through our AI-enabled services analysis and this piece on when to compete with and when to sell to incumbents in an industry.

Recently, several firms have shared public theses on the opportunity for emerging AI companies (or vertical market software companies) to capture additional value in a given value chain by fully integrating via the acquisition of assets – as opposed to selling solutions to incumbents or taking a more organic (build instead of buy) approach to going full stack.

Slow, which helped finance the $1.6b acquisition of parking operator SP+ by software company Metropolis, calls the model “Growth Buyouts”. Equal Ventures, which recently opined on the opportunity for such a model in insurance, calls it “tech-enabled consolidation”. Vertical market software investor Tidemark calls the approach “tech-enabled vertical roll-ups”. Re:Build Manufacturing calls its technology-driven manufacturing roll-up model an “American Keiretsu”. 

Our current hypothesis is that while the AI-enabled acquisition of services businesses (with venture dollars) may not be wise, there is a significant opportunity for emerging AI, software, and robotics companies to capture more value and develop value chain control by acquiring legacy assets in physical industries. 

For decades, venture capitalists have been engaged in what seems like a Sisyphean task: digitizing businesses that operate assets in the real world. These efforts have struggled for many reasons, from software shortcomings to incentive problems, structural challenges in GTM, and a lack of skilled labor on the customer side. Novel ML models are beginning to solve the first of these problems by operating assets end-to-end without much human input. Yet the remaining challenges persist. AI-native companies addressing these problems are therefore prone to leave value on the table and, due to slow adoption, likely to train and develop their models more slowly, forgoing substantial additional value. AI-enabled asset ownership represents one path to capturing it. 

Sequence matters for companies that go down this path. Companies should prove they can build technology and deliver ROI (for early customers or via a smaller-scale organic full-stack approach) before embarking on buying distribution via M&A. Early M&A is attractive only when smaller targets that are structurally very similar to the market’s large-scale targets can be acquired for less than the cost of traditional GTM. Initially, these businesses have venture risk profiles; only after the second or third large acquisition should they be derisked and become predictable/repeatable enough for investors with a lower cost of capital – infra, PE, etc. – to consider participating. By the time a company reaches this point, venture investors will have seen highly attractive returns. 

Initial Hypothesis on Key Conditions

Below is an initial hypothesis for when it makes sense for a company to vertically integrate via acquisition as opposed to doing so organically or remaining a software/technology vendor to a given industry:

  • The company must have a demonstrated “production advantage”; i.e., a clear technological or product edge that creates compounding value in an industry. Companies leveraging exclusively off-the-shelf technology likely lack the differentiation to deliver venture-scale outcomes, even with strong operational execution and financial engineering. If a PE fund working with Accenture can go after an opportunity, or if human labor is cheaper on an efficiency-adjusted basis, it is unlikely to be a VC case. If solving the problem requires a combination of world-class technologists AND operators, this becomes an interesting opportunity for venture-style risks and outcomes. 

  • Customers have proven structurally unable to adopt and deploy a company’s solution to its most productive extent. Alternatively, they seem unwilling to pay for its full value. This can be due to various reasons, from lack of scale to incentives leading to stasis in a given market (“if my competitor doesn’t innovate, I don’t have to”). We should be able to pinpoint a structural issue – and generally point to evidence from a company’s own experience – with a given market/customer base to be certain the ineffectiveness is not a product issue. 

  • Building on the previous criteria, companies spearheading the buy-out strategy should be building technology that changes the core way an asset is operated, transforming the economics of the business/industry. Most likely, that is where existing operators are (somewhat paradoxically) least incentivized to adopt technological disruption. This is what makes the Metropolis acquisition of SP+ such a compelling application of this approach. SP+ has 3,000+ parking locations around the world where the core customer experience (paying for parking) can be largely automated. While the “work around the work” (maintenance, security, etc.) still requires people, the ROI around the primary transaction is much easier to understand than in situations where the AI solution is helping people deliver the primary solution more efficiently (e.g. home services models, legal services, etc.). 

  • Likely, there is a sweet spot in the level of complexity that goes into operating an asset that makes it a fit for AI-enabled acquisition. Complexity can stem from the core value proposition being complex, from several offerings being performed at the same asset leading to compounded complexity, or from the “work around the work” being significant (e.g., for regulatory reasons). Too little complexity at the core value proposition becomes a PE case; too much, and the operational overhead reduces the leverage associated with improving the margins of the core asset. Ideally, the complexity/problems across holdings within the same space should be the same (e.g., parking lots), and skills easily transferable. We should be able to pinpoint these levels of complexity and identify assets/problems that meet the sweet spot. 

  • The category a company is operating in needs to have acquisition targets that operate at scale (ideally businesses worth USD 1B+, with additional value creation in the several hundred millions – further analysis on this needed). Buying assets that operate at scale and can be fully optimized and automated via AI is substantially more attractive than rolling up locally-delivered services businesses. Again, this is what makes the SP+ acquisition so attractive: SP+ has 3,000+ parking locations around the world that are likely all run very similarly. Ideally, solutions deliver not only cost savings but also growth opportunities. We are also interested in companies with a view on how the integration of software and legacy assets will unlock increasing ecosystem control and turn the business into an industry operating system. 

  • Companies must have advantaged access to talent across functions. It is rare for a founder or founding team to understand “what great looks like” in areas where they have not had direct experience. A team of software engineers is likely unfamiliar with what makes a great industrial CFO or service-business COO. As a result, we may expect the pool of founders well-equipped to build such a business to be small. We have seen this play out at companies like KoBold Metals, which combine highly scientific founding teams with business acumen. 

These criteria still don’t fully answer why/when it is better to grow a full stack solution via acquisition rather than a more organic approach. One primary reason a company would choose to grow via acquisition is if the geographic footprint and surrounding “infrastructure” of an industry will look largely similar in the future as it does today. In such cases, the speed of distribution created by acquisition is enough of an advantage to overcome the accompanying cultural and organizational complexity that could be mitigated with a more organic strategy.

To use the Metropolis example: should we expect the footprint of the parking industry to be more or less the same in 10 years as it is today? While autonomous vehicles may have some impact on the margin during that time period, the inertia of the built environment probably means we should expect the flow of traffic and parking to remain relatively unchanged (airports, stadiums, commercial centers, etc.). 

A counter-example is H2 Green Steel, which has raised multiple billions of dollars to produce steel with 95% lower emissions than traditional steelmaking. Because the company’s steel production depended on access to ample clean energy, it could not simply acquire and transform underperforming steel facilities, despite the similarity in equipment, permitting, and human capital needs. Thus, to transform the industry around its vision, the company was forced to take a more organic approach. 

Companies also might pursue a buy instead of build strategy when the technology can be easily integrated with existing assets and infrastructure, substantially reducing time to value for a given solution. 

There are likely several other criteria in support of (and against) the strategy of vertically integrating via acquisition which need to be explored in further depth. 


Thesis

European Public Safety Primes

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century.

November 2024

Industry Transformation

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. But the threat to Europe’s way of life and future prosperity goes beyond front-line kinetic conflict. 

As the threat environment converges, the “why now” case for companies building for public safety and security in Europe, for Europe, gets stronger by the day. Migration pressures, cyber threats, and cross-border crime require capabilities that existing systems simply cannot deliver. Europe must invest more proactively in innovation across other public safety and security pillars: law enforcement, fire and disaster response, and intelligence.

Across markets, AI is driving a step change in our ability to understand and act upon the physical world. The convergence of AI with real-world data – cameras, drones, satellite imagery, and other sensory inputs – makes it possible to build an intelligence layer that processes complexity at an unprecedented scale. This is opening new frontiers in public safety and security. Companies that can harness, integrate, and analyze this explosion of data to drive faster, safer, and more accurate decisions stand to become category champions and play a key role in forming the foundations for Europe’s long-term growth and strategic autonomy. 

Across the world, advanced policing systems are delivering for forward-thinking law enforcement and border control agencies. Solutions like Flock help solve over 700,000 crimes annually, making departments more efficient, while drones drive faster and safer responses. As resistance to US technology persists, expanding EU internal security budgets and increasing evidence that these systems work will push Europe to seek out homegrown solutions.

Fire and disaster response – helping mitigate the €77b in annual losses from natural disasters and protect human livelihoods – represents another critical opportunity area. New entrants combining predictive modeling of weather and the built environment with proactive intervention capabilities will capture value by closing Europe's critical response gaps.

Finally, intelligence agencies are approaching a breaking point: drowning in multi-modal data (from video to financial transactions) while inter-agency coordination fails. Companies that bridge European fragmentation while navigating privacy mandates will become essential infrastructure, enabling real-time response to physical and digital threats.

We see an opportunity for a new class of European "Public Safety Primes" to establish themselves in the market. The Axon playbook – now a $45b+ company built through methodical expansion from tasers to body cameras to a comprehensive digital evidence platform – shows what's possible. The company has effectively zero penetration in Europe, and local players like Reveal Media and Zepcom remain subscale. Winners will start narrow with a must-have product, earn trust through measurable impact, and expand across the public safety landscape as system-wide advantages compound.


Thesis

Composable Companies

Composable companies fuse enduring core expertise with agile, mission-focused teams to rapidly capture diverse market opportunities and redefine traditional business models.

November 2024

Infrastructure Tech

A new organizational model is emerging: the composable company – organizations that blend permanent infrastructure with fluid product capabilities. At their core, they maintain:

  • Capital and resource allocation expertise

  • Shared technical infrastructure

  • World-class talent

  • Possibly, strategic customer and distribution relationships

By centralizing these unique capabilities, composable companies can swiftly identify, validate, and scale opportunities across their chosen markets. Around this foundation, teams can be rapidly assembled and reconfigured to pursue specific missions/product opportunities on various time scales.

This model excels in markets where opportunity spaces are in flux and an organization needs the flexibility and bandwidth to build out ideas that compound together around a coherent view of the future but might find their manifestation in distinct products for distinct customers.

Recent developments in AI further enhance this model's viability by supporting customization for specific use cases and by:

  • Reducing software development costs

  • Streamlining maintenance requirements

  • Improving customer support efficiency

  • Enabling more cost-effective creation of AI tools

The Resulting Structure

The end product could be a holding company-style enterprise that combines:

  • The above-described core infrastructure

  • Multiple AI products and tools with varying scale and durability

This structure enables the efficient pursuit of numerous opportunities while maintaining the potential for further asymmetric returns from breakthrough successes among them or in aggregate.



Thesis

Marketplaces for AI-Enabled Services

AI-powered, asset-light platforms now empower creators and knowledge workers to build profitable one-person companies that disrupt traditional firms and democratize professional services.

October 2024

Infrastructure Tech

The Rise of One-Person Companies

The unbundling of the firm has been underway for decades. As the internet enabled increased access to global labor markets, outsourcing to lower-cost countries exploded. The proliferation of cloud computing and mobile took this a step further, making it possible to externalize an increasing number of key operational functions and allowing for more asset-light business models. This prompted a thesis several years ago that the rise of “One-Person Companies” remained an underrated opportunity. 

The next step in the evolution of the firm will build on this but will come at the problem from a different direction. It will be defined by the rise of One-Person Companies. Creators and knowledge workers will access external services that provide the capabilities to start and scale a firm and then re-bundle them in unique ways around their core skill set. They will monetize by selling products, services, and expertise to an owned audience that their core skill set has helped them build.

New platforms and infrastructure providers will emerge to support the tens of millions of individuals capable of building successful One-Person Companies, along with the billions of consumers and businesses that will support them. More generally, the rise of One-Person Companies will inject dynamism into the broader economy and will play a role in driving more inclusive innovation.

AI – particularly agentic solutions capable of proactively understanding and executing end-to-end business workflows – represents the next leap in this evolution. As several investors and operators have observed, AI is empowering small groups more than ever before, and new businesses across the economy (i.e. not just tech startups) are building from inception with AI literacy as a core competency. According to Gusto, roughly a fifth of businesses created last year said they were using generative AI to more efficiently carry out things like market research, contract reviews, bookkeeping, and job postings.

Current and Future Market Structure

In complex, non-commodity service categories like strategy consulting, law, investment banking, and wealth management – where key individuals inside of large companies often already “run their own book” – we believe these forces create the opportunity for further fragmentation; i.e., the “creator economy-ization” of professional services.

A study cited in a 2015 Forbes article about million-dollar solo consulting businesses indicates this opportunity is not new. 

The U.S. Census Bureau found that of 3.2 million "nonemployer" professional services businesses in the U.S., there were 7,276 firms that brought in $1 million to $2.49 million in revenue in 2013, the most recent year for which statistics were available. And 321 superstars brought in $2.5 million to $4.99 million.

For the sake of simplicity throughout the document, we will refer to these companies as Service OPCs, though there is of course no reason why it must be a single person.

In practical terms, we believe we are entering a period where an even larger number of individuals or small teams with a differentiated skill set or value creation mechanism (product) will increasingly be able to leverage the marketplace (instead of “the firm”) for distribution and operational capacity to build profitable and durable OPCs.

This thesis rests largely on the idea that some elements of human judgment are inherently non-scalable/automatable (similar to our thesis around where value is captured in AI-driven content creation), and thus that the dynamics of the market will tend more towards specialization – thousands of small, profitable “winners” – rather than winner-take-all. 

A services Mittelstand rather than Silicon Valley concentration.

We are interested in what the agentic technologies that drive end-to-end workflow execution will look like, and what the coordination mechanism across those autonomous services will be, for Service OPCs. Without both becoming a reality in parallel, the overhead of identifying and managing dozens of AI agents (some of which will inherently be more end-to-end than others) – all while growing a client base and playing the most important role of delivering the differentiated service (even if some elements are made more efficient through automation) – is likely enough to push promising OPCs back into the arms of McKinsey or Kirkland & Ellis.

Effectively, we believe there is a Shopify-like opportunity to “arm the rebels” and build an ecosystem-level operating system for the AI-driven services transformation – combatting empire-building incumbents who view AI as a solidifier of their market positioning and what are sure to be dozens of overfunded venture-backed businesses promising to be “the next Goldman Sachs”.

Product and Technical Hypothesis

By engaging at the aggregation and coordination level, we are interested in answering the question of how a platform might “package” a business around an OPC’s core skill set to help it grow beyond its pre-AI agent potential. 

While we want to avoid being overly prescriptive in our analysis at such an early stage, we believe that for such a platform to represent a comprehensive – and attractive – alternative to the firm for Professional Service OPCs, it would possess some or all of the following characteristics (features), listed roughly in order of how they might be sequenced from a product perspective:

1. Functional Automation (Operational Capacity) – This pillar would serve as an "Agent Store," featuring both proprietary workflows and third-party end-to-end AI agent solutions. It would offer OPCs end-to-end functional agents for various business operations, such as:

  • Contract management

  • Financial management and forecasting

  • Compliance and risk assessment

  • Resource allocation and project management

  • Continuous learning and skill development

  • Marketing and public relations

  • Legal execution

It is also interesting to consider how such a store could provide a distribution channel for third-party developers of specialized AI solutions like Devin (for software development) or Harvey (for legal services), or the seemingly dozens of AI agent companies launching each week (a quick scan of the most recent YC class highlights how prevalent this model has become for early-stage companies). 

These developers would be incentivized to use the platform because it goes beyond simply offering access to agents to helping OPCs “package” a specific set of agents around the skills and ambitions of the company, which brings us to the next pillar of the platform. 

2. Organizational Coordination (The AI COO) – The AI COO acts as the central nervous system of the OPC, ensuring all parts of the business work together seamlessly. Key functionalities include:

  • Automated integration between functional agents (the Bezos API Mandate on overdrive)

  • Workflow optimization across all business functions

  • Stakeholder communication management

  • Strategic decision support

  • Continuous improvement engine for business processes (i.e. vetting and implementing improved solutions or workflows autonomously). 

This pillar is critical in attracting and retaining both OPCs and third-party AI solution providers. For OPCs, it offers unprecedented operational efficiency and is the core enabler of leaving the firm behind for good. For AI solution developers, it ensures their tools are integrated effectively into the OPC's operations, maximizing utility and long-term revenue potential.

With these pillars working together, such a platform aims to create a robust ecosystem that not only supports individual OPCs but also fosters a network of AI solution providers. This symbiotic relationship between OPCs, the platform, and AI developers has the potential to drive rapid innovation cycles and expand the market in the same way Shopify has done in e-commerce for physical goods.

Antithesis

While we have a reasonable degree of confidence that the end state of the unbundling of the firm will look something like what we have laid out above (“Shopify for professional services” is likely a primitive analogy for what we will have in 2050), there are several reasons to be wary of the thesis. Much of this hinges on market timing as well as the common question of whether this will enable truly novel business models to emerge that incumbents are structurally unable to compete with.

  • We may be underestimating incumbent entrenchment, particularly around trust and social signaling, and their ability to adapt. “Nobody got fired for hiring McKinsey, Goldman, etc.” While apparently not yet at the operational level, incumbent consulting firms have been among the biggest beneficiaries of the generative AI explosion.

  • Regulatory, compliance, and legal structures may change more slowly than the technology. Sectors like law and finance are heavily regulated. OPCs might face significant hurdles in meeting compliance requirements without the resources and infrastructure of larger firms, potentially limiting their ability to operate in certain (high-value) areas.

  • Integration (i.e. the AI COO) may be substantially more complex than we have described. The reality of seamlessly integrating multiple complex AI systems could be far more challenging and error-prone than expected, leading to inefficiencies or significant mistakes.

July

Thesis

Benchmarking the Physical World

Standards are the hidden force behind market efficiency, capital formation, and global trade.

October 2024

Infrastructure Tech

Standards are the hidden force behind market efficiency, capital formation, and global trade. From the meter to the shipping container, standards create the shared layer of trust that helps markets function and capital flow.

In 1860, at the outset of America’s railroad frenzy, Henry Varnum Poor published “History of Railroads and Canals in the United States”. This work was the first attempt to arm investors with data on the burgeoning industry and laid the foundations for what is now Standard & Poor’s — a $100b+ company with $11b in annual revenue. Alongside its long-lived triopoly counterparts, Moody’s and Fitch, it has persisted thanks to powerful standards-based moats that make their frameworks critical infrastructure for global capital markets.

“We think of ourselves as a benchmark company. I mean, data is in vogue now, and people are really kind of a bit obsessed with data and data companies… I think data is nice, it’s interesting. But if you could turn something into a benchmark, it really transcends data.”
SVP at S&P Global, November 2020

As Marc Rubinstein wrote in “The Business of Benchmarking”, universal standards are usually unassailable. The risk for companies that manufacture them is less that their moat is crossed and more that their castle becomes irrelevant. We believe the current geopolitical, economic, and technological moment is creating a once-in-a-generation opportunity to successfully counterposition against and eventually (finally!) displace the global ratings and benchmarking oligopoly.

Several forces are converging to create this opportunity. First, Great Power Competition is fundamentally reshaping global trade and industrial activity. The push for energy independence, secure supply chains, and strategic autonomy is driving massive investments in decarbonization and reindustrialization. Reconfigured trade flows and industrial priorities demand new frameworks for understanding risks and opportunities. Second, the growth of sensor networks, connected devices, and geospatial systems has created unprecedented visibility into physical world operations and trade flows. This proliferation of data – from factory floors to shipping lanes – provides granular, real-time insights that were previously impossible to capture. Finally, advances in AI and machine learning allow us to process and derive meaning from complex, multi-modal data at the scale and speed demanded of modern trade.

We've seen the fundamental transformation of key commodity markets firsthand through our investment in KoBold Metals. Better collection and analysis of physical world data is revolutionizing resource discovery and development. Meanwhile, geopolitical machinations are accelerating the reconfiguration of global supply chains and trade routes, creating urgent demand for new frameworks to understand and price physical world assets. Traditional frameworks – built for a different era of global trade – are increasingly misaligned with markets that require real-time, granular insights to make decisions.

Success in this market isn't about attacking the incumbent oligopoly directly. Through counterpositioning, the opportunity lies in building for the new industrial economy with a model native to the speed and complexity of modern trade. Winners will start narrow, building density of data and trust in specific verticals, before sequencing alongside their customers' evolving needs to develop new pricing and risk infrastructure for the physical economy.

July

Thesis

Nuclear Supply Chain

Technology deployment cycles – from the railroads to the internet – have long been characterized by dramatic boom and bust cycles. The nuclear cycle is picking up pace.

September 2024

Industry Transformation

Working Thesis

Technology deployment cycles – from the railroads to the internet – have long been characterized by dramatic boom and bust cycles. And while overreliance on analogies is dangerous, these historical precedents provide one lens through which we can contextualize the current moment. 

Today, the AI boom is drawing parallels to the 1990s dot-com and telecom bubbles, with massive capital expenditures, promises of exponential growth, and early productivity gains. On the horizon, the potential of general-purpose robotics even resembles the iPhone-driven mobile revolution that followed almost a decade after the dot-com bust. 

But the differences between the two eras are equally striking. Today, incumbent technology companies possess more structural power over the ecosystem than 30 years ago, suggesting perhaps less overall volatility at the expense of dynamism – i.e. the Magnificent Seven serve as “hysteria dampeners” thanks to massive balance sheets and distribution networks. And while opportunism in AI is certainly present, the technical barriers to entry needed to build a competitive foundation model (and the pace of customer feedback) are substantially higher than building an ISP during the dot-com frenzy.

However, the most consequential difference between the two eras may be the central role of energy – namely the re-emergence of nuclear power – in today's AI boom, particularly with the backdrop of rising great power competition and the ever-present specter of climate change.

Unlike the telecom infrastructure of the dot-com period (and data centers in today's cycle), which serve singular purposes, the expansion of nuclear infrastructure addresses multiple critical challenges. First, it promises to play a significant role in solving for the energy intensity and reliability demands of AI data centers. This is a problem we are looking at from several angles – nuclear and other forms of production, efficiency (via our industrial action layer research), and finally via an exploration of better distribution and grid resilience technologies.  

Beyond serving AI data centers, nuclear power (along with the other categories highlighted), meets the vast need for clean baseload power to accelerate decarbonization and the push for increased energy security amidst heightened geopolitical risk. 

As a result, nuclear’s positioning as a one-size-fits-all solution to many of our most pressing concerns – and thus its (theoretical) resilience to the fortunes of any single macro factor – makes it an attractive “picks and shovels” opportunity perfectly in line with the three major economic supercycles of the next several decades (AI, climate, and national security) – provided the current momentum can overcome generations of cultural and political baggage.

This baggage is complex, equal parts social, economic, and regulatory, with each reinforcing the other in a vicious cycle. High-profile accidents and proliferation risk have dominated the social consciousness for 40+ years. This, in turn, influences regulation, which increases the red tape around the development and deployment of safer, more effective modern solutions. As process knowledge is lost and talent leaves the industry, costs spiral even higher, and we are left with the current state of affairs.

Despite nuclear’s history as a safe, clean, and cost-effective technology, its costs have jumped 33% while the cost of new solar generation continues to plummet (down over 90%). The latter is, to be clear, a positive – we are pro-energy abundance more than we are wedded to a single approach – and showcases the power of technological learning rates if unleashed.
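The dynamic behind learning rates can be made concrete with Wright's law: unit cost falls by a fixed percentage with each doubling of cumulative deployed capacity. The sketch below is purely illustrative – the 20% learning rate and 1,000x scaling factor are assumed round numbers for demonstration, not measured figures for solar or nuclear.

```python
# Illustrative Wright's-law sketch: unit cost declines by a fixed
# "learning rate" with each doubling of cumulative deployed capacity.
# All parameters below are assumptions for illustration only.

import math

def cost_after_scaling(initial_cost, initial_capacity, final_capacity, learning_rate):
    """Unit cost after cumulative capacity grows, given a per-doubling learning rate."""
    doublings = math.log2(final_capacity / initial_capacity)
    return initial_cost * (1 - learning_rate) ** doublings

# Hypothetical technology with a 20% learning rate scaling 1,000x
# (roughly 10 doublings): unit cost falls to about a tenth of its start.
relative_cost = cost_after_scaling(1.0, 1.0, 1000.0, 0.20)
print(f"{relative_cost:.2f}")
```

The asymmetry in the text follows directly: a technology allowed to ride this curve compounds cost declines with every doubling, while one whose deployment stalls (as nuclear's did) never accumulates the doublings in the first place.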

Current and Future Market Structure 

Today, the narrative – and the fundamentals – around advanced nuclear are finally shifting towards the positive across several dimensions.

  • Russia’s full-scale invasion of Ukraine provided a clear focal point for the importance of energy security and the role nuclear energy can play in decoupling from Russia. Between 2021 and 2022, the percentage of Europeans at least somewhat in favor of nuclear energy jumped substantially – in Germany from 29% to 44%, in France from 45% to 55%, and in the UK from 39% to 53%.

  • Energy has become the core bottleneck to scaling AI. While GPU scarcity dominated the preceding couple of years, everyone from Mark Zuckerberg to Sam Altman believes the next substantial step forward requires energy-related breakthroughs. As a result, big tech companies have become the “urgent buyers” necessary to drive market transformation. Microsoft’s actions signal a clear belief in the long-term need to expand nuclear capacity. Its recent 20-year power purchase agreement with Constellation, which will revive the Three Mile Island facility, is as symbolically important as it is economically.   

  • The capital markets are responding swiftly to this step change in demand, with financial institutions including BofA, Morgan Stanley, and Goldman backing a COP28 goal of tripling nuclear capacity by 2050. The commitment to triple capacity also came with support from the US, UK, Japan, Sweden, and the United Arab Emirates.

  • Regulatory support has not been limited to COP commitments. In the United States, for example, President Biden recently signed into law the ADVANCE Act, which aims to streamline licensing, promote R&D investment, and contribute to workforce development.

  • This follows on the heels of (tepid) progress on the deployment front. In the United States, Vogtle 3 and 4 are finally online, years late and billions over budget. Still, the finalized designs, established supply chains, and trained workforce should contribute to less expensive future deployment. This summer, TerraPower began construction on its advanced reactor facility in Wyoming. Meanwhile, SMR certification is building momentum in France as companies like NewCleo and Jimmy Energy look to deploy first-of-a-kind reactors by 2030.  

  • Finally, the characteristics of SMRs and AMRs, coupled with the aforementioned demand dynamics, have ignited venture interest in the space. Smaller form factors that can be deployed more flexibly and with a broader range of financing options have eased some concerns about the venture scalability of such projects. As a result, dozens of companies have been funded in recent years. Today, over 100 SMR and AMR designs are being developed at various stages and with different timelines across the world.

Key Early Assumptions / Potential Catalysts 

The improved environment around nuclear power leads us to a few critical questions, based on important assumptions about where scarcity and abundance sit in the nuclear value chain.

  • Assumption 1 → The timelines for most advanced nuclear projects are at least 7+ years out, likely longer if history is a guide. This may not align with our time horizon unless we can identify intermediate value inflection steps that create opportunities for exit, similar to the life sciences ramp-up.

  • Assumption 2 → The crowded nature of SMR/AMR technology development (abundant capital and attention at that part of the nuclear value chain) and the lack of a clear set of winners should push us to areas of relative scarcity where solving key bottlenecks can accelerate the market overall (i.e. turning those hundreds of companies into customers immediately).

So where is there scarcity in the market that aligns with our investment horizon and approach? Three areas stand out initially, with each meriting a separate deeper dive should conversations with experts, operators, and founders push us in a certain direction. 

Fuel → Russian (and increasingly Chinese) dominance of key fuels and processing steps risks putting the West in a vulnerable position, which echoes the overreliance on cheap gas that contributed to the Ukraine invasion and the European energy crisis. Significant efforts are underway to de-risk the nuclear fuel supply chain. The US Congress passed a bill to limit the import of Russian uranium, while the “Sapporo 5” (Canada, Japan, France, the UK, and the US) announced plans to jointly invest over $4b to boost enrichment and conversion capacity over the next three years.

The biggest risk to decoupling from Russia has been HALEU (high-assay low-enriched uranium), which advanced reactors are being developed to run on. Until the Russian invasion of Ukraine, Russia was the only place with a plant licensed to produce this material. Companies like Centrus Energy and a still-under-the-radar Founders Fund-backed startup are targeting this bottleneck, which could be an important enabler of the broader advanced nuclear space.

Project Development → Over the last two years, much of my work has centered on how to best help “emerging industrial” companies scale more effectively. While my early assumptions largely centered on the need for new financing models, the critical bottleneck to deploying new projects across energy, materials, and manufacturing often turned out to be the scarcity of capable on-the-ground project development and execution. Given the deterioration of the nuclear talent base across most Western countries, this problem is even more severe.

A key problem with effective (i.e. on time, on budget) project development is the fragmentation of the subcontractors needed to build a project end-to-end. Companies are aiming to solve this through a reactor-agnostic platform for nuclear project execution. Through a comprehensive oversight strategy – taking direct control over the supply chain management, sourcing, workforce coordination, and financing required for constructing power plant fleets – these companies hope to do for nuclear what SpaceX did for launch. Others are building fully modular, factory-made systems, innovating on the delivery and development model rather than at the reactor level.

Waste → Waste remains perhaps the most politically and socially charged aspect of nuclear energy, leading to decades of information warfare despite the relatively boring nature of the problem. Historically, the commercial incentives to store waste weren’t particularly attractive, making it a largely political endeavor. 

Today, countries and companies around the world are starting to see opportunities to turn waste from a burden into an opportunity through recycling.

July

Thesis

AI-enabled Services

We see an interesting opportunity in the context of end-to-end AI service providers. 

September 2024

Industry Transformation

We see an interesting opportunity in the context of end-to-end AI service providers. 

We believe that in certain cases, AI sold as a SaaS product can neither unlock its full potential nor allow developers to capture the value they are creating. There are a few reasons for this:

  • The limited reliability and lack of guaranteed perfect performance of AI models have led to their positioning as co-pilots rather than end-to-end task performers. A few use cases aside (e.g., coding), we don’t see such end-to-end task performers coming to market anytime soon. This means that successful deployment depends on adoption by a customer’s workforce. Naturally, this makes the ROI of these systems hard to measure and is paired with a sentiment shift that the productivity increases associated with those systems might have been overhyped. The fact that an intangible ROI is running against a very tangible cost of inference for developers does not make this any easier.

  • In a co-pilot world, breaking out becomes even harder for emerging companies. They are at a structural disadvantage to established companies that can easily develop and distribute a co-pilot to their existing customers as part of their platforms. This is especially tragic for emerging companies because they require feedback loops and data to improve their algorithms. Without this, they inevitably fall behind the competition in terms of technical capabilities.

  • Pricing models that work in the context of software (e.g., seat-based) don't work in the context of AI, as the focus is often on productivity gains (i.e., getting more done with fewer seats). Therefore, there is a need for value-based pricing.

As a result, we see an interesting opportunity in the context of end-to-end AI service providers. These companies focus on one specific job and guarantee its successful execution and delivery. Internally, these businesses will use AI as much as possible, with high-quality domain experts who can step in to resolve issues and ensure successful job delivery. Over time, these companies accumulate substantial proprietary data from “end-to-end” feedback loops of delivering jobs. This holistic approach puts them in a unique position to develop best-in-class models for a specific use case, leading to increased automation. In the long term, the COGS of these businesses will converge toward the cost of computing.

In a lot of industries, professionals are either already using extremely sticky software or the software as it is offered doesn’t fit in the specific workflows (it is reasonable to assume that capitalism has led to every software niche being explored in the past 15 or so years). As mentioned above, many of the companies that have successfully acquired users at scale are already planning to roll out co-pilots as features of their products. For an AI-native company to build out a software suite and spend substantial amounts on customer acquisition is likely not the best use of resources.

This is circumvented by offering the delivery of an entire job and delivering results that are compatible with the legacy software customers might be using. Over time, these companies might decide to build out or buy software platforms on top of their AI-enabled service, interlocking themselves with their customers’ processes and generating meaningful lock-in (cf. Nvidia’s software stack, Marvell’s IP portfolio).

In the cases where conditions for AI-enabled service to emerge exist (see criteria below), we see this as having potentially three effects on market structure:

  1. Consolidation: Some industries may see a consolidation where AI enables a few large players to dominate by integrating and scaling more effectively than before.

  2. Maintained Concentration: In other industries, concentration may remain the same, but new AI-enabled companies could compete head-to-head with existing large players, potentially reaching similar sizes and profitability.

  3. Fragmentation: Certain industries might experience fragmentation, where AI enables many smaller players to operate independently. This could lead to a need for new platforms or aggregators to organize and leverage these fragmented services.

We think the most interesting venture cases will emerge in the context of 1. consolidation and 2. maintained concentration. In the context of 3., it is interesting to explore the next-order effects (see Marketplace for AI-enabled Services and Products).

Independent of the structural market outcomes, the occurrence of AI-enabled service businesses requires specific conditions to emerge and thrive. We differentiate two types of requirements.

First, the necessary conditions must hold for these businesses to pursue opportunities at all. Many of these opportunities, however, may become commoditized, leading to numerous profitable but modestly sized businesses, typically in the $10 million revenue range (i.e., fragmentation).

Therefore, for market outcomes 1. and 2. and venture-scale outcomes to occur, opportunities must have the potential for significant competitive advantages, or "moats." These moat conditions are likely present in only a small subset of AI-enabled opportunities.

Our primary objective is to identify and focus on the most promising categories where both the necessary and moat conditions are met. These categories represent the most attractive opportunities for substantial growth and success in the AI-enabled service sector.

Necessary Conditions

  • Objectifiable Outcomes: Objectifiability in outcomes is crucial to 1) training the models and 2) lowering transaction costs with customers. 

  • Easy to Hand Off from/to customer: Easy hand-off is critical to lower transaction costs and make sure the business can scale without integration work, etc. 

  • Technology Maturity: Technology utilized to deliver services needs to be sufficiently mature, or there needs to be a clear path for the technology to mature. In the beginning, human labor supporting the delivery of services is fine, but there needs to be a clear path to attractive unit economics with close to 90% automation.

  • Value-based Pricing Possible: The thesis depends on a company's participation in the upside of what is 1) generated at a lower cost or 2) made possible due to a new service. It is critical for the economics that the service provider can sufficiently capture the value generated to ensure top-notch margins that are improving as the technology matures. 

Moat Conditions (at least one or two of them need to be true)

  • Enabling Technology Moats: Using off-the-shelf LLMs will not lead to sustained technology moats. Technology moats will emerge where the quality of the service offered relies on developing and integrating with other types of ML / AI, which will lead to some initial barriers to entry from a technical perspective.

  • Improvements in Technology from Feedback Loops: Building on the previous point, another source of possible moat is technology that improves through feedback loops as services are delivered to customers and lessons are learned. This means the market leader can improve its technology fastest, leading to winner-takes-most situations.

  • Sustained Unique Customer Access: Efficient customer acquisition for many of these businesses will be key to delivering top-notch profitability margins in the long term. Categories where certain types of companies have unique access to customers (e.g., due to industry pedigree) are likely to be attractive, especially when paired with the technology feedback loop condition outlined above.

  • Liability Underwriting: The assumption is that many of these service businesses will have to take on liability for correctly delivering their services. If liability is a crucial aspect of the offering, the amount of risk a company can take is a function of the cash on its balance sheet, which serves as a buffer for potential failures; better-capitalized companies can therefore be more aggressive.

  • Regulatory Moat: Benefit from licensure requirements and other regulatory hurdles, which act as a natural barrier to entry and provide a stamp of trust and credibility. However, it is unclear whether this is actually the case. Lawyers require licenses, but barriers to entry are fairly low and based on academic performance. If the underlying models are similar, won’t everybody or nobody get approved? 

  • Brand / Trust: Service businesses are inherently built on a high degree of trust and brand recognition. This trust and brand enable customers to know that somebody can be held accountable and that their choice won’t be questioned by their bosses or investors (e.g., hiring McK, BCG, or the Big 4). It is likely that the same dynamics play out here and that this can be a source of sustained competitive advantage.

July

Thesis

AEC Design Tooling

When will we see Figma for the build world?

February 2025

Industry Transformation

Autodesk is a $65B company with 90% gross margins and earnings growth of 10%+ annually over the past decade. It is, in the views of many practitioners in the ecosystem, a monopoly in the worst sense of the word – extractive price increases paired with degrading product quality, closed and proprietary standards that lock in customers, and a lack of feature-level evolution to meet the needs of architects, engineers, designers, and contractors.

But Autodesk is just a symptom of a deeper problem. The entire AEC technology stack has evolved to reinforce silos rather than break them down. Each specialty has its own tools, workflows, and data formats, creating artificial barriers between disciplines that naturally need to collaborate. The result is an industry that remains extremely inefficient – construction productivity has historically grown at under 1% annually despite billions spent on software.

Perhaps counterintuitively, given the stranglehold Autodesk (and other deeply ingrained products) holds, our early hypothesis is that focusing on design is the perfect wedge to transform this massive industry. Every project's design phase naturally brings together architects, engineers, contractors, and owners – each making decisions that cascade through the entire project lifecycle. This, in turn, creates the possibility to develop network effects (the type enjoyed by Autodesk) once at scale.

The question, then, is what creates the wedge for companies to build distribution in the first place and kick off the network effects flywheel – something that has been a challenge for new entrants, as evidenced by the lack of massive venture-backed outcomes to date. We believe several converging technologies are coming together to massively reduce the costs of experimentation, lower the barriers to real-time design collaboration between parties (minimizing the cascading delays that begin at the design phase), and expand the creative canvas of design possibilities.

  • WebGL and cloud-native architectures finally enable true browser-based 3D modeling at scale. Just as Figma used these technologies to make design collaborative, new platforms are rebuilding BIM from first principles for seamless multi-user collaboration.

  • Advances in physics-based simulation and generative AI allow instant validation of design decisions – emerging tools can compress structural engineering workflows from weeks to minutes and automatically optimize building systems for performance.

  • New platforms are bridging design and construction by translating BIM models directly into fabrication instructions, creating the potential to significantly reduce MEP installation costs.

We see three approaches emerging to leverage these technologies and begin embedding them into multi-stakeholder workflows:

  1. Next-gen cloud BIM platforms (e.g., Motif, Arcol): Browser-first collaborative design tools – i.e. "Figma for buildings". Here, we believe companies can build momentum through counterpositioning – API-first models that prioritize open document and data standards.

  2. AI-powered point solutions (e.g., Genia, Qbiq): Focused tools that dramatically accelerate specific workflows. Genia automates structural engineering analysis and optimization, while Qbiq uses AI to generate space planning options for real estate teams.

  3. Design-to-fabrication platforms (e.g., Stratus): Bridging the gap between design and construction by automatically translating BIM models into fabrication-ready instructions. Stratus has shown particular success in MEP, where better coordination can significantly reduce installation costs.

The path to end-to-end orchestration will follow a clear sequence: Start by connecting architects and engineers through real-time design collaboration. Then extend to contractors, automatically translating designs into construction planning. As the platform becomes the system of record for design and planning decisions, it can naturally expand into procurement, payments, and project financing – using its unique data position to reduce risk and unlock better financial products. Eventually, these platforms could have a shot at orchestrating the entire building lifecycle – from initial concept through operations and maintenance.

Most importantly, these platforms will enable fundamental shifts in business models and incentives. Today's hourly billing and fixed-fee structures actually discourage efficiency – architects and engineers are paid for time and deliverables, not outcomes. But platforms that can measure and validate impact could enable new performance-based pricing models. Early adopters might start with simple metrics like design iteration speed or coordination time saved. Over time, as platforms gather more data across the building lifecycle, they could facilitate true outcome-based contracts where designers and engineers share in the value they create through better, faster, more efficient projects.


July

Thesis

Geospatial Intelligence

The complexity of understanding and managing our physical world is increasing exponentially.

January 2025

Infrastructure Tech

Why is this important?

The complexity of understanding and managing our physical world is increasing exponentially. Climate change creates both acute (e.g. wildfires) and chronic stress on our (aging) physical infrastructure. Supply chains are becoming more intricate and, in the face of geopolitical tensions and the energy transition, are reconfiguring on a global basis in real time. 

Geospatial intelligence – novel physical world data captured by optical, multi-spectral, hyperspectral, and other advanced sensor systems via satellites, ground stations, and other modalities – represents a critical substrate for building the advanced action layers (e.g. truly comprehensive world models) that will power fundamental industry transformation in areas like mining, energy, agriculture, and defense.

However, the trajectory of the geospatial intelligence market has largely been a story of massive perceived potential and disappointing results for builders, investors, and customers. While the use cases have been evident for decades, commercial value at scale has been slow to materialize, and the net ROI of most earth observation companies has likely been negative. Adoption has been broad but shallow – few commercial customers spend more than $1m per year on data and value-added services related to geospatial intelligence. Leaders on the upstream data collection part of the value chain (like Airbus and Maxar) still rely on government customers for a majority of their business, while companies like Planet Labs struggle to project commercial demand from quarter to quarter, indicating a lack of urgency around the data and analysis being offered.

Solving the bottlenecks around geospatial intelligence that have kept deep commercial adoption out of reach – namely expensive data acquisition costs (for high-fidelity data), fragmented data accessibility, and a lack of connectivity from data to core enterprise/industrial workflows – has substantial implications for economic growth and human flourishing. The World Economic Forum projects that geospatial intelligence, as a platform technology, has the potential to drive more than $700 billion in economic value annually by 2030. A vast majority of this value will be created in critical physical industries – transforming land use, mitigating natural disasters, changing how we build and maintain infrastructure, reducing greenhouse gases, and addressing security and safety issues more proactively. 

Why is this interesting? 

We believe these bottlenecks are finally beginning to fall thanks to two converging factors – technological step-changes and the emergence of urgent buyers for the key technological building blocks that will make cheap, precise, and actionable geospatial data possible. 

  • Launch costs have fallen 80-90%, enabling massive sensor deployment. While it took nearly 60 years to put 2,000 satellites in orbit, we launched 3,000 in 2023 alone

  • Next-generation sensors are achieving unprecedented coverage and precision. Today's systems can detect not just the presence of objects but their composition and behavior from hundreds of kilometers away, at sub-meter resolution

  • AI and compute advances have collapsed processing times and made it possible for non-specialists to make sense of multi-modal data – what took human analysts years now often takes minutes

The demand-side pull, while still not fully materialized, is equally important and developing quickly:

  • Insurance companies – and the entire insurance model – face existential pressure from climate-driven catastrophic losses (and regulatory intervention). Beyond risk assessment capabilities, improved, more transparent/accessible tooling can help to rebuild trust in this important segment of the financial system. 

  • Autonomous systems (and with them shorter decision-making windows) are increasingly factoring into defense and intelligence operations, putting a premium on breaking down the current data silos to develop advantaged (precise and real-time) sensemaking capabilities.

  • As we have observed through KoBold, the energy transition is creating entirely new customer segments (and forcing agility from large incumbents) focused on critical mineral discovery, methane detection, and other resource categories like water or forestry. 

  • Infrastructure operators, utilities, and construction firms are scrambling to maintain the trillions of dollars of assets needed to reindustrialize, electrify, and – more critically – simply keep the modern way of life (e.g. clean water) running. Proposed initiatives like The Stargate Project create another major tailwind for the geospatial intelligence market.

These are just a handful of the use cases we have been most exposed to through our investments and research. Like most great platform technologies, though, we believe many of the most valuable applications will be emergent. Thus, as we look at investments in the category, we are most interested in companies positioned to surf rather than compete with the dual blast radii of LLMs and space launch.

Which areas are most investible? 

Sensor Advantage / Infrastructure → While much of the sensor stack is being commoditized, competition at the powerful world model level (e.g. Niantic’s LGM) will drive demand for truly differentiated imaging and sensor suites. High precision, platform agnostic, high bandwidth, and real-time hyperspectral imaging stand out.

Data Fusion → As launch (and other sub-orbital geospatial sensor deployment) grows exponentially, data generation will scale along with it. If the status quo holds, silos and the need for bespoke solutions will only worsen. There is a Snowflake-scale opportunity to build data warehousing and piping for multi-modal geospatial data.

Geospatial Data as an Industry Transformation Wedge → Similar to Gecko in robotics, we believe the most valuable geospatial companies won’t be thought of as geospatial companies when all is said and done. Instead, we see major opportunities to use geospatial data as a wedge to build the workflows and intelligence engines that transform physical industries.

July

Thesis

Industrial Energy Efficiency

Energy demand is rising for the first time in over a decade thanks to rapid electrification, reshoring of manufacturing, and perhaps most notably, AI.

January 2025

Industry Transformation

Energy demand is rising for the first time in over a decade thanks to rapid electrification, reshoring of manufacturing, and, perhaps most notably, AI. This demand is being driven top-down by policymakers and bottom-up by the private sector. Legislation like the IRA and the CHIPS Act has driven significant growth in new manufacturing construction. Meanwhile, energy constraints have overtaken GPU availability as the core bottleneck to scaling AI for companies like Microsoft, Google, Meta, and OpenAI. 

The willingness of big tech companies to spend whatever is necessary to access energy in the name of AI has led to amusing re-estimations of future data center energy demand every few months.

“[Our expectation of] 83GW is up from ~56GW from the prior September 2023 modeling. Overall McKinsey now forecasts US data center energy consumption in terawatt hours (TWh), rising to 606TWh in 2030, representing 12% of total US power demand. Critically, this is up from ~400TWh in the September 2023 modeling refresh. This is relative to 147TWh in 2023 and 4% of overall US power demand.”

Meeting this energy demand, whether in service of climate objectives or of geopolitical, energy, and technological sovereignty priorities, is of existential concern to economies around the world. As the saying goes, there is no such thing as an energy-poor rich country. Europe, in a trend that has continued since Russia invaded Ukraine, continues to struggle to meet the energy-related needs of its industrial champions. This has pushed them in droves to the US and other geographies, putting the continent’s competitiveness, productivity, and growth at risk. 

Energy abundance generally and response to data center demand specifically hinges on three important pillars: new power production, better transmission and distribution, and more efficient utilization.

As highlighted in other research, owning and operating physical assets can provide companies with a tremendous moat and allow them to capture more of the value they create. For this reason, companies focused on new power generation or physical infrastructure related to better transmission and distribution are interesting. However, such opportunities are often held back by factors like permitting that are outside their immediate control. 

Efficiency, on the other hand, is a problem best addressed by software and AI. This is particularly true for commercial and industrial buildings, which account for ~20% of final energy use (and rising thanks to the growth factors highlighted above). In some places, like Ireland, data center use alone promises to consume nearly one-third of grid capacity in the near future. As energy costs become a more substantial profitability factor and increased consumption puts pressure on sustainability objectives, better solutions for commercial and industrial energy efficiency represent one of the biggest opportunities of the next several decades.

Many of these operations have concrete optimization functions with goals and constraints. In many cases, however, the complexity involved is too great for humans to grasp, so we fail to set up appropriate optimization functions and systems around them, leaving operations far from their global optima. That’s where we see massive opportunities for reinforcement learning. Advanced RL makes it possible to optimize areas that were previously infeasible to address due to their complexity. 
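To make the optimization framing concrete, here is a deliberately stylized sketch using data-center cooling as a hypothetical example: choose a chiller setpoint that minimizes energy use while keeping server inlet temperature within a safety bound. The toy models, numbers, and the brute-force search standing in for a learned RL policy are all illustrative assumptions, not any company’s actual approach.

```python
# A stylized version of the constrained optimization described above.
# All curves and constants below are invented for illustration only.

def energy_use(setpoint_c: float) -> float:
    """Toy model: colder chiller setpoints cost more energy."""
    return 100.0 + 3.0 * (27.0 - setpoint_c) ** 2

def inlet_temp(setpoint_c: float) -> float:
    """Toy model: server inlet runs a few degrees above the setpoint."""
    return setpoint_c + 4.0

MAX_INLET_C = 27.0  # hypothetical safety constraint

# Search feasible setpoints for the lowest-energy operating point.
candidates = [18.0 + 0.5 * i for i in range(20)]  # 18.0 .. 27.5 °C
feasible = [s for s in candidates if inlet_temp(s) <= MAX_INLET_C]
best = min(feasible, key=energy_use)  # warmest setpoint still within bounds
```

Real facilities involve thousands of interacting variables and time dynamics, which is precisely why RL, rather than exhaustive search, becomes necessary.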

Managing the energy usage of highly energy-intensive operations (e.g., data centers, cooling facilities) fits these criteria. RL models are capable of driving significant performance improvements autonomously, saving substantial energy and cost. Phaidra, one company that applies these models, was started by former Google employees who deployed these methodologies at Google data centers and saw up to 40% energy savings. The company recently announced energy savings of 16% at Pfizer’s data centers. Meta has published similar efforts. 

One of the key questions is whether there is enough quality data from sensors to support these plans and whether the physical world (and its controls) is digitized enough for models to drive actions in it. It is likely reasonable to assume that digitization has penetrated far enough to provide reasonable granularity and actionability, though more data and more actionability will only improve results. 

This field sits at the intersection of two areas that are core to our broader AI theses: 

  1. Massive economic value creation will happen in the physical world.

  2. The most interesting AI use cases are in areas where AI helps us develop an understanding of the world where complexity is so high that we previously could not. In previous writing, we have referred to these as “expansion” use cases. 

Moreover, similar to KoBold, we expect that building a company in this space will require hiring world-class people across various fields: 1) AI/ML, 2) software, and 3) optimization of niche systems. We believe a company that combines these three talent sources will build up substantial talent moats.

July

Thesis

Spontaneous Software

As LLMs can create software cheaply and agents become skilled at connecting user experiences in novel ways, companies are starting to push ideas around self-assembling/spontaneous software.

January 2025

Fundamental Consumer

As LLMs can create software cheaply and agents become skilled at connecting user experiences in novel ways, companies are starting to push ideas around self-assembling/spontaneous software. We believe that, enabled by LLMs, a new paradigm could be on the horizon that increasingly merges the creation and consumption of software and makes a longstanding vision a reality.

We have previously written about this in the context of business software (see here), but we see an equally interesting opportunity in pro/consumer software and applications. It is important to stress that this is an incredibly nascent area with more questions than answers. 

A few of the questions we have: 

  1. Where does this happen? For these experiences to feel genuinely magical and software to feel spontaneous, LLMs must have maximum context on a user’s digital experience, data, and usage patterns across applications. The most likely place for this to live is within a user’s operating system. If operating systems are too slow to adopt it, apps will likely emerge instead. However, it is unclear how much staying power they will have, and how useful they will be if the tools and experiences they create are removed from, and not interconnected with, default operating systems. In that case, the default home for these experiences could be the interfaces of the large LLM providers. Claude has taken steps in that direction. 

  2. How do these systems’ privacy mechanisms work? As described above, they require a lot of context to feel magical. The question is how this context is handled privately. Some approaches, such as private cloud enclaves, mitigate risk, but there could be a world where these kinds of applications only take off once models 1) have memory and 2) can run on consumer devices (e.g., phones and PCs).

  3. What do monetization and business models look like here? It is unclear how much users will pay for custom software tools, especially if doing so requires work to create them. Only 30% of Android users customize their OS, and the current app paradigm has not trained people to pay for utility-type services (a result of free tools used for lock-in combined with ad-supported models). In a world where apps become cheaper to produce and likely more abundant (due to the same dynamics discussed here), it is unclear whether most users will simply use the apps increasingly available for niche use cases until software becomes completely self-assembling, anticipating users’ every intent ahead of time. 

If we find good answers to these questions, we will be excited about this space and its potential.  

July

World View

Digital Antidote

Touching grass.

January 2025

Fundamental Consumer

As culture becomes more homogeneous and consumption more solitary (a conundrum we wrote about in Airspace and Bubbles), consumers increasingly crave ways to identify with 1) physical brands, 2) physical/ephemeral experiences, and 3) smaller, local communities and their brands. 

While this can take many shapes, we see the potential to build significant businesses around these dynamics and keep our eyes open for them. To give a few examples: 

  • Next-generation sport leagues

  • Strong local restaurant brands and emerging subscriptions, events, etc.

  • Increased inflow into mega-churches that offer smaller group gatherings 

  • Local Fashion Brands (e.g., Bandit)

  • Athlete/chef retreats (e.g., Adam Ondra clinic; Mads Mikkelsen Japan Trip) 

  • Running clubs for dating

  • ...

That being said, there are some structural challenges around how scalable these things are and to what extent they are venture cases.


July

Thesis

LLM-enabled Toys (Care Companions)

LLMs are enabling novel embodied AI use cases.

December 2024

Fundamental Consumer

LLMs are enabling novel embodied AI use cases. We think it highly probable that within five years, most toys – from stuffed animals to action figures to Barbies – will have some kind of LLM-enabled voice capability. We see a few benefits associated with these LLM-enabled toys: 

Naturally, we believe that data privacy and safety are crucial to these toys being beneficial and successful. Therefore, we believe they must have the following properties: 

We see an interesting opportunity for a commercial player to emerge here. Specifically, we see an opportunity to build an operating system that meets the standards above and enables owners of IP and distribution to build on top. In addition, we see significant opportunities to extend this platform in other areas, such as elderly care.


July

Thesis

Outcome-Based Pricing

A dominant narrative around how economic models will shift in response to AI is that companies can now “sell the work”

December 2024

Infrastructure Tech

A dominant narrative around how economic models will shift in response to AI is that companies can now “sell the work” – replacing seat-based pricing or subscription fees with models more directly tied to the value they create. This trend mirrors the evolution of digital advertising, where sophisticated attribution and optimization layers emerged to maximize and capture value.

Early evidence of this transformation is showing up in software verticals with well-defined (and highly measurable) workflows. In October, Intercom reported that 17% of recent software purchases included outcome-based pricing for their AI capabilities, up from 10% in the previous six months. One customer using Intercom’s “Fin” chatbot, RB2B, said the system autonomously resolved 60% of customer support tickets in August, saving 142 hours of human work. At $0.99 per resolution versus $10 for human handling, this represents both dramatic cost savings and a new pricing paradigm tied directly to outcomes.
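The arithmetic behind that pricing shift is simple enough to sketch. In this toy calculation, the per-unit figures ($10 human handling, $0.99 per AI resolution, 60% autonomous resolution) come from the example above, while the monthly ticket volume is a purely hypothetical assumption:

```python
# Toy check of the outcome-based pricing math described above.
# Per-unit figures come from the text; ticket volume is hypothetical.

HUMAN_COST_PER_TICKET = 10.00   # human handling cost (from the text)
AI_COST_PER_RESOLUTION = 0.99   # Fin's per-resolution price (from the text)
AI_RESOLUTION_RATE = 0.60       # share resolved autonomously (from the text)

def monthly_support_cost(tickets: int) -> tuple[float, float]:
    """Return (all-human cost, blended AI + human cost) for one month."""
    all_human = tickets * HUMAN_COST_PER_TICKET
    ai_resolved = tickets * AI_RESOLUTION_RATE
    blended = (ai_resolved * AI_COST_PER_RESOLUTION
               + (tickets - ai_resolved) * HUMAN_COST_PER_TICKET)
    return all_human, blended

# With a hypothetical 1,000 tickets/month: ~$4,594 blended vs $10,000
# all-human, a roughly 54% cost reduction.
human, blended = monthly_support_cost(1_000)
```

The point is not the exact numbers but the shift in the pricing unit – from seats or hours to resolved outcomes.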

As AI capabilities accelerate, we expect a rapid build-out of supporting infrastructure focused on enabling and capturing this value creation and cementing this new economic paradigm. The demand side is already primed – companies face increasing pressure to deploy AI in high-ROI use cases, knowing their competitors (or new AI-native entrants) will if they don't. 

This dynamic is driving the emergence of several distinct outcome-based business models:

  1. Full-stack players aiming to fundamentally reshape the economics of critical industries (particularly those resistant to new technology adoption) represent the purest path to AI-driven outcome-based pricing. Companies like KoBold in mining aren't simply delivering a point solution to an existing value chain – they are using AI to transform how value is created and captured across the entire workflow. In doing so, they take on the full risk/reward that comes with attempting to reorient the economic structure of a deeply entrenched system. Similar opportunities exist in healthcare, where AI-driven approaches could dramatically reduce cost structures while improving patient outcomes, and in commercial real estate, where end-to-end platforms can reshape everything from building operations to tenant experience to energy management.

  2. End-to-end workflow solutions in well-defined/quantitative areas like sales (Salesforce) or customer service (Intercom, Zendesk). Here, we believe emerging AI-native players face a significant uphill battle. Incumbents that cover multiple steps of a company’s workflows have data, distribution, and value attribution advantages, while more companies are pursuing internal builds through "spontaneous software" tooling or by leveraging commodity infrastructure (LLMs) to develop custom solutions – as Klarna recently did to great fanfare and apparent success. The company’s OpenAI-powered chatbot is “doing the work of 700 people” as it handles two-thirds of the company’s service interactions. 

  3. Infrastructure players are emerging to accelerate the adoption of outcome-based business models for AI services. We see opportunities for new solutions to handle attribution (measuring AI's impact across complex workflows), market-making (matching AI capabilities to business problems while optimizing for ROI), and financial infrastructure (enabling novel pricing structures). The parallel to mobile advertising is particularly instructive – companies like AppLovin didn't just facilitate transactions, they fundamentally transformed how value was created and measured in their market. These infrastructure players won't just serve existing markets – similar to Stripe in software, they'll expand the opportunity by making it possible for new types of AI services to emerge and scale.

  4. We also expect to see the emergence of teams that develop superior "process power" in AI implementation. Similar to how some organizations mastered lean manufacturing or agile development, these teams will systematically identify industries where AI can collapse cost structures (while maintaining value delivered), rapidly prototype and deploy AI solutions that replace expensive, manual workflows, and build durable institutional knowledge about which AI approaches work best for specific business problems.

    One way of thinking about this opportunity is as a modern version of Rocket Internet or Thrasio, but instead of geographic arbitrage or aggregation plays, they'd specialize in AI-driven transformation of stagnant sectors via an integrated product and go-to-market engine that allows them to capture a commensurate share of the value they create in an ecosystem. Perhaps a more ambitious framing is that a new class of private equity giants will emerge from this paradigm of buying and improving software and service businesses with AI (i.e. modern Constellation Software). 

Unsurprisingly, we believe the most attractive opportunity lies not in incrementally improving existing services with AI, but in fundamentally reimagining how industries operate. This leads us to two areas specifically that we are most intrigued by:

  1. Infrastructure providers enabling more precise outcome measurement, verification, optimization, and value capture across the AI services economy.

  2. Full-stack players who combine AI capabilities with deep domain expertise to fundamentally transform industry economics.

July

Thesis

“Scale as a Service” for the Bio-Industrial Economy

Over the past 25 years, the emergence of "scale-as-a-service" has powered multiple "invisible innovation" waves in the software world.

November 2024

Infrastructure Tech

Over the past 25 years, the emergence of "scale-as-a-service" has powered multiple "invisible innovation" waves in the software world. Infrastructure begets applications begets the need for more infrastructure, and so on. Platforms like AWS, Shopify, Stripe, and Twilio build scale on behalf of their customers in important but non-core functions and enable access via API. Over time, emerging companies bundle infrastructure from various scale-as-a-service providers, making it possible to go bigger faster or target previously unaddressable niches. Thanks to the programmatic nature of interactions, scale-as-a-service solutions minimize coordination costs and maximize control, enabling precision execution that aligns with a company’s desired speed, scale, and budget.

As scientific breakthroughs make biology more programmable, the why now for Scale-as-a-Service models is approaching a tipping point – but with important nuance. While AI represents a powerful enabler of new product and process design, the reality of biological complexity means we first need better tools and data to model and manipulate processes. As Niko McCarty notes, even the most significant AI breakthrough, AlphaFold, reveals the gap between static prediction and biological reality. Scale-as-a-Service providers can help bridge this gap by industrializing high-quality, standardized data collection across the design and production process. A 2023 study of biomanufacturing bottlenecks found that companies consistently struggle with purification, continuous processing, analytics, and expertise gaps – all areas where specialized infrastructure providers can play a Shopify-like role.

Meanwhile, dominant macro trends like the energy transition and US-China competition are pushing companies and countries towards more sustainable and domestic production models. Half of the world’s largest companies are committed to net zero, “reshoring” continues to grow in popularity on earnings calls, and the Biden Administration has set targets like producing 30%+ of the US chemical demand via biomanufacturing pathways within 20 years.

While first-generation companies like Ginkgo and Zymergen have struggled massively, select second-generation companies like Solugen show signs of staying power. If (still a big if) these next-gen companies prove the economic viability of bioproduction, we expect to see several successful scale-as-a-service providers emerge. These companies will become foundational platforms for trillion-dollar industries like chemicals, materials, energy, agriculture, CPG, and food, where bioproduction remains at pre-commercial scale. Like the internet, the invisible innovation waves created by this infrastructure-application cycle may show that the market for bio-enabled solutions is larger and more diverse than we could have imagined a priori.

We expect most successful scale-as-a-service providers to start with asset-lite approaches. Expanding upon Chris Dixon's "come for the tool, stay for the network" insight, these companies will initially aggregate supply, demand, and attention through useful data and coordination tools. From there, they will evolve into market orchestrators, connecting buyers with sellers and unlocking new capital flows. Eventually, many will build out physical infrastructure at scale, becoming the operating systems of the bio-industrial economy.

July

Thesis

Prediction Markets

Prediction markets represent a fundamental transformation in how society aggregates and values information.

November 2024

Infrastructure Tech

Prediction markets represent a fundamental transformation in how society aggregates and values information. As trust in traditional institutions continues to erode, prediction markets will emerge as an efficient mechanism for pricing risk, surfacing truth, and reshaping decision-making across the economy.

Throughout the history of technology – particularly the internet – important platforms often begin in legal grey areas, where user demand outpaces regulatory frameworks. From eBay to Uber, Airbnb, and Spotify, the most impactful companies solve problems that existing systems cannot address – or, more precisely, where prevailing incentive structures baked into law by incumbents actively resist progress. 

While incumbent resistance will be significant, we believe there is an opening for new mechanisms of collective intelligence that align incentives toward accuracy and accountability.

This transformation aligns with our broader theses around the role of better data (often physical data) driving a shift toward more dynamic and precise information-centric business models. In the same way that pricing for digital tools is evolving from static per-seat licenses to value-based outcomes, prediction markets represent a step-function improvement in how we price and distribute information. Once people experience the power of real-time, market-driven signals – whether in election forecasting or project management – we see no going back to traditional polling or planning systems. Thus, we believe the long-term opportunity extends far beyond gambling or speculation – it's about fundamentally improving how societies and organizations make decisions and allocate resources. 

Amidst the “moment” prediction markets are having in the wake of the US presidential election, critics rightly point to fundamental challenges: the subsidies required to bootstrap liquidity in niche markets may prove prohibitively expensive, and many use cases beyond elections and sports could struggle to attract meaningful participation. While these are serious concerns, we believe they echo historical skepticism of other transformative markets. For example, at the outset of equity markets, stock trading was seen as gambling and was dominated by "bucket shops" where people placed bets on price movements without owning shares. Such activity was seen as zero-sum, manipulated, and socially destructive. Yet over time, infrastructure emerged to make securities trading safer and more accessible: mutual funds, for example, transformed speculation into investment, regulations built trust, and exchanges standardized trading.

A similar story played out in e-commerce. In the mid-1990s, conventional wisdom held that consumers would never trust online platforms with their credit card information. Amazon launched in 1995 to widespread skepticism, yet by creating infrastructure that built trust and reduced friction, e-commerce became not just accepted but essential. 

Our hypothesis is that we are in the 1995–2000 period for prediction markets – mass-market cultural awareness is growing and momentum is clear – but market penetration is little more than a blip in the overall picture. In the same way that mobile devices and social networks (among other things) provided the technological catalyst for deeper e-commerce penetration, we see AI (and AI agents) as a critical enabler of the next wave of prediction market growth. For example, by creating liquidity in thousands of micro-markets, AI has the potential to help users take more sophisticated portfolio approaches and contribute to a “utilization cascade” that shifts prediction markets from perceived gambling into new “standard” tooling for information discovery.

Success in this category will likely follow e-commerce's growth trajectory. While early adopters drove initial growth, widespread adoption required infrastructure that created trust and reduced friction. Today's prediction market leaders will similarly need to build both consumer-facing brands and backend capabilities. We expect to see an "Amazon of prediction markets" emerge – potentially Kalshi – that combines direct consumer reach with infrastructure that powers other platforms. This will enable an ecosystem of niche players targeting specific verticals or user segments.

A key question remains around where value ultimately gets captured. Just as e-commerce value accrued to new platforms, internet-native brands, and incumbents who successfully adapted (e.g. Walmart), prediction market infrastructure will create several winning archetypes. Beyond pure-play platforms like Polymarket, existing media and financial platforms that already own distribution – from ESPN to Bloomberg – could emerge as powerful players.

The opportunity extends far beyond any single vertical. By expanding the surface area of possible transactions, prediction markets could enable new types of information exchange that are hard to imagine a priori. Winners will start by dominating specific verticals where information asymmetries create clear value propositions (Books:Amazon::Elections:Kalshi), then expand into adjacent use cases as users become comfortable with the model. Those who can navigate the regulatory environment while building trusted brands will become essential infrastructure for the information economy.

July

Thesis

AI-driven SaaS Replacement

LLMs have started and will continue to bring down the costs of writing software.

November 2024

Infrastructure Tech

As we have discussed in many other category theses, we believe that in the AI era, many of the laws of physics that existed around technology and business models are changing, and much has been written about the proclaimed ‘End of Software.’ The argument goes something like this. 

LLMs have started and will continue to bring down the costs of writing software. This leads to a world where software is increasingly created for N-of-1 customers and is easily amendable over time. Ideating and building (or prompting) this software will increasingly shift from developers to non-technical users. 

As software creation becomes cheap, this poses a challenge to traditional software companies. Their core value proposition (development, maintenance, and hosting of software, customization, and customer support), business model, and moats are rooted in leveraging initial investments in brands, standards, and free cash flow into features and app ecosystems that cater to a heterogeneous customer base with little incentive to go elsewhere due to switching costs. Switching becomes substantially more attractive if the ‘perfect,’ highly personalized software is the alternative. This fundamentally challenges the business model of these companies. 

With that established, the key question is what the new paradigm might look like. 

There is a vision that, if LLMs and agents have access to all our data, software and interfaces will be generated in real time, on demand, and only emerge when they are needed. Fully in the control of users (or their LLMs/agents), this software costs only as much as the compute required to build it. While this vision is undoubtedly appealing, there are a few missing pieces: 

For one, we assume that it will take some time for models to generate end-to-end software applications. Until this is possible, someone needs to be responsible for ensuring the software works – not only from a technical perspective but also from a usability perspective. Just because a feature can be easily built doesn’t mean it should be built. Until models can fully understand context (at which point it is questionable whether there would even be a need for human-readable software), domain-specific product expertise will be required to build useful products for specific use cases. Moreover, the need for customer support – and for enterprise customers, somebody to lean on when things go wrong – will likely remain.  

As a result, we believe there is an opportunity to build a company here. This company will have certain features: 

  • Provide a platform that offers guidelines to non-technical users to create applications for their specific needs 

  • Have an in-house team of developers to guarantee that software is functional when LLMs fail 

  • Create a platform / App Store-style marketplace that enables developers to publish their applications and lets others use them

  • Provide a data platform and SDKs that match users to suitable features, whether already developed or novel, and make integrating those features easy

  • Business Model: 

    • Initial Development - one-off 

    • (Data) Platform - ongoing 

    • Developer Platform / App Store - marketplace take rate


Thesis

AI-Driven CAE

Modern CAE's transformation combines AI and deep learning, drastically improving physical design efficiency, creating opportunities for new hardware-makers, and challenging established OEMs.

October 2024

Industry Transformation

Computer-aided engineering (CAE) tools are the backbone of modern product development. As a result, they underpin much of our modern physical environment. Today, several key catalysts are shifting the playing field and creating long-awaited opportunities for new physical design companies to emerge and scale. 

  1. There is immense potential to commercialize the growing body of relevant research. CAE traditionally utilizes numerical solvers to simulate and analyze complex engineering problems (e.g., Finite Element Analysis (FEA), Computational Fluid Dynamics (CFD), and Multibody Dynamics (MBD)). Depending on the complexity of the problem and the computational resources available, CAE simulations can take anywhere from a few minutes to several days or weeks.

    In recent years, there has been increasing research on training deep-learning models on simulation data to create so-called deep-learning surrogates: models that leverage deep neural networks to approximate complex physical systems or simulations, providing fast and efficient predictions while maintaining reasonable accuracy (i.e., running complex simulations in seconds). Methods include data-driven (e.g., GNNs, neural operators, RNNs) and physics-driven (e.g., PINNs) deep learning as well as generative models. Technology maturation makes the opportunity ripe for the right team with access to data and the ability to learn from a constant feedback loop of testing these models against a verifiable physical reality, pushing the boundaries of these methods. Combining them into easy-to-use workflows can fundamentally change how engineering (simulation, in particular) is done at scale.

    An example from research by McKinsey on this: "One company in the power-generation sector, for example, used the approach to optimize the design of large turbines for hydroelectric plants [...] the company reduced the engineering hours required to create a new turbine design by 50 percent and cut the end-to-end design process by a quarter. Better still, the approach generated turbines that were up to 0.4 percentage points more efficient than conventional designs". This is the type of upside that is necessary to get the attention of potential customers in the space. 

  2. We are in the early days of a hardware supercycle. The top-down push by Western economies to reindustrialize and redevelop production capacity in the name of national security, climate change mitigation, and economic growth has driven capital and talent toward physical world innovation. With role models like Tesla, SpaceX, and Anduril leading the charge, hundreds of well-funded aerospace, energy, manufacturing, robotics, and transportation companies are emerging with demands for a modern physical design and engineering stack. This increased competition is pushing incumbent OEMs to experiment with new solutions for the first time.  

  3. AI-native business models create a wedge for new players. The deployment of advanced AI into foundational industries has thus far been marked by a shift in business models – KoBold owns its own exploration assets, while Isomorphic Labs shares in the economic upside of its IP. Similar value-based, consultative, and/or “forward deployed” approaches – a partner rather than software-vendor relationship – could create an opening for new players to gain traction with large customers and expand over time, avoiding the all-or-nothing sales cycles that have long protected existing leaders.

The combination of evolving customer demands, novel research, and new business models has formed the basis for an entirely new paradigm of computer-aided, AI-driven design and engineering tools. These tools unlock faster, cheaper feedback loops, shift workflows from linear to parallel, and enable emergent use cases, increasing both speed and quality in a way incumbents struggle to defend against.
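To make the surrogate-model idea above concrete, here is a minimal, illustrative sketch. It uses polynomial least squares as a stand-in for the deep-learning surrogates (GNNs, neural operators, PINNs) described in the research section, and `expensive_simulation` is a hypothetical toy function standing in for a real FEA/CFD solver; the workflow (sample designs, run the solver, fit a cheap model, query it near-instantly) is the same in either case.

```python
import numpy as np

# Hypothetical stand-in for an expensive CAE solver; a real FEA/CFD run
# would take minutes to days per design point, not microseconds.
def expensive_simulation(x):
    return np.sin(3 * x) + 0.5 * x**2

# 1. Run the "solver" on sampled design parameters to build training data.
rng = np.random.default_rng(0)
X_train = rng.uniform(-2, 2, size=200)
y_train = expensive_simulation(X_train)

# 2. Fit a cheap surrogate to the simulation data. Here: polynomial least
#    squares; deep-learning surrogates follow the same train-once idea.
surrogate = np.poly1d(np.polyfit(X_train, y_train, deg=10))

# 3. Query the surrogate at new design points in effectively zero time.
X_new = np.linspace(-2, 2, 5)
pred = surrogate(X_new)
err = float(np.max(np.abs(pred - expensive_simulation(X_new))))
print(f"max abs error at held-out designs: {err:.4f}")
```

The economics rest on step 3: once trained, the surrogate amortizes the cost of the original simulation runs across thousands of near-instant design queries, which is what enables the parallel, rapid-feedback workflows described above.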


Thesis

The Manufacturing Action Layer

As the cost of adding sensors to anything and everything in the manufacturing process has decreased significantly, the amount of data produced in the factory environment has exploded.

October 2024

Industry Transformation

The anticipated (and much-needed) manufacturing renaissance in the US and Europe – sparked by rising competition with China and a movement to invest in expanding domestic production capacity in the wake of pandemic-induced supply chain breakdowns – is hampered by several deficiencies. Among these limiting factors is the challenge of turning vast amounts of disparate industrial data into actionable insights and execution that drive true productivity gains.

As the cost of adding sensors to anything and everything in the manufacturing process has decreased significantly, the amount of data produced in the factory environment has exploded. However, conversations with people across the manufacturing landscape make it clear that the impact of production digitization continues to underperform expectations. 

More than a decade into the Industrial IoT wave, most data from the manufacturing process ends up – at best – brought together into static Excel files. And while platforms like Palantir’s AIP promise rapid transformation, the ground reality is that data from different systems continues to live only in the heads of individual operators – a critical risk in an industry with massive turnover and an aging workforce. The VP of Industry 4.0 at a ~$5B market cap automotive supplier recently remarked that they still lack the visibility to know whether a machine is even running in a factory without calling someone on the ground.

Incumbent software offerings in manufacturing are often stitched together over years (even decades) of acquisitions and integrations, resulting in a mess of fragmentation, technical debt, information silos, and process bottlenecks.

Given this backdrop – and the macro tailwinds that will continue to drive demand for next-gen manufacturing solutions – our initial hypothesis is that there are two interesting angles of attack for new companies: 

  1. Modern industrial control and execution systems capable of aggregating data across modalities and business systems, automating mission-critical operation and production activities, and assuming responsibility (via outcome-based pricing models) for driving efficiencies.

  2. Software-defined manufacturers aiming to displace incumbent manufacturers entirely through more efficient end-to-end approaches in specific verticals/use cases. 

Both models face challenges. The “base rates” for selling impactful digital solutions to manufacturers are mediocre at best, and the companies that have reached scale – platforms like Cognite, Palantir, and Samsara – have significant distribution advantages that narrower emerging entrants must overcome. For the “full stack” players, the scale potential is clear, but it remains to be seen whether venture capital is the right financing tool (“CNC machines with startup branding” is how one person described one of the companies to us).


Thesis

AI-enabled PCB Automation

It is a recurring meta-theme that we think AI will have a great impact on the physical world.

September 2024

Industry Transformation

Working Thesis

It is a recurring meta-theme that we think AI will have a great impact on the physical world. At the same time, we are convinced that companies that innovate around business models and take ownership of certain processes will unlock a lot of value, maximizing the value capture associated with their technology. 

One area that has caught our attention in this context is AI-enabled PCB layout. Printed Circuit Boards (PCBs) are the backbone of modern electronics, enabling a wide range of devices across industries: smartphones and smart home devices in consumer electronics; critical medical equipment like MRI scanners and pacemakers; engine control units and advanced driver-assistance systems in automotive; avionics and satellite communication in aerospace and defense; robotics and automation in industrial settings; and routers and cell towers in telecommunications. From the devices in our pockets to the satellites orbiting Earth, PCBs play an indispensable role in connecting and powering our technological world. As the complexity of end devices increases, so does the complexity of PCBs.

The increasing complexity of PCB layouts makes design more challenging: higher component density and miniaturization require intricate placement strategies and precision routing; managing multiple layers and implementing high-speed interfaces demands careful signal-integrity analysis and tighter manufacturing tolerances; and integrating mixed technologies complicates the process, requiring effective partitioning and thermal management. These factors necessitate advanced skills and sophisticated tools to ensure that designs meet performance and manufacturability requirements. That said, as shown in the table below (Source: Claude), the processes associated with correctly laying out a PCB already take around 50%+ of total PCB development time today. We expect this share to increase as PCBs grow more complex to keep pace with the novel applications we need them for.

It is our current assumption that increasing complexity will have a disproportionate impact on the effort and time it takes to create these layouts. Unlike schematic design, layout seems to be a fairly straightforward task requiring little strategic context: a PCB layout either works or does not against certain benchmarks, whereas schematics can be more ambiguous. We have seen significant progress in AI model development (especially reinforcement learning) that can automate and significantly accelerate parts of the PCB layout process.

The total number of PCB designers in the United States is 72,971, with an average salary of around $74k per year, giving a total salary base of roughly USD 5.4B. Automating a significant part (70%+) of their jobs offers considerable cost savings. This does not include the economic benefits of AI models accelerating the process and substantially reducing design hours, which is especially valuable at the higher end (e.g., aerospace, defense), where PCBs are highly complex and take orders of magnitude longer to design. Accelerating parts on the critical path into production is likely extremely valuable and hard to quantify through cost savings alone.
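The back-of-envelope math above can be reproduced directly. Note that the 70% automation share is the text's stated assumption, not a measured figure:

```python
designers = 72_971        # US PCB designers, per the figure cited above
avg_salary = 74_000       # average annual salary in USD
total = designers * avg_salary
print(f"total wage base: ${total / 1e9:.1f}B")           # prints $5.4B

automation_share = 0.70   # assumed automatable share of the work (70%+)
savings = total * automation_share
print(f"implied annual savings: ${savings / 1e9:.1f}B")  # prints $3.8B
```

As the text notes, this bounds only the labor-substitution value; faster time-to-production on the critical path would add upside not captured here.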

We have spent significant time thinking about the opportunities of AI-enabled outsourcing and services businesses and believe that PCB layout provides the structural environment for such a model to emerge:

  1. Objective benchmark assessments 

  2. Clear benefits to assuming responsibility for working output

We believe that a company capable of driving significant improvements here can grow large, with a wedge into a market that is otherwise hard for software companies to penetrate due to the dominance of Altium and others.


World View

European Defense

A new era of strategic autonomy and societal resilience

August 2024

Industry Transformation

With war on its doorstep and American attention shifting to the Pacific, Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. While Russia’s invasion of Ukraine exposed Europe’s lack of preparedness and home-grown capabilities, the conflict has shifted the perception of European builders, investors, and policymakers on the importance (and ethics) of developing and deploying critical technology to foster sovereignty.

The result has been a groundswell of momentum aimed at transforming Europe’s defense-industrial base; protecting European values by deterring Russian aggression in the near term and building the long-term foundations to project strength amid rising Great Power Conflict. 

In recent years, change has occurred at all levels – from the EIB’s updated views on defense technology and the European Commission’s first-ever Defence Industrial Strategy to the rise of an actual defense technology ecosystem in Europe for the first time, catalyzed by the momentum of lighthouse companies like Helsing, ambitious efforts like the €1b NATO Innovation Fund, and grassroots organizations like the European Defense Investor Network.

But expanding markets, increased capital flows, and narrative momentum don’t always imply attractive forward-looking returns. 

Despite the market’s growth, inertia, fragmentation, and protectionism rule the European defense market. While European defense spending has returned to Cold War levels, the continent still lacks urgency relative to geopolitical allies and rivals. The conflict in Ukraine has done little to unite European perspectives on the what, how, and who of building next-generation defense capabilities. The EU’s two largest economic and military powers – Germany and France – remain fundamentally split on the role of Europe in its own defense. This philosophical gap threatens to worsen the severe physical fragmentation of European defense forces – Europe operates 5x the number of vehicle platforms as the US. At the same time, the UK has increasingly shifted attention away from the continent towards the AUKUS coalition.

The US defense technology ecosystem, far more developed than Europe’s, inspires little confidence in what lies ahead. Through September of 2023, venture-backed companies had been awarded less than 1% of the $411 billion in Defense Department contracts awarded in the government’s fiscal year – only a slightly larger share than in 2010, when few startups were building military technology. And while companies like Anduril have shown that the path to scale is possible, the company’s success may end up making it the new technology distribution chokepoint instead of a bellwether for a thriving new defense ecosystem.

These factors present significant obstacles to building and scaling European defense technology companies. They may also present unique opportunities for a highly targeted investment approach in the space, aimed at turning the market’s weaknesses (e.g. fragmentation) into strengths and riding key catalysts that may help emerging companies overcome the inertia and sub-optimal industry structure.

Catalysts Creating Opportunity

To believe in the opportunity to invest in emerging European defense technology companies in the face of the incumbent market structure, we need to see significant technological, economic, social, and policy-related shifts that are, critically, available to emerging entrants and not incumbents.

Europe has been forced to prioritize resilience and strategic autonomy for the first time in a century. This has restructured the continent's procurement regimes, capital markets, and attitudes. The simple availability of early-stage capital for emerging defense and security companies in Europe cannot be overstated. With dozens of early-stage funds now focused exclusively or significantly on the space and later-stage investors slowly showing up, financing a defense technology company in Europe to the point of scale is now possible. As the EIB and other capital markets institutions continue to evolve their view, we expect many of the capital markets barriers to financing defense and security companies across the company life cycle will begin to fall away.  

Procurement remains a significant challenge, but the fragmentation of Europe creates opportunities for emerging companies to get to market faster by targeting smaller, potentially more agile countries more inclined to adopt new solutions. Greece, for example, now spends 4% of GDP to support a tech-forward defense strategy while countries close to the front in Ukraine have been forced to move quickly to adopt new solutions. 

The “primitives” for rapid, capital-efficient defense technology development have come online, making it possible for companies to ride multiple technological tailwinds to build solutions that meet the complex needs of government customers. Decreasing costs of hardware, enabled by advanced manufacturing, better (software) tooling, and the acceleration of physical-world foundation models make it possible for companies to develop complex defense technology systems at a fraction of the cost of incumbents. AI systems are already operating successfully in significant tests (like dogfighting with fighter pilots) and on the battlefield in Ukraine, which should drive more receptiveness from (risk-averse) buyers and users. 

Lighthouse companies and talent ecosystems are emerging around defense and national security-focused technology for the first time in Europe. The US defense technology ecosystem built momentum on the back of breakthrough companies like SpaceX and Palantir. The same pattern is playing out in Europe, with companies like Helsing and The Exploration Company forming the foundation for a thriving defense-industrial ecosystem in Munich. While less developed in terms of defense and space-focused technology, places like Stockholm (energy) and Paris (AI) have become beacons for talent in areas adjacent to national security. Meanwhile, London has captured much of the early-stage energy and likely represents a strong ecosystem to launch a defense technology company thanks to its physical proximity to Europe and cultural proximity to the US.

The Ukraine conflict has presented a unique opportunity for companies to develop proof points and revenue, creating a “backdoor” for future contracts with Western European governments. It has also highlighted the future of warfare. Rapid acquisition and deployment processes in Ukraine have helped companies generate real revenue and test systems in live situations. While larger Western European governments have been slower to respond, and more likely to simply drive business to existing primes, the proof points being developed by emerging companies should help their cases in (eventually) accessing larger, longer-term programs. Technologically, the predominance of electronic warfare has given a fundamental advantage to agile companies that can iterate rapidly to stay ahead of Russian competition. 

Key Insights

The following factors are the most significant in driving success for emerging European defense technology companies. These lessons are drawn from our company research and customer/expert interviews.

New defense companies are software-first, R&D-centric, and mission-driven. Incumbent defense contractors operate on a cost-plus business model, essentially building to the specifications laid out by government buyers and layering a “reasonable” margin (5–15%) on top. As a result, large primes spend less than 3% of their budgets on R&D and lack the incentive to innovate. Companies like Anduril and Shield AI, on the other hand, take on product development risk themselves and spend massively on R&D.

And while the hardware these companies build tends to garner the most attention, the software and autonomy systems underlying the hardware make everything work. Anduril’s Lattice platform ties together all of the company’s hardware products, fusing sensor data and creating an autonomous operating picture. This software-defined operating model drives better margin structures (Anduril targets 40% gross margin vs. Lockheed and other primes under 20%), allowing them to continue fueling an R&D flywheel.

Fragmentation remains the most challenging aspect of European defense. It may also present the largest opportunity. Europe’s fragmentation challenge needs little additional explanation. There is not one unified military-industrial complex on the continent, there are 27. Each has a different view on supporting its own national champions, different relationships with EU member countries, and divergent views on buying from outside (usually the US). This has resulted in a complex web of disparate capabilities (weapons systems, vehicle platforms, and communication models) that limit rapid response and collaboration.

Understanding this, and realizing that it is likely beyond the reach of one company to solve from a hardware (let alone cultural) perspective, is key to uncovering where opportunities sit. Helsing, for example, has leveraged its positioning as a multi-domain AI backbone to build early leadership around this concept. As cheap drones, cameras, and other sensors proliferate, the opportunities to coordinate the complex data and operational picture, solving capability and collaboration gaps through greater modularity and interoperability, grow larger.

Technology differentiation is table stakes. The most successful companies will possess a “secret” to navigating procurement. Despite the shroud of complexity surrounding defense procurement, success remains largely “champion-driven”, as Anduril CRO Matt Steckman recently remarked. Companies don’t win through better technology, they win by solving specific problems for people with influence in the buying process. Companies must simultaneously engage in long-term relationship building (including lobbying) to build trust with procurement influencers while developing relevant proof points in the field. One way of doing this, as Anduril demonstrated and emerging European players like Lambda Automata are attempting to replicate, is by viewing defense and security as a “conglomeration of micro markets” – which includes adjacent opportunity areas like public safety and border control.

Narrative momentum is highly rated but likely remains underrated by European founders. The traditional stereotypes of European vs. American founders seem to have played out in the early part of this new defense tech wave – from Anduril’s narrative mastery to the momentum of ecosystems like El Segundo to the sophisticated way some companies and investors have integrated themselves into the Washington decision-making systems. As in all markets, there is a reflexive nature to success in defense – the companies that win figure out how to craft a better story and more social proof to attract capital and talent in advance of fundamental traction. 

Distribution bottlenecks inherent in government and defense contracting are already contributing to market consolidation for emerging defense technology companies. Competing against defense primes means eventually competing in every domain they operate in. As software-first companies break through, the returns to scale and breadth might become even greater – platforms like Anduril’s Lattice get stronger as they consume more data and control more hardware assets in the field. Combined with the defense market’s natural bent towards consolidation, companies that can be “first to distribution” in a given area will be very hard to displace and will be strongly positioned to roll up interesting technology and talent, as Anduril has already started to do aggressively. (The sheer number of Anduril references in this document reflects its outsize and rapidly compounding success in this space!)

Emerging Investment Areas

There are several valuable defense market maps and landscapes worth evaluating to understand different ways of breaking up the market; perhaps the most comprehensive is this one from Quiet Capital’s Michael Bloch: National Security and Defense Market.

To avoid rehashing those efforts, our focus has been on identifying emerging themes that span multiple segments of such maps, supported by converging market, technology, and geopolitical tailwinds. While not comprehensive, these themes align well with the catalysts and insights above and are where we have seen several of the most interesting companies in our review – such companies tend to touch multiple themes.

Modularity and Interoperability → Leaning into the fragmented nature of European defense through solutions that aim to unite disparate operating systems and coordinate complex environments. While software capabilities will be the core connective tissue, hardware plays a big role as well. Cheaper, smaller, interoperable systems built to be easily adopted (both budget and technology-wise) can help accelerate initial deployment and provide companies with a platform from which to expand. 

Rapid Response → Building a more dynamic defense-industrial base by shortening time and cost to intervention across domains and operating areas. This ranges from faster kinetic capabilities (e.g. hypersonics and electronic warfare) to rapid manufacturing capabilities (e.g. Replicator) to faster deployment of machines and people (e.g. counter UAS swarms, labor coordination platforms) to systems that can be deployed (and as importantly, replaced) quickly. 

Multimodal Physical World Data and Intelligence → Wayve’s recent autonomous driving demonstrations showcased the speed at which multi-modal models are making their way into the physical domain. Along with the rapid decline of hardware costs, models that can reason more dynamically create interesting opportunities in defense, where operating environments are extremely fluid (i.e. not repetitive like pick and place, etc.) and thus pose problems for more rigid AI systems. Better simulation data will also continue to play an important role in preparing autonomous systems for live action. This represents a more horizontal theme and is thus something we might pursue a deeper dive into beyond defense.

Software for Hardware → The declining cost of hardware also creates room for better tooling, both at a collaboration/workflow level (i.e. Atlassian and GitHub for hardware builders) and at a design level (i.e. better CAD/EDA, “Figma for chips”, etc.). Fusion, a software platform developed and externalized by space launch company Stoke, highlights the need for better tooling to serve the hardware revolution. Enhanced IP and data security, along with the high levels of precision required for certain use cases, may create specific opportunities in defense.

Maritime Production, Security, and Infrastructure → Control over maritime infrastructure represents a significant geopolitical and economic advantage. Over the past decade, China has invested heavily in shipbuilding capacity. Today, a single shipyard in China has more production capacity than the entire US shipbuilding industry. However, the importance of maritime control goes beyond just shipbuilding. Undersea cables, for example, are the backbone of the global financial and communications systems – over 95% of the world's communications are carried by a network of roughly 500 cables laid across the oceans. These represent critical vulnerabilities that need to be proactively protected through better surveillance, kinetic deterrence, and cybersecurity technologies.

Combatting Digital Authoritarianism → Control of the digital economy is highly centralized, with cheaper data processing and engagement-centric business models (i.e. advertising) feeding the strength of a small number of powerful companies and institutions. This has led to democratic deterioration and a loss of trust in key institutions. It also creates a more straightforward surface area for attack and manipulation by adversaries – spanning consumer-focused influence campaigns to corporate IP theft. Technology that empowers sovereignty over assets and information, increases privacy, and enhances secure communication and collaboration represents a somewhat orthogonal, and bottom-up, approach to investing in defense and security, as the go-to-market model may not depend on large-scale government procurement.


Thesis

“Plumbers” of the Reindustrial Revolution

Like traditional plumbers, these companies are focused on high-stakes problems where failure carries outsized consequences.

January 2025

Industry Transformation

While neo-Primes and OEMs capture headlines and venture capital flows, specialized players solving critical service, infrastructure, and component-level problems will be fundamental to transforming the physical economy. 

We call these businesses the "Plumbers" of the Reindustrial Revolution because, like their namesakes, they occupy an unglamorous but essential (and hard-to-dislodge) position in their value chains. These companies are modernizing playbooks pioneered by industrial giants: Westinghouse in critical components, Bureau Veritas in trust and data, Schlumberger in technical services, and Grainger in supply chain orchestration.

Like traditional plumbers, these companies are focused on high-stakes problems where failure carries outsized consequences. Their businesses are built first on technical mastery and reliable execution, which fosters deep customer trust and loyalty. Competition remains limited not just because of technical complexity, but through the “niche” nature of their markets – rational actors won't deploy massive capital to displace established players in constrained categories like they might in unbounded markets. This creates a foundation for expansion into adjacent opportunity areas – deepening existing customer relationships or extending technical capabilities to expand TAM over time. 

A key theme across much of our research is how geopolitical competition is redrawing supply lines and catalyzing efforts to rebuild industrial capacity in Western markets. The existential threat motivating this has been a potential conflict with China. But even in a positive scenario where kinetic conflict is avoided – and even as “expected unexpected” events like DeepSeek’s R1 impact the Capex and energy equations – we (and others) believe the trend towards spending on reindustrialization will continue. 

Thus far, the narrative surrounding the Reindustrialization tailwind has primarily benefited companies at the "front end" – next-gen OEMs, new Primes, and companies building brands easily understood by generalist investors that control most of the growth capital in the ecosystem. This is reflected in valuations – the early champions have access to near-unlimited pools of cheap growth capital while earlier-stage players are priced at levels that assume near-perfect execution. While we share the market-level excitement about the new levels of scale the best companies in this market can achieve, we have been more circumspect in our approach to this category.

As competition continues to rise on the front end of the market, our hypothesis is that the most attractive risk-return opportunities will increasingly be found with the "plumbers”, which we see emerging across four primary categories:

Critical Components

Then → Westinghouse's air brake system, invented in 1869 as railway networks reached continental scale, transformed railroad safety and became an industry standard, which created the foundation for one of the largest industrial conglomerates of the 20th century.

Now → The new material, form factor, and communication requirements of modern aerospace and defense systems create opportunities for specialized component makers to become standards in critical subsystems, from wire harnesses to thermal management to energy storage. 

Trust & Data Engines

Then → Bureau Veritas built a global franchise by becoming the trusted verifier of maritime safety standards as international trade expanded rapidly in the 19th century.

Now → The confluence of aging existing infrastructure and the need for new development creates opportunity at the intersection of novel inspection technology and data analytics to become the system of record and intelligence for asset health, compliance, and built world capital allocation.

Superdevelopers

Then → Schlumberger became indispensable by mastering the technical complexity of oil exploration and production when the petroleum industry was rapidly expanding into new geographies.

Now → The energy transition as well as the emergence of “new Prime frontiers” (e.g. the Arctic and space) creates opportunities for companies that can i) develop proprietary technology suited for challenging environments, ii) develop project execution capabilities to integrate other solutions, and iii) master the regulatory complexity of operating in new areas. 

Supply Chain Orchestration

Then → Grainger was founded in the 1920s to provide customers with consistent access to motors as both the consumer and industrial markets for automotive and other powered machinery exploded.

Now → Electrification and UAV growth are driving demand for components like batteries, which are largely controlled by China and at increasing risk of tariffs and blockades. This creates new opportunities to build marketplace infrastructure for “democratic supply chains” and better supply chain coordination.

Across these different pathways, we think successful companies will share several characteristics:

  1. Natural capital efficiency and organic growth – Sharper focus avoids growth-at-all-costs capital strategies and expansion plans, fostering a more sustainable model for sequencing market opportunities.

  2. Rational competitive landscape – Perceived (initial) market sizes typically don't justify massive capital deployment by new entrants or existing players, while technical expertise and regulatory requirements create genuine barriers and, in some cases, help companies aggregate a portfolio of “sub-scale monopolies”.

  3. Value accrues to expertise (i.e. Process Power) – Deep knowledge of specific systems, regulations, or technical requirements becomes more valuable as complexity increases and companies either work across a broader segment of the overall value chain or integrate deeper into customer operations. 


1. The EDA market is one of the best examples of this. Companies like Cadence and Synopsys are both worth ~$80b and relatively insulated from competition because their TAM (as a % of the overall semiconductor market) and their cost (as a % of the overall semiconductor design and fabrication process) are small. From NZS Capital:

“As they're successful, they're able to layer on these new businesses that are really additive to the overall business. So they may not even be increasing in price, in a lot of cases, just selling more functionality, because chip designers need it. And it's a really important point to underscore that we're talking about this 550 billion TAM of semiconductors, and the TAM of devices on top of that is another step function. It's being enabled by this sort of 10 billion EDA TAM. It's really small, when you think about what they're delivering.”

“But the idea that more EDA could come in-house over time, it just seems really unlikely to me, in part, because it's just not a huge pain point for the customer. It's 2% of their sales, and they just get so much value for what they're giving, versus the effort to re-engineer all this stuff that's been created over the last few decades.”

2. Much like last decade where being the Uber or Airbnb for X was an unlock for high-priced early financing, the same is true today of companies promising to become the Anduril or Palantir for X.

3. This relates to our thinking on AI-enabled asset ownership/buyout opportunities.

July

Thesis

Transforming Clinical Trials

How can we massively speed up the timeline – and reduce the cost – of bringing new drugs to market?

January 2025

While the interplay of AI and better data is (finally) beginning to deliver on the potential of dramatically expanding the therapeutic opportunity space, these breakthroughs risk being stranded or significantly delayed without a transformation of the clinical trial process.

We believe several factors have converged to create an exciting ‘why now’ for companies building new clinical trial infrastructure.

  1. The post-COVID regulatory environment and evolved operating procedures have created a unique window for reimagining clinical trials. 

  2. Remote monitoring, decentralized trials, and real-world evidence have moved from fringe concepts to validated approaches.

  3. The explosion in AI-discovered therapeutic candidates is creating pressure to modernize trial infrastructure for both human health and economic reasons – it is estimated that the cost of clinical trial delays can be on the order of millions of dollars per day.

Our initial hypothesis is that winning companies will possess the following characteristics.  

  1. Vertically integrated, building parallel infrastructure instead of patching the existing system. The complexity and interconnectedness of clinical trials mean that point solutions will struggle to drive meaningful change. For n-of-1 companies to exist in this space they need control over the full stack – from patient recruitment through data collection and analysis. This approach is about more than technological self-determination. It also positions companies to innovate on the financial model of clinical trials towards better alignment among all of the key stakeholders (i.e. risk/upside sharing).

  2. AI (and post-COVID) native, designing their processes around modern capabilities rather than retrofitting them onto legacy approaches. This means leveraging AI for everything from protocol design to real-time monitoring while embracing decentralized/hybrid trials and remote data collection as first principles rather than accommodations.

  3. Built to capture the growth of AI-driven drug discovery (i.e. new companies) rather than competing for share in the traditional clinical trial market. This allows them to sidestep entrenched competitors to work with customers operating with the same true north of speed and technical advancement.

July

Thesis

Off-Road Autonomy

Reversing this physical world stagnation represents one of the largest economic opportunities of the coming decades.

January 2025

Infrastructure Tech

The Western infrastructure crisis is about more than aging bridges and roads (and elevators) – it's about our capacity to build, maintain, and modernize the physical systems that underpin productivity, economic growth, and strategic sovereignty. From critical mineral extraction for the energy transition to military logistics modernization to the massive manufacturing capacity needed to achieve reshoring objectives, we face unprecedented demands on systems that have seen little innovation in decades.

Reversing this physical world stagnation represents one of the largest economic opportunities of the coming decades. This is reflected in our work from several angles – most notably our investments in KoBold and Gecko, and through category research into energy infrastructure, sustainable construction, and defense.

It is easy to blame this stagnation on a lack of investment or an absence of vision among industrial (and bureaucratic) operators. But these are symptoms of the fact that physical world modernization – both digitization and automation – is not a monolith, and that the vast majority of the work that needs to be done is a fundamentally harder problem than commonly understood.

The environments where we have most significantly slowed down and thus where we most need automation – sectors like construction and defense as well as settings like logistics yards – are characterized by high situational diversity: dynamic conditions, variable tasks, and diverse equipment fleets that often stay in service for decades. While continuous process industries like chemicals and manufacturing have made significant strides in automation, these high-diversity environments have remained stubbornly resistant to transformation.

Automating heavy industrial vehicles – earthmovers, mining equipment, military Humvees – represents an important step to mastering these environments and fundamentally transforming the productivity equation in these industries. While much of the discussion around physical world automation has centered on robotics or on-road consumer autonomy (Waymo, Tesla, etc.), these vehicles sit at the intersection by unlocking both autonomous mobility and task execution/manipulation capabilities. They are the workhorses of our industrial system, will continue to be for a long time, and are just now starting to become equipped for autonomous operation. 

"Today you have a few thousand [autonomous] vehicles in mining, you have a few hundred vehicles in ag, you have dozens of vehicles in other verticals. I think we're really at the starting line now. Ag, for example, is nearly 3 million tractors. Obviously only a small percentage of those are big enough or productive enough to be automated. In construction equipment there's a million plus units. You look at something like mining, there's something like 60,000 dump trucks. So those are your upper bounds. But today the biggest successes are in mining where you've got north of a thousand units deployed, which, when you compare to on-road, is in a similar realm." – Sam Abidi, Apex Advisors

Technology Tipping Points → Our robotics research leads us to believe that the category is approaching (or reaching) technological tipping points on several fronts. While on-road autonomy has focused on well-marked roads and predictable conditions, industrial autonomy faces fundamentally different challenges. These environments demand systems that can handle unstructured terrain, weather variations, and complex interactions between vehicles, machines, and humans.

Several technological advances are converging to finally make this possible: 

  • Vision-language-action models (VLAMs) and advanced perception systems that can understand both geometric and semantic elements of complex environments

  • Mapless localization capabilities that enable adaptation to rapidly changing conditions without relying on pre-existing maps

  • Improved sensor fusion that can differentiate between traversable elements (like foliage) and true obstacles while understanding surface characteristics

  • Edge computing architectures designed specifically for ruggedized, industrial deployment

  • Robotic hardware improvements (e.g. dexterity) that can be incorporated into autonomous systems to unlock end-to-end operational capacity.

Talent and Capital Momentum → Along with the technological building blocks for this category, the talent seeds were planted over the last decade as capital and big visions fueled the first wave of autonomous vehicle company building. Frustrated by autonomy regulation and other bottlenecks, founders and engineers started to look for opportunity areas where product roadmaps – and commercial models – could be realized in 2-3 years rather than a decade. This led many to off-road autonomy – despite the much smaller TAM – and has fueled a flurry of company formation and funding in the space.

Investibility – From Action Layer to Systems of Collaborative Intelligence → Building on our thesis in vertical robotics, we see retrofit industrial vehicle autonomy as a powerful near-term lever for modernizing infrastructure. The economics are compelling: retrofit solutions can deliver substantial cost savings versus new autonomous vehicle purchases while allowing customers to preserve their existing fleet investments, which often have 15-20+ year lifespans.

We see a clear sequence for how companies build defensible positions in this category:

1. Action layer as a go-to-market wedge:

  • Target 80-90% automation of common tasks while preserving human oversight

  • Lead with collaborative service model combining autonomy systems with expert support

  • Focus on high-ROI use cases where service model can support end-to-end execution

2. Systems of Record

  • Proprietary datasets around vehicle performance, environmental conditions, and task completion

  • Fleet management and analytics capabilities that span multiple vehicle types/brands

  • Data-driven maintenance and operations optimization

3. Systems of Collaborative Intelligence

  • Coordination and resource planning across operators, vehicles, and robotic systems

  • Serve as integration layer for next-generation capabilities, whether built internally or via partners

  • Consider deeper integration (going beyond retrofitting) to increase system-level advantages


This follows the progression we expect to see Gecko take in the data-driven (and increasingly automated) inspection market and is being proven out now by off-road autonomy companies like Outrider, which has expanded from electric yard trucks using a patented robotic arm to a full suite of site infrastructure and logistics operations management systems. It is worth noting that we believe this same sequencing may not hold when selling to militaries who tend to be more concerned about vendor lock-in and thus less receptive to “operating system” style offerings. 

Still, we believe companies operating purely at the "action layer" will have limited long-term defensibility and will need to uplevel their capabilities over time. The path forward also likely includes hybrid models – as evidenced by Caterpillar and Teleo's approach of using remote operation as a bridge to full autonomy, allowing skilled operators to work from anywhere while systematically identifying repetitive tasks suitable for automation.

This progression allows companies to build trust through immediate value delivery while laying the foundation for deeper workflow transformation. The key is maintaining the flexibility to evolve alongside customer needs and technological capabilities rather than forcing premature standardization.

We are particularly interested in companies targeting:

  • Heavy industrial operations (construction, mining, and agriculture, depending on use case), where environmental variability is high but equipment standardization is low.

  • Military and defense logistics, which require operation across diverse terrain with mixed vehicle fleets.

  • Port and industrial yard operations, where dynamic routing and complex interactions between machines and humans are the norm.

This thesis faces two primary risks. First, a breakthrough in robotics foundation models could make the retrofit/incremental approach less compelling, though our discussions with leading robotics companies suggest they are not underwriting dramatic commercial-level breakthroughs on even a ~5-year horizon. Second, growing concerns about AI's impact on employment could spark regulatory pushback, though acute labor shortages in these industries create powerful countervailing forces.

Overall, we believe the combination of sensing, decision-making, and physical execution in high-diversity environments represents an attractive wedge to become industrial operating systems in several categories.

July

Thesis

Personal Security

The traditional concept of security, once firmly rooted in the domain of the state, is undergoing a significant transformation.

January 2025

Fundamental Consumer

The traditional concept of security, once firmly rooted in the domain of the state, is undergoing a significant transformation. Individuals are increasingly taking responsibility for their own safety and well-being, driven by a confluence of factors, including rising crime rates, the proliferation of cyber threats, and a growing awareness of the limitations of state-provided security in the digital domain. This shift is particularly evident in the digital realm, where the rise of sophisticated AI-powered scams and the increased abundance of personal data online (both shared knowingly and unknowingly) and its value have created a new era of individual responsibility. We believe that as individuals become more proactive in managing their own security, the personal security market is poised for significant growth, offering a wide range of opportunities for companies that can provide innovative and effective solutions.

This shift manifests in the proliferation of data breaches and spam calls, which have become a major concern for individuals and businesses alike. In 2023, approximately 56 million Americans lost money to phone scams, with total losses reaching an estimated $25.4 billion annually. These scams often involve impersonating loved ones or authority figures, leveraging highly personal information to solicit urgent financial assistance or sensitive information.

This is exacerbated by the fact that scams and misinformation campaigns will only become more sophisticated from here on as they leverage AI-powered voice cloning and deepfake technology. This kicks off what we often refer to as an evolutionary arms race between the deceiver and the detector. In this environment of heightened risk and uncertainty, individuals are taking a more proactive approach to their security.

Moreover, as societies become more polarized, personal information becomes easily accessible, and doxing becomes more prevalent, we expect this sense of perceived risk to spill over into the real world as well.

We believe that the opportunity can take various forms. From cutting-edge digital identity protection to counter-deepfake tools to physical home security platforms, personal security companies are leveraging technology to empower individuals and provide a sense of control over their safety and well-being.

July

Thesis

The Robotics Smiling Curve

Embodied AI reallocates value from hardware to intelligent foundation models and specialized vertical solutions, fueling leaps in productivity across complex tasks.

January 2025

Infrastructure Tech

Where will value flow as embodied AI takes off?

We are convinced that AI, deployed in robotics systems with the unconstrained ability to navigate and interact in the physical world, will be one of the biggest unlocks of productivity and abundance in our lifetime. The convergence of tumbling hardware costs, breakthroughs in AI, and mounting pressure for automation across critical industries has created an unprecedented opportunity for transformation in how physical tasks are performed.

What started 50+ years ago with the optimization of rote industrial tasks has evolved through distinct phases: first, the automation of controlled, repetitive workflows like warehouse pick-and-place operations, and now, the potential to handle end-to-end responsibilities in complex, multi-dimensional environments—from factory floors to healthcare facilities to homes.

This evolution comes at a critical juncture. Labor shortages in key industries, aging populations, and shifting supply chains in response to climate change and geopolitical pressures have created an urgent imperative for modernization. In industrial settings, where ROI drives decision-making, robotics solutions are already catalyzing larger automation budgets. In consumer settings, where emotional factors play a larger role, mounting evidence (e.g. Waymo adoption) suggests growing readiness for automation in everyday tasks.

As with any market opportunity, we are interested in understanding which technological and commercial capabilities are most scarce (and thus most valuable) and along with that, which parts of the value chain emerging companies are best positioned to win. 

Technological Tailwinds

The massive talent and capital flows into robotics over the past few years have been catalyzed by an unprecedented convergence of technological breakthroughs. This convergence is moving robotics from a hardware-centric paradigm (led by companies like ABB and FANUC) to one where intelligence and deep workflow integration capabilities drive market power.

At the core of this shift is the emergence of multi-modal foundation models that sit at the intersection of language understanding, vision perception, and spatial awareness. As DeepMind's Ted Xiao observed in his survey of 2023's breakthroughs, we're witnessing not just technological advancement but a philosophical transformation: "a fervent belief in the power of scaling up, of large diverse data sources, of the importance of generalization, of positive transfer and emergent capabilities."

This shift is backed by technological progress across several dimensions:

  1. Transformer architectures have opened entirely new possibilities for how robots process and act upon information from the physical world. Projects like Google's RT-X and RT-2 and TRI's work on General Navigation Models demonstrate the potential for end-to-end, general-purpose automation of dynamic physical interactions. These advances are particularly powerful in their ability to turn abstract concepts ("verbs") into context-specific actions – understanding, for instance, the crucial differences between opening a door and opening a phone.

  2. The hardware equation is rapidly shifting in favor of commoditization and widespread deployment. The emergence of cheaper, modular components across perception (cameras, radar, lidar), control (motors, actuators), and power systems is making the economics of cognitive robotics increasingly viable. Companies like Unitree are demonstrating how quickly hardware capabilities can advance when paired with improving intelligence layers. Perhaps more importantly, as these intelligence layers improve, robots can achieve more with simpler hardware configurations – a virtuous cycle that further improves deployment economics.

  3. Advances in computing infrastructure, both in cloud environments for heavy workloads and at the edge for real-world autonomy, have expanded the frontier of possible applications. This is complemented by breakthroughs in simulation, synthetic data generation, and cross-embodiment learning that promise to help robotics overcome its historical data scarcity challenges.

However, these tailwinds – and the ability for companies to defend technological advantages – are not evenly distributed across the value chain. For this reason, we believe the Smiling Curve is a useful framework for understanding where and how value will accrue in embodied AI.

In short, we see the most value flowing to i) foundation/world models that can generalize across tasks and embodiments and ii) specialized applications that can leverage these capabilities to solve high-value problems in complex domains. The traditional middle of the value chain – hardware manufacturing and systems integration – faces increasing pressure as intelligence becomes more important than mechanical sophistication. Similarly, data generation, labeling, and processing will also face downward pressure as big tech companies with ample access to data seek to drive commoditization to benefit other parts of their business (in robotics and beyond).

This creates two paths through which we believe emerging companies have the biggest advantage in sustainably creating value.

Robotics Foundation Models

Robotics foundation models have the potential to be the operating systems and action layer for the physical environment, transforming commodity hardware into real-world agents.

For RFM companies, we see “data gravity” as a key to success – the ability to create self-reinforcing loops where model improvements drive adoption, which in turn generates more valuable training data. Unlike language models, which could draw on the vast corpus of human-generated text on the internet, robotics models face a fundamental data scarcity challenge. Outside of self-driving vehicles, no one has accumulated the volume of real-world interaction data needed to train truly general models.

This scarcity creates a unique strategic opportunity. A company that can solve the data acquisition challenges through strategic partnerships and deployment models will build powerful network effects. As their models improve, they become more valuable to hardware partners and application developers, generating more deployment opportunities and thus more data – a virtuous cycle that becomes increasingly difficult to replicate.

Vertical Robotics: Deep Integration and Domain Expertise

At the other end of the curve, we see compelling opportunities for companies that can deeply embed robotics capabilities into important workflows in critical industries. These companies succeed not through general-purpose intelligence, but through their ability to solve complex, high-value problems. 

We believe vertical robotics approaches are most valuable where:

  • The workflows governing interactions between robotics and operational systems are highly complex

  • Social dynamics and regulatory requirements favor trusted brands with deep domain expertise

  • The cost of failure is high, creating strong incentives to work with specialists

  • Domain-specific data creates compounding advantages that are difficult for generalists to replicate

Companies like Gecko Robotics (July portfolio company) in industrial inspection exemplify this approach. Their competitive advantage stems not from robotics capabilities alone, but from the domain-specific meaning they extract from collected data. This creates a different kind of data moat – one built around understanding the nuances and edge cases of specific applications rather than general-purpose interaction. It also creates a wedge to expand deeper into a customer’s operations, both via increasingly intelligent workflow tools and more advanced robotics solutions. In addition to inspection, categories like defense & security and construction represent prime areas for vertical solutions to create value. 

Vertical robotics opportunities also force us to consider whether emerging companies or incumbents are best placed to succeed. Despite the massive amounts of capital invested in recent periods in logistics and warehouse robotics, outcompeting Amazon, which has famously externalized many of its cost centers into massive businesses to the detriment of venture-backed competitors, is a tall order. Likewise, consumer distribution and brand advantages held by companies like Amazon and Meta place most new companies at a significant disadvantage.

The Interplay Between RFMs and Vertical Solutions

We also believe there is significant potential for interaction between companies at the two ends of the curve; e.g. Gecko integrating a model from Physical Intelligence. Vertical solution providers can become valuable data partners for foundation model platforms, providing real-world interaction data from high-value use cases. Foundation model platforms, in turn, can help vertical solutions expand their capabilities without massive R&D investment in core robotics intelligence.

July

Thesis

Frontline Audio and Video

Next-generation platforms that combine AI-powered language understanding with advanced audio-video capture are set to revolutionize frontline work by transforming raw field data into trusted, industry-wide operating systems.

December 2024

Industry Transformation

Only a few years ago, while touring the new maintenance facility of a publicly traded aerospace company, an executive pointed out several innovations: automated tool checkout, more advanced safety equipment, and a (physical) file room located closer to the operating floor than ever before. That this last feature was included is telling. The feedback loops between frontline action and data input are central to the operations of many industries – manufacturing, policing, and trade services of all varieties (from plumbing to solar installation). Key elements like pricing estimates, project timing, and resource requirements are often functions of what workers are observing in the field or on the factory floor. 

Despite comprising the majority of the global workforce, frontline workers have been largely left behind by technological transformation. The inefficiencies are stark: law enforcement officers spend up to four hours per shift on documentation, with 96% reporting these demands keep them from core duties. In 2021, nearly three-quarters of frontline workers were still using paper forms. But workers are ready to adopt new solutions. In manufacturing, 93% of workers believe software tools help them perform better, and 96% would be willing to accept increased data monitoring in exchange for benefits like improved training and career development.

The convergence of several forces is creating an unprecedented opportunity to reshape frontline work and fundamentally change how operational knowledge is captured and leveraged. Advances in language understanding mean systems can now adapt to how workers naturally communicate, uncovering deeper context without forcing rigid input structures. Improved video processing and computer vision adds meaning to streaming footage, while ubiquitous mobile devices and sensors enable both active and passive capture (which also contributes to a safer – hands-free, eyes-up – working environment). The maturation of retrieval-augmented generation (RAG) technology makes it possible to connect this unstructured frontline data with existing knowledge bases – from maintenance manuals to captured tribal knowledge – creating powerful feedback loops between observation and action.

The winners in this space will build trust by solving acute pain points in documentation and training, then expand to become essential operating infrastructure for their target industries. We see distinct opportunities across market segments. For SMBs – independent trades (“America’s new millionaire class”), farms, medical practices – these solutions can function from day one as a sort of COO and assistant, both improving operations and increasing enterprise value by making tribal knowledge transferable in eventual exits. For larger companies with field forces, manufacturing operations, or driver fleets, these tools accelerate training, surface best practices, and build operational continuity.

In both cases, we believe frontline audio and video capture will serve as the data wedge to become the system of record and intelligence for entire operations. Winners will need vertical focus – the needs of a solar installer differ meaningfully from those of a manufacturer or farmer. Trust and deep industry understanding are critical, as these companies will increasingly look to serve as the automated action layer for their customers, with business models that reflect the value they create (i.e. outcome-based pricing). The platforms that successfully capture and leverage frontline insights won't just become systems of record for individual companies – they'll emerge as the operating systems for entire industries, fundamentally reshaping how skilled frontline work gets done.

July

Thesis

Precision Wellness

Better health outcomes—delivered at lower costs and with greater accessibility—are fundamental to economic growth and human flourishing.

December 2024

Fundamental Consumer

Better health outcomes—delivered at lower costs and with greater accessibility—are fundamental to economic growth and human flourishing. Preventative healthcare represents our largest lever to unlock better outcomes at scale. However, the centralized control, opaque incentives, and high friction that characterize today’s healthcare system hold back progress. It is not built with technological advancement in mind and fails to meet the standard of experiences consumers have elsewhere. 

As the prevailing model fails to evolve, a new paradigm—precision wellness—is emerging. This transformation mirrors the forces that transformed media, finance, and commerce by redistributing power over the experience to individuals: from top-down institutional mandate to bottom-up iteration, from one-size-fits-all solutions to hyper-personalization, from controlled to in control.

The wellness-driven consumer is at the center of this shift. Motivated by the same “divine discontent” that has continuously sparked consumer innovation across the economy, their demands for scientific rigor and an elevated user experience are accelerating the growth of the precision wellness opportunity. 

  • The next phase of GLP-1 adoption, perhaps the most important catalyst of this overall opportunity, appears increasingly driven by consumer-centric companies; 

  • The vast array of cheap, passive sensors integrated into phones, watches, and headphones creates longitudinal data that was previously unavailable, while clinical-grade modalities on consumer devices build trust in health-focused technology and reorient expectations toward continuous, rather than episodic, monitoring and intervention; 

  • The "mainstreaming of biohacking" is evident in the adoption of CGM among non-diabetics, the growth in advanced biomarker testing, whole genome testing, full-body MRIs, and the increasing demand for personalized, science-driven health optimization protocols.

As more people experience the feedback loops of better health and deeper health understanding – for themselves and those around them – their engagement compounds. This flywheel effect, combined with traditional healthcare's eroding monopoly on trust and access, creates a strong why now for emerging companies capable of integrating science, technology, and brand building. 

We also recognize that precision wellness has a significant blast radius effect, with aggregators, especially Apple, at the center. Data gravity, vast resources, and an incentive to commoditize complementary solutions make it unwise to compete directly. Thus, we are most interested in companies building non-device-centric models for distributed discovery, diagnostics, and delivery. This includes:

  • Next-gen healthcare providers integrating novel diagnostics and data collection into full-service care delivery (going beyond simply digitizing traditional models).

  • Knowledge networks (content + community + coaching) that use personalized insights to help guide users through specific niches of their precision wellness journey, creating a layer of trust in a consumer area that can be overwhelming due to a low signal-to-noise ratio.  

  • Companies using biological insights, often via at-home testing modalities, as a wedge to build up proprietary data sources, trusted brands, and communities.

Thesis

Energy Grid Data

I mean, data is in vogue now, and people are really kind of a bit obsessed with data and data companies.

November 2024

Industry Transformation

As we shift to more renewable energy production and electrify our consumption, the problems we must solve to modernize the grid become more complex. This need is further amplified by strong demand growth from the build-out of data centers for AI. High-quality data is crucial infrastructure for understanding the electric grid and for making the most impactful decisions when operating and investing in it. We believe there is substantial value in unlocking access to such data, from avoiding grid outages due to overload to increasing the ROI of maintenance and new investment decisions.

At the same time, there are substantial issues associated with accessing quality data on the U.S. power grid: 

Fragmentation
The grid is divided into regional entities, such as the Eastern, Western, and Texas Interconnections, managed by various utility companies, independent system operators (ISOs), and regional transmission organizations (RTOs). 

Lack of Standardization
This fragmentation leads to diverse data sources and inconsistent reporting practices, making it difficult to compile comprehensive, high-quality data.

Non-centralized energy sources
Additionally, the rise of distributed energy resources (DERs) like solar panels and electric vehicles adds complexity. Data on these resources is often fragmented and incomplete, complicating grid balancing and forecasting efforts​.

Privacy and security
Concerns in this area restrict access to detailed grid data, as releasing such information could expose vulnerabilities to potential threats.

While government agencies and NGOs have pursued several initiatives (e.g., NREL, IEA) to address the challenges above, none has brought the market to a point where easy, open data access is achieved.

Therefore, we see a unique opportunity in a dedicated effort to aggregate the various data sources and make them available in a standardized format via API and application. The availability of such data can be the underpinning for a wide array of new applications and use cases that require such data (e.g., applying Reinforcement Learning-based optimization to the grid) and can be substantially improved if such data exists. In short, we see an exciting opportunity for the company that can aggregate and maintain the highest quality grid data to be the nexus of an emerging ecosystem.
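The aggregation step described above can be sketched in a few lines. The following Python sketch is purely illustrative – every field name, feed format, and unit convention is an assumption, not a reference to any real ISO/RTO API – but it shows the core job: mapping heterogeneous per-region records into one standard schema.

```python
from dataclasses import dataclass

@dataclass
class GridReading:
    """Standardized record an aggregator could expose via API (illustrative schema)."""
    region: str          # e.g., an ISO/RTO identifier
    timestamp_utc: str   # ISO 8601 timestamp
    load_mw: float       # demand, normalized to megawatts

def normalize_feed_a(raw: dict) -> GridReading:
    # Hypothetical feed A: reports load in MW under the key "demand".
    return GridReading(region="RegionA",
                       timestamp_utc=raw["time"],
                       load_mw=float(raw["demand"]))

def normalize_feed_b(raw: dict) -> GridReading:
    # Hypothetical feed B: reports load in GW under the key "load_gw".
    return GridReading(region="RegionB",
                       timestamp_utc=raw["ts"],
                       load_mw=float(raw["load_gw"]) * 1000.0)

def aggregate(feeds):
    """Apply each feed's normalizer; feeds is an iterable of (normalizer, raw_record)."""
    return [normalize(raw) for normalize, raw in feeds]
```

The value in the thesis lies less in this mapping itself than in maintaining it across dozens of fragmented, inconsistently reported sources – which is exactly why a dedicated, standardized layer could become the nexus of an ecosystem.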

Thesis

Nature Intelligence

We have been inspired by the field of digital bioacoustics ever since being introduced to this field through Karen Bakker’s work.

November 2024

Infrastructure Tech

We have been inspired by the field of digital bioacoustics ever since being introduced to this field through Karen Bakker’s work. We believe there are a few factors that drive the emergence of this field. For one, sensors are becoming smaller and cheaper while edge processing and memory capabilities increase. The broadened availability of these sensors has led to an increase in domain-specific physical world data – a recurring theme in categories we get excited about – that can be coupled with complementary data sources. Coupled with algorithmic breakthroughs, this data can be used in a host of interesting cases: 

  • Biodiversity monitoring: We believe that biodiversity is a crucial cornerstone of a climate-resilient ecosystem and world. Tracking biodiversity in a cost-effective and accurate way has a clear ROI for a host of different stakeholders. Bioacoustics augmented with different data sources seems to be an attractive way to achieve this. We see an opportunity to create an objective standard around this kind of data that can be critical to unlocking the emerging commercial ecosystem.

  • Optionality in collecting novel Nature Data: As we collect more data about our ecosystems, we will see emergent use cases for this data. 

    • We see a world where enough data on ecosystems is collected so that we can predict the trajectory of an ecosystem and take the measures/actions to maintain it. Potentially, this could enable the fast regeneration or creation of novel and healthy ecosystems from scratch.

    • Building more sophisticated bioacoustic models can allow us to develop a more granular understanding of the natural world (e.g., tracking the healthiness of individual plants or animals vs. entire ecosystems), which will drive novel use cases in agriculture and beyond.

    • We have been excited about human-to-animal communication for a while and have been following the work that organizations like the Earth Species Project are doing. While concrete use cases will likely only emerge as we develop these models and understand their capabilities and limitations, proven applications such as directing bees and deterring elephants from entering farms already show promising signs of impact and ROI.

    • As followers of the Santa Fe Institute, we are convinced that interdisciplinarity in building complex systems is conducive to human advancement. Developing a deeper understanding of nature’s complex ecosystems to inspire our man-made systems in novel ways holds significant upside. This is the core thesis behind our investment in Sakana AI.

    • We see the potential for bioacoustic data to resonate with consumers. For example, users could listen and interact with ecosystems (e.g., their local forests).

We see an exciting opportunity in an orchestrated commercial effort to bring the research from recent years into the field and deepen our understanding of nature and the positive upside that comes with that.

Thesis

AI Movie Workflow Suite

AI video content creation will likely diverge into two paths.

November 2024

Industry Transformation

AI video content creation will likely diverge into two paths: high-quality productions that capture and create wider cultural moments, and lower-quality, personalized content. Consumers are expected to value both types, making tradeoffs between production quality and personalization based on their needs.

High-Quality AI-powered Content – We believe that world-class creative talent is attracted to tools and places that enable them to realize their creative ambitions. Given AI's economics and possibilities in the creative process, it will become an indispensable tool for the best creators. We appreciate that AI models today cannot, on a standalone basis, generate world-class content on par with Hollywood-grade productions. We believe that the foreseeable future will require holistic tools that enable outstanding creative talent to tell great stories with captivating visuals. Therefore, we see a unique opportunity to marry the capabilities of the most advanced AI models (across relevant layers) with an interoperable software and workflow suite.

We believe there is substantial economic value and optionality associated with successfully building out such a suite:

  • An AI-powered suite can wedge its way into a software market that has seen little innovation. As talent makes the availability of such solutions a key factor in deciding whom to work with (e.g., which studios), most major studios will likely have no choice but to adopt the solutions demanded. If played correctly, such an AI-enabled suite can replace existing tools and, over time, set new standards.

  • We see opportunities to selectively go end-to-end and enable the build-out of a full-stack AI-enabled movie studio/production company. 

  • We see substantial opportunities to expand into other mediums (e.g., gaming).

Low-Quality AI-powered Content – On the other side of the spectrum is lower-quality, highly personalized, rapidly produced content that can be generated by small creators and, ultimately, by the user (either actively or passively based on preferences). This will not require the dedicated workflows of large consumer aggregators (e.g., Netflix, Meta, YouTube) but will instead be captured by companies uniquely positioned to democratize easy access to video generation models, automated content aggregation, and distribution.

From a venture perspective, we are especially excited about the opportunity associated with the former but believe there will be large companies built in the latter where emerging companies can identify and engage high-value niches that fall outside the core focus of existing platforms (e.g. sports).

World View

Consumer AirSpace and Bubbles

There is a palpable sense that we are in for a major disruption of the way we currently spend our time and money.

October 2024

Fundamental Consumer

Working Thesis
There is a palpable sense that we are in for a major disruption of the way we currently spend our time and money. There are a few underlying trends (some of them might appear at odds with each other):

Consumers are increasingly living and consuming in two spaces that are drifting apart: 

Increasingly homogenous AirSpace
Globalisation and innovations in mass production and marketing gave rise to global consumer brands and the first wave of a globally flattened culture. The internet put this on steroids - the same memes, music, and clothes are available almost instantly everywhere. The experience economy, initially a backlash against this homogenisation, has been commoditised. Uber from the airport to your similarly designed Airbnb, whether in Miami, Mumbai or Marrakesh. Scale wins, and to achieve that scale you have to work with social media and search engine algorithms, which tend to surface the most mainstream goods and content (because it is the least risky and most profitable), thereby reinforcing that mainstream for consumers. The same is happening in film, where studios are increasingly focusing on mainstream features. We use the term AirSpace, coined by Kyle Chayka, for this phenomenon of increasing homogeneity.

We expect the emergence of generative AI to further reinforce the unification of mainstream content. By definition, these algorithms probabilistically create the type of content they are expected to develop based on their training data. As the cost of creating generative content comes down, this will create massive amounts of predictable content that fits squarely into AirSpace and lacks the unexpected. 

Increasingly Heterogenous Personalized Bubble
At the other end of the spectrum, there is a strong trend towards individualised content consumption. Due to the abundance of on-demand content (e.g. Spotify, Netflix), there is a shift towards consuming content on demand and in a highly personalised way. While there are benefits to this type of content consumption, it also makes the content that each of us consumes predictable, as our individual consumption preferences are understood and reinforced by recommendation algorithms. 

As a result, our shared cultural fabric, which is an important medium through which we connect with each other, is being eroded. For example, in its final season in the late 90s, Seinfeld was consistently the number one show on television, averaging 22 million viewers per episode, who watched the episode simultaneously and discussed it in the office the next day. In 2023, the most watched show was Suits, which premiered in 2011 and had its final season in 2019 - we saw it come up in zero conversations in 2023.

We expect this to increase as AI-generated content becomes increasingly viable. We see a not-too-distant future where content across all media and potentially all levels of quality is created for an audience of N of 1, highly tailored to each individual's preferences. 


What we believe to be true about the human psychology and sociology
People like trends and the comfort they bring. So AirSpace is not bad and will continue to exist. However, there is likely to be little room for innovation; large aggregators exist (e.g. Meta, Google, Airbnb) and will continue to monetise this in the best possible way.

Humans like to consume the content they enjoy, and that reinforces their bubble. The more personal, the better. Hence, the Personalized Bubble is not bad. We expect this to get much weirder from here as application developers and consumers lean into AI-powered use cases. Character AI was chasing this opportunity, but a team of former Google researchers was unlikely to embrace the weirdness. 

People like to consume authentic, unique things. However, much online content lacks authenticity/quality/care and is predictable. Gen AI is the straw that breaks the camel's back as the cost of content creation trends towards zero (or the cost of computing). 

As a result, there has been a noticeable shift in how large parts of our digital lives are moving either to group chats (which can act as a curation layer for the noise) or back to IRL in the case of dating (e.g. running clubs in NY or supermarkets in Spain). We also see this shift playing out beyond content and relationships. We believe that people have an innate desire to consume goods that others have put care into and that are unique. As this type of content becomes less present/prominent online (e.g., due to Gen AI), we expect to see a big shift towards people consuming physical goods and experiences that have this artisanal aspect, are unique or ephemeral, such as pottery, handmade clothing, leather goods, live concerts, etc. This is great for brands like Hermes, which have kept craft at the heart of their leather business. It's also great for live performing artists (and their ecosystem), local artisans, etc. 

Humans crave shared cultural experiences. Unexpected and rooted in whatever shared cultural fabric is left, these experiences must shatter the confirmatory AirSpace and transcend our personalized Bubbles. Achieving this in a repeatable fashion requires a deep understanding of the Zeitgeist and the ability to turn this inside out in unexpected ways that deeply resonate with a society's (or sub-groups) shared cultural fabric. 

Opportunity Areas
Substantial innovation will occur in the context of AI-enabled personalized experiences. We are excited about this opportunity and are looking for companies exploring envelope-pushing form factors and ideas that are borderline fringe today.

As the AirSpace and the Bubbles continue drifting apart, becoming more homogeneous on the one hand and more heterogeneous on the other, there will be substantial value in creating these types of experiences in a repeatable fashion. Studios like MSCHF and A24 have done a great job of this.

Thesis

Intelligence-Enabled Marketplace

We see an exciting opportunity for AI-enabled marketplaces to emerge.

October 2024

Infrastructure Tech

Working Thesis

We see an exciting opportunity for AI-enabled marketplaces to emerge. While there are many opportunities for AI to enhance marketplaces (see NfX's good write-up on the topic), we are especially interested in situations where AI-enabled processes are in a reinforcing interplay with data advantages, leading to a sustained higher value proposition (i.e., better matching) in the marketplace (see graph below).

As outlined above, there are two interconnected feedback loops at play: 

  1. Using LLMs and VLMs to collect the right proprietary data at scale (i.e., conduct interviews, ingest large documents, understand client requirements using document upload, etc.).

  2. Use fine-tuned LLMs/VLMs + other ML models to understand demand and supply better, identify actions that reduce uncertainty around matching probability (e.g., follow-up questions), and carry out these actions in service of enabling more cost-effective/higher-value matching.
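To make the interplay of the two loops concrete, here is a deliberately simplified Python sketch. All names are illustrative, and a toy keyword-overlap score stands in for the fine-tuned LLM/VLM matching described above. When the model is uncertain, loop 2 emits a follow-up question whose answer feeds back into loop 1 as new proprietary data.

```python
def match_score(candidate_skills: set, job_requirements: set) -> float:
    """Toy stand-in for a learned matching model: fraction of
    requirements covered by the candidate profile."""
    if not job_requirements:
        return 0.0
    return len(candidate_skills & job_requirements) / len(job_requirements)

def next_action(candidate_skills, job_requirements, threshold=0.5):
    """Loop 2: either propose a match, or pick an uncertainty-reducing
    action (a follow-up question about a missing skill) whose answer
    becomes new proprietary data for loop 1."""
    score = match_score(set(candidate_skills), set(job_requirements))
    missing = set(job_requirements) - set(candidate_skills)
    if score >= threshold or not missing:
        return ("propose_match", score)
    # Ask about the first missing requirement (deterministic for the sketch).
    return ("ask_followup", sorted(missing)[0])
```

In a real system, both the scoring and the choice of follow-up would be model-driven and trained on match outcomes; the point here is only the control flow – data collection and matching improving each other.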

We expect businesses creating sustained value in this space to meet the following criteria:

  1. LLMs, VLMs, and other models can perform tasks to an acceptable degree (i.e., they meet a bare minimum threshold) – both on the proprietary data collection and matching side.

  2. Large amounts of unstructured data and feedback loops are useful for fine-tuning models that directly unlock economic value.

  3. Nobody has collected data relevant for training/finetuning these models at scale as there has been no economic/technological incentive to do so.   

  4. There are ways to create highly frictionless form factors using (1) that allow users to interact with these platforms seamlessly and in highly personalized ways to collect large amounts of data.

  5. Initial data and model advantages can be sustained and turned into lasting moats with little risk of second movers and other market participants (e.g., incumbents with large distribution) being able to catch up. 

We see opportunities in various areas, from HR to travel to healthcare-provider (e.g., psychologist) matching – especially in scenarios where a lack of information leads to low matching rates. A few cases:

Case Study 1: Staffing

Staffing is historically incredibly time-consuming, requiring a deep understanding of both a candidate’s capabilities and the job requirements. This is very hard to scale, as quality assessment usually requires 1) reviewing materials, 2) conducting interviews to dig deeper and reviewing them, and 3) feedback cycles to understand what type of candidates the demand side actually wants (stated vs. revealed preferences). As a result, many staffing marketplaces either do a bad job of vetting or are very expensive, with matching rates reflecting this.

Let’s go through the criteria set up above to see whether a hiring marketplace is a good fit to become intelligent:

  1. LLMs can already review and synthesize vast amounts of unstructured data (e.g., CVs, websites). They are capable of doing the same with job requirements. They are also capable of performing job interviews to a somewhat satisfactory level. 

  2. Models and AI interviews can be fine-tuned based on desirable outcomes (e.g., matching of demand and supply), thereby adjusting their reviewing and interview capabilities. This can even happen in a customized way, given that certain parties on the demand side are large enough to guarantee a certain “offtake.” Mercor has written about this on their blog.

  3. This part is less clear in the context of staffing. For one, there is a plethora of existing and new AI-enabled hiring tools that use AI-supported video (e.g., HireVue), and existing staffing platforms (e.g., Upwork) are rolling out video interviews, too. It is unclear to what extent these platforms have large amounts of unstructured data combined with hiring-match outcomes that they can use to train better models. Also, by sheer scale and distribution, these platforms should be able to generate plenty of data easily.

  4. In the segments of the economy where jobs are sought after, people are eager for the opportunity to be in the talent pool considered for specific jobs. In these cases, people are willing to share their data (CVs, etc.) and conduct AI interviews – especially if the process is smooth. Given that the demand side (i.e., the companies looking to hire from the talent pool) is reasonably attractive, the CAC associated with acquiring the supply and its data (video interviews, CVs, etc.) should be fairly low.

    As described above, while we don’t expect AI-based matchmaking to be perfect yet, we believe AI can already support increasingly efficient matching, enabling the development of a cash-flow-generating business model while data is collected and models improve.

  5. Given the dynamics described under 3, it is unclear whether an HR marketplace with an initial data advantage can sustain this advantage. What if existing platforms like Upwork roll out AI-based video interviews and start training their models? With their existing brand and supply, they should be able to generate more data than any startup substantially faster, leading to better models, etc. If not, what is a relevant quantity of data to establish a platform as the winner? Will general LLMs acquire the capabilities of finetuned models as they get better and context windows improve?

Thesis

Sustainable Construction

Construction is one of the world’s largest industries.

September 2024

Industry Transformation

Construction is one of the world’s largest industries. Global construction spending in 2023 amounted to some $13 trillion, or 7% of global gross output. It is also one of the most unproductive sectors of the economy. Tight labor markets, regulatory complexity, and systemic fragmentation, along with cultural inertia, have contributed to stagnation and a lack of technological penetration.

This ineffectiveness does not discriminate by project size or scope. While nearly everything we touch and consume is produced in mass quantities, factory-produced homes still make up a small percentage of the overall new housing stock. Meanwhile, 98% of mega-projects experience cost overruns of 30% or more, and 77% face delays exceeding 40%. The impacts on broader economic growth are significant. Had construction productivity matched that of manufacturing over the past 20 years, the world would be $1.6 trillion – 2% of GDP – richer each year. Increasing pressure to decarbonize places additional stress on the low-margin, change-resistant industry. Through both operations (28%) and materials/inputs (11%), buildings account for nearly 40% of global emissions.

These supply-side deficiencies come against a backdrop of rapidly expanding demand – by 2040, the industry needs to expand production capacity by 70%+. This is creating a desperate, and long overdue, search for answers that we believe can only be met by a combination of technological innovation and novel production and business system design. 

While prior attempts to transform construction – most notably Katerra – have failed, several factors are converging to create a more compelling why now moment. Novel materials like green steel and low-carbon cement are approaching commercial viability, while mass timber innovations make building faster and less wasteful – while delivering significant carbon sequestration. Construction robotics focused on autonomous assembly, logistics, and data capture can address the labor gap. Perhaps most importantly, advances in generative design and AI-powered collaboration tools can help target the small but critical coordination inefficiencies that have historically bottlenecked progress – precisely the type of system-wide improvements that Amdahl's Law suggests are essential for meaningful transformation.

We believe the companies that capitalize on this moment will do so through one of two models. The first is selective vertical integration – controlling critical capabilities in materials, design, and manufacturing, but executed with greater focus and patience than previous attempts. The second is a platform approach that centralizes key material and system design and standardizes interfaces between stakeholders while allowing specialized players to focus on their core competencies – similar to how semiconductor manufacturing evolved.

Both models recognize three essential elements that must work together: First, standardized approaches to next-generation materials that maximize both assembly efficiency and carbon benefits, from green steel to mass timber. Second, digital infrastructure that enables true system-wide optimization and seamless stakeholder coordination. Third, controlled manufacturing environments that bring automotive-style productivity to strategic components, whether owned directly or orchestrated through a network of partners.

November 2024
Infrastructure Tech

Thesis

Composable Companies

A new organizational model is emerging: the composable company – organizations that blend permanent infrastructure with fluid product capabilities. At their core, they maintain:

  • Capital and resource allocation expertise

  • Shared technical infrastructure

  • World-class talent

  • Possibly, strategic customer and distribution relationships

By centralizing these unique capabilities, composable companies can swiftly identify, validate, and scale opportunities across their chosen markets. Around this foundation, teams can be rapidly assembled and reconfigured to pursue specific missions and product opportunities on varying time scales.

This model excels in markets where opportunity spaces are in flux and an organization needs the flexibility and bandwidth to build out ideas that compound around a coherent view of the future but might find their manifestation in distinct products for distinct customers.

Recent developments in AI further enhance this model's viability by enabling more cost-effective creation of software and supporting customization for specific use cases:

  • Reducing software development costs

  • Streamlining maintenance requirements

  • Improving customer support efficiency

  • Enabling more cost-effective creation of AI tools

The Resulting Structure

The end product could be a holding company-style enterprise that combines:

  • The above-described core infrastructure

  • Multiple AI products and tools with varying scale and durability

This structure enables the efficient pursuit of numerous opportunities while maintaining the potential for asymmetric returns from breakthrough successes among them or in aggregate.

