The AGI-le Investor
12 September 2023·3 min read

Meta's Llama and the Open-Source AI Disruption

Open Source AI · Meta · AI Models · Investment Strategy
LN Sadani

Chief Executive Officer, Lensbridge Capital

When Meta released Llama 2 in July 2023 — a family of large language models released under a license permitting free commercial use for most organisations — it triggered a debate about the economics of AI that has not been fully resolved. The question is straightforward: if capable AI models are freely available, where does the value in the AI industry actually accrue? The answer has significant implications for investors who have built positions in AI companies on the assumption that proprietary model capabilities are a durable competitive advantage.

The open-source AI movement predates Llama 2 — models such as GPT-J, BLOOM, and the original Llama had been available to researchers for some time. What Llama 2 changed was the quality threshold. For the first time, an openly available model was capable enough to be deployed in commercial applications without extensive fine-tuning or modification. A startup or enterprise could now build an AI-powered product on a free foundation model rather than paying for API access to OpenAI or Anthropic. The competitive moat of the frontier model providers had been, if not eliminated, substantially narrowed.

The investment implications flow in two directions. For application-layer AI companies — those building products on top of foundation models — the open-source moment is a double-edged sword. It reduces the cost of building AI products, which is good for margins. But it also erodes the differentiation that came from access to a superior proprietary model, which is bad for competitive positioning. The companies that will win in this environment are those that have built proprietary data assets, distribution advantages, or workflow integrations that do not depend on the underlying model.

For infrastructure investors, the open-source moment is unambiguously positive. Open-source models require the same compute infrastructure as proprietary models — in some cases more, because fine-tuning and customisation require additional training runs. The democratisation of AI capability through open-source models expands the universe of organisations that can deploy AI, which expands the demand for the infrastructure that supports it. At Lensbridge, the Llama 2 release reinforced our conviction that the infrastructure layer is the most durable part of the AI value chain.