Bret Taylor stated that companies attempting to train their own large language models (LLMs) will likely incur excessive costs. He advised against this approach, recommending that companies build services rather than new models unless they work at a major firm like OpenAI, Anthropic, Google, or Meta. He argued that the capital required prevents smaller companies from entering the market, leading to consolidation. Notably, some international firms, such as DeepSeek, have managed to create competitive models at lower cost.
"There's ones you can lease, there's open source ones. Don't do it," Taylor said, emphasizing the impracticality of developing new AI models unless one belongs to a major player in the industry.
"Unless you work at OpenAI or Anthropic or Google or Meta, you’re probably not building one of those," Taylor asserted, underlining the capital intensity of training frontier AI models.
Training new AI models is a "good way to burn through millions of dollars," according to Bret Taylor, highlighting the financial risks for startups in this space.
Taylor noted that these high costs have prevented an independent data center market from emerging, making it difficult for smaller companies to compete in AI model development.