Poland – 10/19/2023: In this photo illustration, the Anthropic logo is displayed on a smartphone. (Photo: SOPA Images/LightRocket via Getty Images)
The world’s largest technology companies are now those involved in some form of artificial intelligence, whether that means building models to deliver products and services, designing hardware, running web services, or working across several of these areas.
Nvidia is the largest company by market capitalization, but the leading model providers are becoming household names. OpenAI comes out on top, and Anthropic is in many ways the runner-up.
One of the items discussed on a recent episode of the AI Daily Brief podcast with Nathaniel Whittemore was the announcement that Anthropic has raised $2 billion in new funding, bringing its overall valuation to $60 billion.
This made both Amodei siblings (Dario and Daniela), along with five other executives at the company, billionaires. So who is funding Anthropic at this level?
Big backers on board, such as Amazon
By most accounts, the biggest contributor is Amazon, which has integrated Anthropic’s Claude model into some of its web services.
Here is how Amazon’s own materials describe fine-tuning Anthropic’s Claude 3 Haiku:
“Unleash the power of fine-tuning Anthropic’s Claude 3 Haiku models using Amazon Bedrock. This comprehensive demo guides you through everything from accessing Amazon Bedrock to customizing Claude 3 Haiku to suit your business needs. Discover how to increase the accuracy, quality, and consistency of your models by encoding your company and domain knowledge, producing higher-quality results and experiences for your own users and improving performance on domain-specific tasks. Don’t miss this opportunity to start fine-tuning now and unlock the full potential of Claude 3 Haiku.”
The co-branding here is obvious.
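For readers curious about what that integration looks like in practice, here is a minimal sketch of calling Claude 3 Haiku through Amazon Bedrock with the boto3 SDK. This is my own illustration, not Amazon’s demo; it assumes AWS credentials and Bedrock model access are already configured, and the region and prompt are placeholders.

    # Minimal sketch: invoking Anthropic's Claude 3 Haiku via Amazon Bedrock (boto3).
    # Assumes AWS credentials and Bedrock model access are already set up;
    # the region and prompt below are illustrative placeholders.
    import json
    import boto3

    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    request_body = {
        "anthropic_version": "bedrock-2023-05-31",  # version string for Anthropic models on Bedrock
        "max_tokens": 256,
        "messages": [
            {"role": "user",
             "content": [{"type": "text", "text": "Summarize our Q3 support tickets."}]}
        ],
    }

    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # Claude 3 Haiku on Bedrock
        body=json.dumps(request_body),
    )

    result = json.loads(response["body"].read())
    print(result["content"][0]["text"])  # the model's text reply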
Other backers are also participating, including Google and, according to a new report, Lightspeed Venture Partners. In some ways it makes sense for a partner like Amazon to fund the company this way.
However, there is some speculation about the overall story behind this deal.
Running out of cash? What is the runway?
A recent opinion piece in MarketWatch discusses how both OpenAI and Anthropic are spending significant amounts of money on computing and other web services.
Therese Poletti reports that OpenAI is projected to lose $5 billion in 2024, and that the company appears to be losing money even on ChatGPT subscriptions because subscribers consume so much computing power.
She points out that Anthropic is following suit and increasing its use of AWS capacity.
Plummeting inference costs
Both OpenAI and Anthropic are spending heavily, but some industry insiders predict that new kinds of networks will significantly reduce the cost of ChatGPT sessions and other types of AI usage.
They point to trends that follow the outline of Moore’s Law, under which computing costs have fallen as hardware has gotten smaller. Citing various scaling laws, these experts suggest that the cost of tokens will drop significantly, partly enabled by multi-agent AI, in which individual AI components work together to deliver distinct skill sets.
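To make the shape of that argument concrete, here is a purely illustrative back-of-the-envelope calculation. The starting price and halving period are hypothetical assumptions, not figures from the article or from any provider.

    # Purely illustrative: exponential decline in per-token cost, loosely analogous to Moore's Law.
    # The starting price and halving period are hypothetical assumptions, not reported figures.
    start_cost_per_million = 1.00   # dollars per million tokens today (assumed)
    halving_period_years = 1.0      # assumed time for the cost to halve

    for years in range(4):
        cost = start_cost_per_million * 0.5 ** (years / halving_period_years)
        print(f"Year {years}: ~${cost:.2f} per million tokens")

Under those assumed numbers, a workload that costs $1.00 per million tokens today would cost roughly $0.25 per million tokens two years out, which is the kind of curve these predictions rely on.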
That brings us back to Ethan Mollick’s bullish predictions about what robots and AI entities will be able to do in just a few years: everything from changing diapers and comforting the sick to preparing meals and designing greeting cards.
These are my examples, not his, but there is a strong argument that technology will be faster, cheaper, and better in 2025.
The Anthropic news is another waypoint for a burgeoning industry that becomes more important with each passing month.