How much of the booming generative AI market will fall into the clutches of the biggest tech companies? And what will be left for the many other companies hoping to cash in on the tech world’s latest craze?
It may be less than five months since ChatGPT’s launch, but those questions already loom large as the biggest tech companies vie to stake out big chunks of the field for themselves.
Amazon is the latest to set out its generative AI stall, through its cloud computing arm, Amazon Web Services. Alongside its own big AI model, called Titan, AWS said last week it would offer access to several others on its platform, including large language models from the AI start-up Anthropic and the open-source Stable Diffusion image generation system.
Hosting and delivering such independent AI services is part of AWS’s effort to position its cloud at the center of the next-generation AI market. AWS also supplies the tools developers need to build, train and deploy their own generative AI models, and designs its own specialized chips for both training and running large machine-learning systems.
It is not alone. This month, Google claimed that supercomputers built with the latest generation of its own chips, known as TPUs, had achieved significant levels of performance in training large AI models. Microsoft has also joined the stampede among the biggest tech companies to develop its own specialized AI chips, according to a senior figure (its plans were first reported by The Information).
Moves like these show just how far big tech companies are going in their efforts to control all parts of AI’s new computing “stack” — that is, the layers of technology needed to support demanding new computing workloads and turn them into useful services for customers.
At the bottom of this stack are the chips designed to process the vast amounts of data needed to train large AI models. Other layers include the algorithms and other software needed to train and deploy the systems; the language and vision models themselves, widely known as “foundation models” because they serve as a base layer of intelligence; and finally, the many applications and services that run on top of these models to shape the technology for specific markets and uses.
Amazon, Microsoft and Google are already staking claims to most of the lower levels of this technology hierarchy, making it difficult for others to break into a market that requires large-scale operations to achieve the lowest unit costs.
Even Elon Musk, who claims his nascent AI company will be a “third force” in AI against Google and the Microsoft/OpenAI partnership, faces a steep uphill climb. Tesla, his electric car company, has already built an AI computer to handle vision recognition. This week, the irrepressible Musk claimed that selling this technology to others could one day be worth “hundreds of billions”. But in the world of big language and image models, it won’t be easy to catch up with tech giants that have already spent years fine-tuning their technology.
The question now is how far up the “stack” the cloud companies will try to reach to claim more of the new technology’s value for themselves.
For those that do not already have them, control of their own large AI models (or, in Microsoft’s case, a closer alliance with OpenAI) seems an obvious goal. Foundation models cost significant sums to develop and can be put to work across a wide range of applications, making them a natural first step for any big tech company with AI ambitions.
The centrality of these large models to their broader strategic goals means the companies are unlikely to view them as profit centers in their own right. Emad Mostaque, head of Stability AI, the company behind Stable Diffusion, certainly sees it that way. He warns of a “race to the bottom” in pricing as big tech companies battle to establish their core AI systems, leaving little room for anyone else.
Instead, Mostaque is counting on two things. One is that Amazon will always be happy to host rival AI models in its cloud rather than trying to replace them with its own. The other is that there will still be room for differentiation between AI models, and that not all customers will want to trust vast, opaque systems run by a handful of big tech companies. If he is wrong, the early, competitive phase of generative AI may prove short-lived.