Welcome back! Amazon Web Services and Google Cloud are arch-rivals in the cloud computing market, so it was surprising to hear them announce Monday that they're working to make it easier for their joint customers to move data and develop applications that run on both clouds.

Tech companies are constantly issuing announcements about working together, but this one is different. For years, AWS executives dismissed the trend of companies using multiple cloud providers and insisted customers could get everything they needed on AWS. In a modest concession five years ago, AWS released a tool that helped its customers manage computing jobs running on Google Cloud and Microsoft Azure.

What happened Monday was a full-blown reversal. AWS now seems entirely comfortable with the idea of customers using other clouds. In addition to working with Google Cloud, AWS says it also plans to work with Microsoft on a similar connection, known as a networking interconnect between data centers, to its Azure cloud next year. That's welcome news for AWS customers, who currently don't have simple options for also accessing Gemini and OpenAI models through rival clouds.

This is no accident. While AWS is still far and away the king of the cloud market, its status as an AI also-ran may have prompted it to make this type of concession. That's in line with what we reported Monday about how some AWS sellers had continued to pitch Anthropic's models for complex AI projects because AWS' own Nova AI models haven't proven to be a strong enough alternative.

Yes, the arrangement has a number of mutual benefits, such as allowing AWS or Google Cloud to serve as a backstop for customers' applications if the other provider suffers an outage, like the major one that recently hit AWS. And the companies say the networking bridge will save customers the time and expense of piecing together cross-cloud connections on their own.
But another reason AWS is working with Google Cloud this way is that many AWS customers are skipping Amazon's Nova models and instead using Google's Gemini and OpenAI's full range of models, neither of which is available on AWS. As we reported this summer, some AWS customers were using Gemini for wide-ranging tasks and decreasing their spending on Bedrock, AWS' service for using AI models. Some customers are even moving data they store in AWS to Google Cloud in order to use the Gemini models, according to a former AWS manager. If this were to continue, it could erode a competitive advantage AWS gets from being its customers' primary data storage cloud, the person said. The well-received launch of Gemini 3 last week could theoretically increase the likelihood of such a scenario.

The networking bridge addresses this issue by letting AWS customers use Gemini models through a secure connection without having to move their data to Google Cloud, the former AWS manager said.

It's hard to say how the AWS-Google Cloud relationship will evolve over time, but the networking bridge could pave the way for Gemini models to someday run on AWS, perhaps on the Bedrock AI app-building service. Amazon has long prided itself on offering customers the widest range of product choices, so it would make sense for AWS to bite the bullet at some point and make it easier for its customers to access Gemini models. That's assuming Google would want it too.

AWS' Full-Court Press on AI at re:Invent

Amazon Web Services, at its annual re:Invent customer conference, is going all out to show that it is not behind in AI, unveiling new in-house AI models and launching new cloud servers powered by its latest custom AI chip, Trainium3. The big reveal was Amazon Nova 2 Omni, a new AI model that can process text, speech, images and video, confirming our report from Monday.
AWS CEO Matt Garman said Omni could take the audio, images and video from his keynote and produce a detailed summary of all the information presented, a task that would typically require using multiple AI models. Omni looks like a step toward an AI model that can handle a wide range of tasks, which could help AWS better compete with Google's Gemini and OpenAI's GPT-5.1 models. Then again, AWS also looked poised to be more competitive with these firms after launching its Nova line of models at last year's re:Invent, and that hasn't happened, as we covered in our story from Monday.

AWS also announced that its first cloud servers powered by the new Trainium3 chip are available to customers. But aside from Decart, a startup developing large AI models, AWS didn't name any other Trainium3 customers, though it did say customers are using the new chip for computing jobs on Bedrock.

OpenAI Steps Into the AI Roll-Up Game

We've written at length about how generative AI's promise of automating the operations of traditional businesses is catnip to private equity funds. The prospect of reducing labor costs at services businesses such as accounting firms is so appealing that everyone from VC fund General Catalyst to former Microsoft deals chief Chris Young has been dipping a toe into the field, often referred to as AI roll-ups.

Now OpenAI is entering the market. It's taking a stake in Thrive Holdings, an investment firm founded by venture capital firm Thrive to buy accounting and IT companies and make them more efficient with AI. OpenAI researchers will work with engineers at Thrive Holdings to "deeply integrate AI into the businesses that we own and operate," the companies said Monday. In the near term, the arrangement could help OpenAI expand its base of corporate customers as it competes fiercely with Anthropic for recurring subscription revenue from businesses.
It could also help OpenAI learn from a set of customers it essentially owns, giving it fertile ground to figure out what works, and what doesn't, in enterprise AI as it looks to woo other corporate customers. The move also fits a growing trend we've seen in AI, where developers are increasingly willing to get hands-on with customers to make sure their automation projects actually work as intended. OpenAI has been doing as much with large customers such as the Pentagon, while Anthropic has similarly been flying its engineers out to help customers such as Cox Automotive get their AI pilots off the ground.