Bing Chat is so hungry for GPUs that Microsoft will lease them from Oracle

Apparently the demand for Microsoft’s AI services is so high, or Redmond’s resources so tight, that the software giant plans to offload some of the machine learning models used by Bing Search onto Oracle’s GPU superclusters, as part of a multi-year deal announced Tuesday.

“Our partnership with Oracle and the use of Oracle Cloud Infrastructure, along with Microsoft Azure AI infrastructure, will increase reach for customers and improve the speed of many of our search results,” explained Divya Kumar, head of Microsoft’s search and AI marketing team, in a statement.

The partnership basically boils down to this: Microsoft needs more computing resources to keep up with the “explosive growth” of its AI services, and Oracle has tens of thousands of Nvidia A100 and H100 GPUs available for lease. Presumably the database giant co-founded by Larry Ellison doesn’t have enough cloud customers of its own to consume its stockpile of silicon.

Microsoft was one of the first to integrate an AI chatbot into its search engine with the launch of Bing Chat in February. You all know the drill by now: you can enter prompts, requests, or queries into Bing Chat, and it will try to dig up information, write bad poetry, generate pictures and other content, etc.

The large language models that underpin the service require massive clusters of GPUs not only to train, but also to run at scale, a process known as inference. It is this inference work that the Oracle GPUs are helping with.
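To make the training-versus-inference distinction concrete, here is a minimal, illustrative inference sketch using the open source Hugging Face transformers library. This is not Bing Chat’s actual serving stack, and the placeholder model is tiny; production chat models are orders of magnitude larger and are typically sharded across many GPUs behind a request-routing layer.

```python
# Minimal inference sketch: load a (placeholder) causal language model and
# generate a reply to a prompt. With device_map="auto" the library places the
# model's weights on whatever GPUs are visible, falling back to CPU otherwise.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; real chat services use far larger models

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Why do large language models need GPUs to run at scale?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Training adjusts the model’s weights and is a one-off (if enormous) job; inference like the above has to happen for every user query, which is why serving a popular chatbot eats so many accelerators.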

The latest collaboration between the two cloud providers leverages Oracle Interconnect for Microsoft Azure, which allows services running in Azure to interact with resources in Oracle Cloud Infrastructure (OCI). The two cloud giants previously used the service to allow customers to connect workloads running in Azure to OCI databases.

In this case, Microsoft is using the interconnect alongside its Azure Kubernetes Service to orchestrate Oracle’s GPU nodes and keep up with what it says is demand for Bing’s AI features.
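Kubernetes treats GPUs as schedulable resources, so orchestrating GPU nodes largely boils down to pods asking for them. The sketch below, using the open source Kubernetes Python client, shows the general idea of requesting GPUs for an inference workload; the image name, node label, and GPU count are hypothetical, and Microsoft has not described its actual AKS configuration.

```python
# Illustrative only: submit a pod that asks the scheduler for GPUs.
# Assumes the NVIDIA device plugin is installed, which is what exposes
# the "nvidia.com/gpu" resource on GPU nodes.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running inside a cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-inference-worker"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        # Hypothetical label steering the pod onto an A100/H100 node pool.
        node_selector={"gpu-pool": "a100"},
        containers=[
            client.V1Container(
                name="inference",
                image="registry.example.com/llm-inference:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "8"}  # ask for eight GPUs on one node
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```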

In October 2023, Bing had a 3.1 percent share of the global web search market across all platforms, compared with Google’s 91.6 percent, though that was up from 3 percent the previous month, according to StatCounter. On desktop, Bing climbed to 9.1 percent, and on tablets to 4.6 percent.

Maybe StatCounter’s figures are off, and Microsoft’s chatty search engine is more popular than those numbers suggest. Maybe Microsoft just wants to make Bing look like it’s in high demand. Maybe Redmond really does need the extra compute.

Oracle claims that its cloud superclusters, which Bing will likely use, can scale to 32,768 Nvidia A100 GPUs or 16,384 H100 GPUs linked by remote direct memory access (RDMA) networking. The clusters are backed by petabytes of high-performance clustered file storage designed to support highly parallel applications.

Microsoft hasn’t said how many Oracle GPU nodes it needs for its AI services and applications, and it won’t say. “These are not details we are sharing as part of this announcement,” a spokesperson told us. We’ve also reached out to Oracle for more information and will let you know if we hear back.

This is not the first time the two rivals have leaned on each other for help. In September, Oracle announced that it would place its database systems in Microsoft Azure datacenters. At the time, the goal of that collaboration was to reduce the latency involved in connecting Oracle databases running in OCI to workloads in Azure.
