China-based AI vendor Baidu introduced two new AI chips on Thursday.
Baidu announced that it will launch the M100 chip in 2026 and the M300 chip in 2027. The M100 is designed for large-scale inference, while the M300 is designed for both large-scale model training and inference.
The introduction of the new chips comes months after China's Cyberspace Administration told local tech companies to stop using Nvidia chips. It also comes amid ongoing tensions between China and the U.S. over trade and AI.
Chinese Chip Market
The AI chips released by Baidu will change the Chinese chip market, said Lian Jye Su, an analyst at Omdia, a division of Informa TechTarget. Until now, Nvidia AI chips have dominated in China, as they have globally.
“Traditionally, Chinese enterprises are very familiar with Nvidia’s products,” Su said, adding that some Chinese enterprises are also using chips from Chinese tech giant Huawei. “The market needs a strong alternative, and Baidu can offer that alternative.”
However, despite being an established AI vendor, Baidu has not yet succeeded in bringing its chips to market in the generative AI era.
“Baidu may not be as strong as Huawei, but they are pretty decent,” Su said. “They have a lot of existing connections and contacts with large enterprises in China.”
He added that this network enables Baidu to sell to numerous enterprises and gives the company influence in the chip market.
Moreover, the vendor is one of a relatively small group of companies that have generative AI models and capabilities, as well as a legacy in traditional AI and machine learning. Other vendors that offer similar capabilities include Google and Huawei, Su added.
“There are not many companies that have the end-to-end full set of technology, and if you couple that with a chipset, it actually gives a pretty good solution to the enterprise,” Su said.
A New Trend
While Baidu’s AI chip launch highlights Chinese-U.S. tensions and the need for Chinese vendors to be independent, it also underscores how AI vendors are becoming more aggressive in launching their own AI chips.
One vendor that is becoming more aggressive about its chip strategy is Microsoft. During a recent podcast interview, Microsoft CEO Satya Nadella said the cloud giant can license the intellectual property for OpenAI's AI chip, which OpenAI plans to design in collaboration with Broadcom. Microsoft intends first to adopt OpenAI's designs and then extend them for its own purposes. Microsoft also has its own Azure Maia AI Accelerator, which it launched in 2023.
“It does appear that this year is the year where you have hyperscalers start to … be very vocal about what they’re doing” in the AI infrastructure arena, Su said. For vendors in the U.S., the strategy of investing in their own chips stems from a need to diversify their chipsets to reduce their reliance on Nvidia chips, Su said.
However, for Chinese vendors, aggressiveness on the chip front has more to do with the chip shortage in the Chinese market and the intention of the Chinese government to promote more internal AI chipsets, he added.