Qualcomm gains as market reacts to new AI accelerator chips
Qualcomm Snapdragon microprocessor for AI. Image via Qualcomm.



The company introduced two chips, the AI200 and AI250, which it plans to release in 2026 and 2027

Shares of Qualcomm Inc. (NASDAQ: QCOM) surged 15 per cent Monday after the company announced new artificial intelligence accelerator chips, a major step into the data center market long dominated by Nvidia Corp. (NASDAQ: NVDA).

The move positions Qualcomm as a new challenger in one of the fastest-growing areas of technology, where demand for AI hardware continues to explode.

The company introduced two chips, the AI200 and AI250, which it plans to release in 2026 and 2027. Both can operate in a full, liquid-cooled server rack—matching the format offered by Nvidia and Advanced Micro Devices Inc. (NASDAQ: AMD). Each rack can hold as many as 72 chips that work together as a single supercomputer. AI research labs need that kind of computing power to run and refine large language models such as ChatGPT.

Qualcomm’s new data center chips build on technology from its smartphone processors, specifically its Hexagon neural processing units, or NPUs. The company said that by proving its AI technology in mobile and edge computing first, it gained the foundation to expand into large-scale systems. Durga Malladi, Qualcomm’s general manager for data center and edge products, said the firm wanted to strengthen its technology base before scaling up to the data center level.

The expansion marks Qualcomm’s entry into the heart of the AI race.

Data center equipment has become the backbone of artificial intelligence, powering everything from chatbots to autonomous vehicles. McKinsey & Co. estimates that global spending on data centers will reach nearly US$6.7 trillion by 2030, with most of that focused on AI-driven systems.


Qualcomm chips focus on running trained AI models

For now, Nvidia controls more than 90 per cent of the AI chip market, thanks to its powerful GPUs that train and run advanced models. Those chips helped push Nvidia’s market capitalization beyond $4.5 trillion, making it one of the most valuable companies in the world. Nvidia’s GPUs were instrumental in developing OpenAI’s GPT models, which power ChatGPT.

However, some major players have started exploring alternatives. Earlier this month, OpenAI said it planned to purchase chips from AMD and could take a stake in the company. Additionally, tech giants like Alphabet (NASDAQ: GOOGL), Amazon (NASDAQ: AMZN) and Microsoft Corporation (NASDAQ: MSFT) are designing their own AI accelerators to reduce dependence on Nvidia.

Qualcomm is targeting a different segment of the market. Its chips focus on inference, which means running AI models after they are trained, rather than training them from scratch. Training requires far more power and data, while inference emphasizes efficiency and cost savings. Qualcomm said its systems will be cheaper for large cloud providers to operate. Each rack will also draw around 160 kilowatts, a power level similar to Nvidia's highest-end GPU racks.

Furthermore, the company said it will offer flexibility to customers who prefer custom solutions. It will sell both complete rack systems and individual components such as CPUs and AI cards. Malladi added that other chipmakers, including Nvidia and AMD, could buy Qualcomm’s parts for their own data center setups.

“What we’ve done is design a system that customers can fully adopt or mix with their own,” Malladi explained.


Qualcomm to supply AI chips to Saudi Arabia's Humain

Qualcomm declined to share pricing details or say how many NPUs each rack can hold. However, the company claimed its AI systems will offer advantages in power efficiency, total cost of ownership, and memory design. Each AI card will support 768 gigabytes of memory, which surpasses the capacities of current Nvidia and AMD cards.

In May, Qualcomm announced a deal with Saudi Arabia’s Humain to supply AI inferencing chips for regional data centers. Humain has committed to deploying up to 200 megawatts of Qualcomm-powered systems—enough to fill multiple large facilities.

The new chips represent a strategic pivot for Qualcomm, which has traditionally focused on mobile and wireless connectivity. Entering the data center market will allow it to compete directly in the infrastructure behind generative AI.

With competitors like Nvidia, AMD, and major cloud providers all racing to capture the same market, Qualcomm’s success will depend on whether it can deliver comparable performance at lower operating costs. If it does, it could give cloud operators and AI developers a long-awaited alternative in an industry that has so far revolved around a single dominant supplier.

