Artificial Intelligence (AI) Energy Consumption Is Jumping at a Scary Pace: 2 Stocks That Could Surge Thanks to This Trend


These two companies are set to solve two major problems arising out of the rapid adoption of AI.

The proliferation of artificial intelligence (AI) has increased demand for more powerful chips deployed in data centers, both to train complex large language models (LLMs) and to move those models into production through AI inference.

However, clustering together multiple powerful chips that consume a lot of electricity and generate a lot of heat also means that data centers now have two new challenges to tackle. The first is to find a way to reduce electricity consumption. Market research firm IDC anticipates that energy consumption in AI data centers is set to increase at an incredible compound annual growth rate of 45% through 2027.

The firm predicts that overall data center electricity consumption could more than double between 2023 and 2028. Meanwhile, Goldman Sachs forecasts that data center power demand could grow 160% by 2030, indicating that data center operators will have to shell out a lot of money on electricity.
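For a sense of scale, here’s a quick back-of-the-envelope sketch in Python of what those growth rates would imply. The 2023 baseline indexed to 100 and the four-year compounding window for IDC’s figure are illustrative assumptions, not numbers from IDC or Goldman Sachs.

```python
# Back-of-the-envelope sketch of what the cited growth rates imply.
# Assumption: a 2023 baseline indexed to 100 and a four-year window for IDC's CAGR.
baseline_2023 = 100.0

ai_dc_2027 = baseline_2023 * 1.45 ** 4          # IDC: ~45% CAGR -> roughly 4.4x 2023 levels
overall_dc_2028 = baseline_2023 * 2.0           # IDC: overall consumption more than doubles
power_demand_2030 = baseline_2023 * (1 + 1.60)  # Goldman Sachs: ~160% growth -> ~2.6x

print(f"AI data centers, 2027: ~{ai_dc_2027:.0f} (index, 2023 = 100)")
print(f"All data centers, 2028: at least {overall_dc_2028:.0f}")
print(f"Data center power demand, 2030: ~{power_demand_2030:.0f}")
```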

The second problem AI data centers are creating is higher heat generation. When multiple chips with high power draw are packed into AI server racks, they inevitably produce a lot of heat. Not surprisingly, there are concerns that AI data centers could have a negative impact on the climate and put more pressure on the electrical grid.

However, two companies are looking to solve these challenges: Nvidia (NVDA) and Super Micro Computer (SMCI). Let’s check how their products could see a nice jump in adoption as data centers tackle the problems of rising heat generation and electricity consumption.

1. Nvidia

Nvidia’s graphics processing units (GPUs) have been the chips of choice for AI training and inference. This is evident from the company’s 85%-plus share of the AI chip market. Nvidia’s chips have been deployed for training popular AI models such as OpenAI’s ChatGPT and Meta Platforms’ Llama, and cloud service providers have been increasingly looking to get their hands on the company’s offerings to train even larger models.

One reason is that Nvidia’s AI chips are getting more powerful with each passing generation. For instance, the chip giant points out that its upcoming Blackwell AI processors allow organizations “to build and run real-time generative AI on trillion-parameter large language models at up to 25x less cost and energy consumption than its predecessor.”

More importantly, this remarkable reduction in energy consumption is accompanied by up to a 30-fold increase in performance. So AI models can not only be trained and deployed at a much faster pace using Nvidia’s chips, but also with much less energy. For example, Nvidia points out that Blackwell-based systems can train an LLM like OpenAI’s GPT-4 while consuming roughly 3 gigawatt-hours of energy, compared with the roughly 5,500 gigawatt-hours that would have been required a decade ago.
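For perspective, those figures would imply a reduction of well over a thousandfold. The short sketch below simply divides the two numbers, assuming the comparison refers to total training energy measured in gigawatt-hours rather than instantaneous power:

```python
# Rough illustration of Nvidia's claimed Blackwell training figures.
# Assumption: the 3 and 5,500 figures refer to gigawatt-hours of training energy.
energy_decade_ago_gwh = 5_500  # claimed requirement for a GPT-4-class model a decade ago
energy_blackwell_gwh = 3       # claimed requirement with Blackwell-based systems

reduction = energy_decade_ago_gwh / energy_blackwell_gwh
print(f"Implied reduction in training energy: ~{reduction:,.0f}x")  # roughly 1,833x
```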

As such, it won’t be surprising to see Nvidia sustain its lead in the market for AI chips, as its processors are likely to remain in high demand thanks to these cost and performance advantages. That’s why analysts at Japanese investment bank Mizuho are forecasting that Nvidia’s revenue will surpass $200 billion in 2027 (most of which would fall within its fiscal year 2028).

That would be more than triple the company’s fiscal 2024 revenue of $61 billion. More importantly, Mizuho’s forecast suggests that Nvidia could comfortably surpass Wall Street’s estimate of $178 billion in revenue for fiscal 2026 along the way. As a result, Nvidia stock’s impressive surge looks sustainable, which is why investors would do well to buy it while it still trades at a relatively attractive valuation.
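The arithmetic behind those statements is straightforward; here’s a quick check using only the figures cited above:

```python
# Quick check of the revenue figures cited above (all in billions of dollars).
fy2024_revenue = 61.0    # Nvidia's fiscal 2024 revenue
mizuho_forecast = 200.0  # Mizuho's forecast for 2027
street_fy2026 = 178.0    # Wall Street's fiscal 2026 estimate

print(f"Multiple of fiscal 2024 revenue: ~{mizuho_forecast / fy2024_revenue:.1f}x")           # ~3.3x
print(f"Gap over the $178B fiscal 2026 estimate: ~{mizuho_forecast / street_fy2026 - 1:.0%}")  # ~12%
```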

2. Super Micro Computer

Server manufacturer Supermicro has received a lot of negative press of late. From a bearish report by short-seller Hindenburg Research alleging financial irregularities to a probe by the Department of Justice reported by The Wall Street Journal, investors have been panic-selling Supermicro stock. News that the company delayed filing its annual 10-K report seems to have added to the bearishness.

However, investors should note that Hindenburg’s allegations may well be biased, since the short-seller stands to profit if Supermicro’s stock falls, and it remains to be seen whether its claims hold up. Additionally, the Justice Department has not confirmed that it is indeed probing Supermicro. Of course, Supermicro does have a history of “improper accounting,” which is probably why investors have been panicking.

At the same time, nothing has been proven yet, nor is it certain that a Justice Department probe is underway. What’s worth noting, though, is that Supermicro has been addressing the issue of higher heat generation in AI data centers with its liquid-cooled server solutions.

The stock popped significantly on Oct. 7 after the company announced that it had shipped over 2,000 liquid-cooled server racks since June. Additionally, Supermicro points out that more than 100,000 GPUs are set to be deployed with its liquid-cooling solutions each quarter. The company claims that its direct liquid-cooled server solutions can deliver up to 40% energy savings and 80% space savings, which probably explains why its server racks are seeing solid demand.

Even better, Supermicro management pointed out last year that it can deliver 5,000 liquid-cooled server racks per month, and it won’t be surprising to see its capacity utilization heading higher as data center operators look to reduce costs and energy consumption. After all, Supermicro says that the potential “40% power reduction allows you to deploy more AI servers in a fixed power envelope to increase computing power and decrease LLM time to train, which are critical for these large CSPs and AI factories.”
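The “fixed power envelope” point is easy to see with a little arithmetic. In the sketch below, the 10-megawatt power budget and 100-kilowatt per-rack draw are hypothetical round numbers; only the roughly 40% power reduction comes from Supermicro’s claim.

```python
# Sketch of the "fixed power envelope" argument: if liquid cooling cuts power
# per rack by ~40%, the same power budget accommodates more server racks.
# The 10 MW budget and 100 kW per-rack draw are hypothetical round numbers.
power_budget_kw = 10_000.0
air_cooled_rack_kw = 100.0
liquid_cooled_rack_kw = air_cooled_rack_kw * (1 - 0.40)  # ~40% claimed power reduction

air_cooled_racks = power_budget_kw / air_cooled_rack_kw        # 100 racks
liquid_cooled_racks = power_budget_kw / liquid_cooled_rack_kw  # ~167 racks
print(f"Air-cooled racks: {air_cooled_racks:.0f}")
print(f"Liquid-cooled racks: {liquid_cooled_racks:.0f} (~{liquid_cooled_racks / air_cooled_racks:.2f}x)")
```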

Meanwhile, overall demand for liquid-cooled data centers is forecast to grow at an annual rate of over 24% through 2033, generating annual revenue of almost $40 billion that year, up from $4.45 billion last year. Supermicro has already been growing at an impressive pace, and this new opportunity, driven by higher heat generation and electricity consumption in data centers, could give it an additional boost.
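Those forecast numbers hang together: compounding the cited base at roughly 24% a year for a decade lands close to the $40 billion mark. A quick check, assuming 2023 is the base year:

```python
# Sanity check of the liquid-cooling market forecast, assuming a 2023 base year
# and the low end of the "over 24%" growth rate.
base_year, base_revenue_b = 2023, 4.45
growth_rate = 0.24

projected_2033_b = base_revenue_b * (1 + growth_rate) ** (2033 - base_year)
print(f"Projected 2033 revenue: ~${projected_2033_b:.1f} billion")  # ~$38 billion, near the cited ~$40 billion
```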

Of course, investors would be looking for more clarity about the company’s operations following the recent developments, but one shouldn’t forget that Supermicro’s earnings are forecast to increase at an annual rate of 62% for the next five years. So, this AI stock should be on the radar of investors looking to make the most of the opportunity presented by the AI-related challenges discussed in this article.

Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Goldman Sachs Group, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.


