
AI: Do companies really need GPUs?

With all the AI hype, do companies really need GPUs or could other processors work as well?

More than one industry commentator has dubbed it the “AI arms race”: the frenzy of competition to introduce ever larger, more capable generative artificial intelligence (AI) models able to produce text, images or video from user prompts on almost any topic.

All major software vendors have introduced their own version of generative AI, albeit mostly of the Large Language Model (LLM) type. The most famous, however, remains the pioneer product in the space, ChatGPT from OpenAI, first launched in 2022.

Implications for GPU chips

Hardware vendors, particularly chip manufacturers, have also been active, largely because these algorithms need vast numbers of chips to crunch the mountains of data required for their training.

Big chip vendors such as NVIDIA are touting GPUs as the best way to get the most out of AI applications. This has dramatically affected the price and availability of GPUs, a situation aggravated by the substantial orders placed by large, influential players, who are often given precedence.


Global competition between players such as Google, Microsoft, Amazon, Baidu and Alibaba has dramatically increased demand for GPUs. Smaller players (relatively speaking), or corporates interested in testing these systems, may find it impossible to secure the GPUs they need.

A partnership with a professional independent distributor can become an indispensable asset, allowing you to identify available stocks of genuine GPUs wherever they may be in the world, and at the best conditions.

Business case

While investment bankers continue to drive the hype, it is not clear to every industry how to make a business case. Deploying AI requires significant investment in software, hardware, employee training and time. The path to generating a clear return on that investment, and a meaningful competitive advantage, is often quite fuzzy.

Moreover, as industry expert and author David Linthicum points out in this article on InfoWorld, and contrary to the prevailing chorus, CPUs can often be more than sufficient for the vast majority of corporate AI use cases.

Considering that CPUs are currently not the flavour of the month in the hype cycle, it may make more sense to stock up on them for your specific corporate AI tests or use cases.
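
To illustrate the point, here is a minimal sketch of CPU-based text generation, assuming PyTorch and the Hugging Face transformers library are installed; the model named (distilgpt2) is purely an illustrative choice, not a recommendation. The script uses a GPU if one happens to be present, but for small models and modest request volumes it runs perfectly well on an ordinary CPU.

```python
# Minimal sketch: text generation that runs on a CPU, using a GPU only if available.
# Assumes PyTorch and the Hugging Face "transformers" package are installed.
import torch
from transformers import pipeline

# In the transformers pipeline API, device=-1 selects the CPU; 0 is the first GPU.
device = 0 if torch.cuda.is_available() else -1

generator = pipeline(
    "text-generation",
    model="distilgpt2",  # small model chosen purely for illustration
    device=device,
)

result = generator("Our quarterly sales figures show", max_new_tokens=40)
print(result[0]["generated_text"])
```

For many internal pilots and proof-of-concept tests of this kind, the extra cost and scarcity of a GPU may simply not be justified.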

For more specialized types of AI applications, companies could also consider field-programmable gate arrays (FPGAs) or associative processing units (APUs).

Requirements analysis

In conclusion, as with any major investment, companies should complete a careful and detailed requirements analysis of their intended AI business use case, including at the processing level, rather than simply following the hype being generated and echoed around the web.

Once you have identified your processing requirements, whether those are for GPUs, CPUs, FPGAs, or APUs, contact us with your request and we will find you the best conditions available — as our customers can testify.

Slowing down to do this analysis will save you time and money and improve your ROI!
