In the early to mid-2000s, Amazon popularized the cloud computing infrastructure model, selling its own underutilized capacity to customers who wanted to take advantage of the then-nascent technology of running applications on remote virtual machines.

At around the same time, applications that are now de facto productivity tools, such as Google Docs and Google Sheets, also came to market. Businesses discovered that they could run their operations more efficiently and cost-effectively this way: resources could be realigned toward operating expenses rather than buying expensive on-premises hardware and buying or renting rack space.

Fast forward to today: cloud computing is one of the fastest-growing technology industries. Projected to surpass a trillion dollars in market value within five years, the global cloud computing industry supports everything from individual use of document processing and communication tools to large-scale enterprise workloads. And today, what industry is more significant than artificial intelligence (AI) and all its derivatives? Thanks to the popularity of ChatGPT and contemporaries like Microsoft's Copilot and Google's Gemini, both business and individual users now enjoy the capabilities of large language models (LLMs) for number-crunching, analysis, and even content generation.

There is, however, a potential challenge that comes with the meteoric growth of AI. All that training, modeling, and data processing means it is an industry that is increasingly hungry for computing power.

A potential shortage in hardware

Chipmaker Nvidia is today considered to be the premier provider of AI-focused processing chips. With the growth in demand for its H100 graphics chips and software, which power training for AI models, Nvidia has grown to be the fifth-most valuable company in the world, with a market value of over $1 trillion. Nvidia Chief Executive Officer Jensen Huang himself declared that “demand is surging worldwide” as generative AI hits a “tipping point.”

In the Southeast Asia market, Singapore is already emerging as a top provider of AI infrastructure, and the city-state reportedly accounted for 15 percent of Nvidia’s revenue in the third quarter of its 2024 fiscal year, amounting to $2.7 billion. “We are in the first year of a ten-year period of intelligent data center transformation,” said Huang, who underscored Singapore’s regional importance to the industry during a tour across four Asian countries.

The rise of AI does not come without potential challenges, however. Analysts say Nvidia controls at least 80 percent of the global AI chip market. With demand set to skyrocket, AI startups are scrambling to obtain computing and GPU power for use on their respective platforms, which might result in a supply crunch. OpenAI, which runs ChatGPT, is reportedly buying Nvidia H100 chips by the tens of thousands.

Tech companies, meanwhile, typically acquire access to AI chips through cloud computing providers like Amazon, Google, and Microsoft, but they are now facing roadblocks that could mean year-long waits before they can gain access to such computing power.

Nvidia itself is set to launch its even more powerful H200 AI-oriented GPUs later in 2024. But against a backdrop of potential roadblocks and delays, where can the AI industry turn? The answer might be found in another market where GPUs are of strategic importance: gaming.

A global network of computing resources

The gaming community comprises around 3.09 billion video game players worldwide, and Asia alone is home to around half of this population. There are also around 1.06 billion gaming PCs, a largely untapped pool of resources: these computers lie idle for most of the day when their owners are not playing resource-intensive games, according to Andrew Faridani, Chief Marketing Officer of GAIMIN.

“Gamers only typically use their PCs for gaming for 4 hours a day, leaving the PC switched on, and therefore unused for around 20 hours per day,” says Faridani, whose company focuses on leveraging these resources into what is essentially a distributed infrastructure for processing power. “In addition, if a player has a very highly performing device, even when gaming, their devices are not used to their full capability, providing spare processing power for GAIMIN to use.”

The platform can potentially utilize billions of gaming PCs, with their powerful GPUs, across the globe to provide processing resources that can power today’s hungry AI applications.

“GAIMIN’s DePIN, or decentralized processing infrastructure network, supports the handling of data processing jobs directly from clients, or it can operate as a scalable extension to other powering service providers. If their demand exceeds capacity, they can turn to GAIMIN for ad hoc, extensible powering as and when required,” Faridani adds.
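The overflow model described above can be sketched in a few lines of code. The following is a purely illustrative toy, not GAIMIN's actual system: it assumes a primary provider with a fixed capacity budget and a pool of idle gaming PCs that absorb jobs once that budget runs out. All names and numbers are hypothetical.

```python
class OverflowRouter:
    """Toy sketch of capacity-overflow routing: jobs run on a primary
    cloud provider until its capacity is exhausted, then spill over to
    a decentralized pool of idle gaming PCs (round-robin)."""

    def __init__(self, primary_capacity, depin_nodes):
        self.primary_free = primary_capacity   # GPU-hours left on the primary cloud
        self.depin_nodes = list(depin_nodes)   # idle gaming PCs in the pool
        self.next_node = 0                     # round-robin cursor into the pool

    def route(self, job_cost):
        """Return where a job of `job_cost` GPU-hours is placed."""
        if self.primary_free >= job_cost:
            self.primary_free -= job_cost
            return "primary-cloud"
        # Primary capacity exhausted: hand the job to the next idle PC.
        node = self.depin_nodes[self.next_node % len(self.depin_nodes)]
        self.next_node += 1
        return node


router = OverflowRouter(primary_capacity=10, depin_nodes=["gamer-pc-1", "gamer-pc-2"])
# Four jobs of 4 GPU-hours each: the first two fit on the primary cloud,
# the rest overflow to the gaming-PC pool.
placements = [router.route(4) for _ in range(4)]
```

In practice a real network would also handle node availability, pricing, verification of results, and job scheduling fairness, none of which this sketch attempts.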

Taking full advantage of resources and assets

Given the surge in demand for AI-focused computing, AI startups and businesses have to find solutions fast. A good question to ask is why gamers and other computer owners are not already offering these resources. First, the availability of cloud computing and cloud services has made us take online networks for granted. Second, there is the question of utility and benefit: what use would it be for gamers to offer up their computing power to other people or businesses?

In terms of benefits, participation must be a win-win, and this can be addressed by giving gamers who share their resources an opportunity to earn passively from their hardware’s contribution to the network. For example, GAIMIN owns and runs the Gaimin Gladiators esports organization, whose 10 teams play in Tier 1 esports tournaments globally.

Digital currency generated by participating in the GAIMIN cloud network can be redeemed for exclusive content related to its esports teams, and it can also be staked or traded on digital asset markets. It can likewise be used on the company’s own gaming platform, which not only enables participants to earn cryptocurrency from their otherwise idle computing resources but also promotes the mainstream appeal of esports.

Perhaps now is the best time for businesses and users alike to get involved in the esports space. While still considered nascent in Southeast Asia, esports is a rapidly growing industry and community, with the ASEAN region alone expected to generate around $79.7 million in revenue this year, growing at an 8.02 percent CAGR to $108.5 million by 2028.

Analysts expect growth from the creation of new teams, tournaments, and companies in this space, resulting in wider opportunities for sponsorships, advertisements, and collaborations.

Thus, what the esports community can contribute to AI through a shared computing economy is promising for businesses, game developers, and the gaming community alike.

Toward sustainability in AI applications

The global gaming community potentially holds the key to sustaining powerful AI applications. By harnessing the vast, untapped resources of gaming PCs worldwide, potentially billions of users can help meet both the short-term and long-term needs of AI developers, whose products rely heavily on processing power.

This model not only addresses the urgent demand for AI processing but also empowers gamers and other end users to earn tangible rewards in digital currency, redeemable for attractive esports content.

As AI continues to transform industries, this innovative collaboration between gaming communities and tech startups offers a scalable, efficient, and ethical solution. The AI revolution can remain powerful and sustainable, fueled by the collective resources of the passionate gaming world.

TNGlobal INSIDER publishes contributions relevant to entrepreneurship and innovation. You may submit your own original or published contributions subject to editorial discretion.

Image credit: Pixabay