It doesn’t take a genius to understand that artificial intelligence (AI) is transforming industries at an unprecedented rate. According to IDC, global spending on AI is expected to reach $632 billion by 2028, with generative AI (GenAI) growing at a remarkable annual rate of 59.2 percent. Yet, as AI capabilities surge, the infrastructure needed to support them is straining under the weight. This is impacting how quickly organizations can benefit from AI.
As McKinsey recently revealed, North American and Asian companies are eager to adopt AI: 76 percent of North American companies and 70 percent of Asian companies have already begun their AI transformations. However, to maintain their edge, leaders must keep pushing that transformation forward, McKinsey advises. As of now, fewer than 10 percent of Asian organizations have found a way to drive value from multiple GenAI use cases; those that do are likely to build a competitive advantage.
AI’s Potential in Malaysia
In Malaysia, the National AI Roadmap 2021-2025, drawing on McKinsey's data, projects that the country's baseline economic growth rate of around 4.4 percent could be boosted by a further 1.2 percent through the strategic integration of AI technologies. This highlights AI's potential to significantly elevate the nation's economic trajectory.
The Infrastructure Challenge
GenAI is driving much of AI’s growth, but it also requires immense computing power, vast data storage, and advanced algorithms. This has a significant impact on energy consumption, costs, sustainability, and performance. Traditional infrastructures are ill-suited to support these demands, so progress must go hand in hand with infrastructure modernization. Without it, investments in AI will not deliver their full value.
Spending on AI infrastructure, which includes hardware such as servers and cloud infrastructure to support AI applications, is substantial but growing at a slower pace than GenAI adoption. According to IDC research, AI infrastructure will see a 14.7 percent compound annual growth rate (CAGR) through 2028, reflecting earlier investments by cloud service providers. AI hardware and Infrastructure-as-a-Service (IaaS) represent about 24 percent of overall AI spending, underscoring their importance in enabling AI capabilities. While GenAI garners much attention, AI infrastructure spending remains critical for supporting broader AI growth and applications.
Security and Compliance Capabilities as Standard
AI models process vast amounts of data, so ensuring data security and maintaining compliance with regulatory standards are essential throughout the entire process of deploying AI solutions. Secure infrastructure that includes encryption, robust access controls, and compliance with global data protection regulations (such as GDPR) is crucial to safeguard both the models themselves and the data they process.
AI infrastructure must be designed not only for performance and scalability but also for security. Failure to secure AI applications or their supporting infrastructure can result in data breaches, regulatory fines, and loss of customer trust. Once trust is lost, it is almost impossible to regain.
Cloud-Native as a Foundation for AI Transformation
To meet AI’s growing demands, businesses must adopt cloud-native infrastructure, which includes powerful computing, high-performance networks and storage, container systems, and data management tools. Cloud-native infrastructure provides the flexibility and scalability needed to support AI’s increasing computational and storage requirements. Traditional infrastructures struggle to handle the massive data flows and high-performance needs of modern AI applications. However, cloud-native architecture allows businesses to rapidly scale their infrastructure to accommodate fluctuating demands, ensuring they have the computing power necessary for GenAI models and other data-intensive AI processes.
Cloud-native environments also support the agility necessary for efficient deployment, management, and updating of AI applications. These platforms are designed to seamlessly integrate with AI development workflows, enabling businesses to innovate quickly without being hindered by infrastructure limitations.
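To make the scaling point concrete, the sketch below shows a target-utilization autoscaling policy of the kind cloud-native platforms apply to containerized AI services: when observed utilization drifts above or below a target, the number of running replicas is adjusted proportionally. It is a minimal illustration only; the function names, thresholds, and bounds are hypothetical and do not reflect any specific product's API.

```python
# Minimal sketch of a target-utilization autoscaling policy, similar in spirit
# to how cloud-native platforms scale containerized AI inference services.
# All names, thresholds, and bounds below are hypothetical illustrations.
import math

def desired_replicas(current_replicas: int,
                     observed_utilization: float,
                     target_utilization: float = 0.6,
                     min_replicas: int = 1,
                     max_replicas: int = 50) -> int:
    """Scale replica count so average utilization moves toward the target."""
    if observed_utilization <= 0:
        return min_replicas
    raw = current_replicas * (observed_utilization / target_utilization)
    return max(min_replicas, min(max_replicas, math.ceil(raw)))

# Example: 4 replicas running at 90 percent utilization scale out to 6.
print(desired_replicas(current_replicas=4, observed_utilization=0.9))
```

The same logic runs in reverse during quiet periods, which is what lets cloud-native deployments track fluctuating GenAI demand instead of sitting on idle capacity.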
Scalable, Reliable, and Cost-Efficient Infrastructure for Data Management
As AI use cases multiply, the need for scalable and cost-efficient cloud infrastructure for data management and analytics becomes increasingly critical. Scalable IaaS and Platform-as-a-Service (PaaS) offerings ensure that data can be stored, processed, and accessed seamlessly, enabling faster and more accurate model training. Efficient data pipelines, robust storage solutions, and streamlined retrieval systems are essential for managing the large volumes of data required for model training. Innovative infrastructure also allows for customization and fine-tuning of models for specific use cases, improving the quality and relevance of AI applications.
Reliable infrastructure is also crucial for maintaining consistent user experiences. Downtime and crashes can erode user trust and disrupt operations. A solid infrastructure minimizes these risks, ensuring high availability and uptime.
Efficient AI infrastructure not only supports performance but also helps manage costs. By optimizing computing resources through distributed systems, containerization, and serverless architectures, businesses can avoid overspending on cloud or hardware resources. Cost efficiency is vital for scaling GenAI applications while staying within budget.
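The back-of-the-envelope comparison below illustrates why this matters: provisioning for peak demand around the clock costs far more than scaling capacity with actual usage. The instance price and demand profile are hypothetical assumptions chosen only to show the arithmetic, not benchmark data.

```python
# Illustrative comparison of fixed provisioning versus demand-based (elastic)
# provisioning for a bursty GenAI workload. All figures are assumptions.
HOURS_PER_MONTH = 730
COST_PER_INSTANCE_HOUR = 3.0   # assumed GPU instance price, USD

# Assumed demand profile: 20 instances needed during 200 peak hours, 4 otherwise.
peak_hours = 200
offpeak_hours = HOURS_PER_MONTH - peak_hours
peak_instances, offpeak_instances = 20, 4

# Fixed provisioning must cover the peak around the clock.
fixed_cost = peak_instances * HOURS_PER_MONTH * COST_PER_INSTANCE_HOUR

# Elastic provisioning pays only for what each period actually uses.
elastic_cost = (peak_instances * peak_hours +
                offpeak_instances * offpeak_hours) * COST_PER_INSTANCE_HOUR

print(f"Fixed:   ${fixed_cost:,.0f}/month")
print(f"Elastic: ${elastic_cost:,.0f}/month")
print(f"Savings: {100 * (1 - elastic_cost / fixed_cost):.0f}%")
```

Under these assumed numbers the elastic approach cuts the monthly bill by more than half; the exact savings will vary, but the direction of the argument holds whenever demand is uneven.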
Energy Efficiency and Sustainability Increasingly Key
As AI workloads grow, so do energy consumption and costs. AI models, particularly GenAI, are power-hungry, raising concerns about their environmental impact. Businesses increasingly recognize the need for energy-efficient infrastructure to support AI initiatives while keeping carbon footprints in check. Green data centers, renewable energy sources, and energy-efficient hardware are becoming essential components of AI infrastructure strategies.
Optimizing power consumption and investing in sustainable practices can reduce operational costs and help businesses meet sustainability goals. As AI adoption accelerates globally, energy-efficient infrastructure will become a key differentiator for businesses aligning innovation with corporate social responsibility and cost management.
As AI continues to evolve, businesses must address current infrastructure challenges and anticipate future shifts in the AI landscape. This includes focusing on security and regulatory compliance, as well as technical and sustainability needs. The convergence of real-time decision-making, augmented working environments, and the rising demand for sustainability means businesses must be proactive in their infrastructure strategies.
The risk of falling behind is real, but so is the opportunity to lead in this transformative era of AI. The question is no longer whether to invest in cloud infrastructure modernization but how quickly organizations can make the leap to stay competitive.
Kun Huang is the General Manager of Malaysia, Alibaba Cloud Intelligence.
TNGlobal INSIDER publishes contributions relevant to entrepreneurship and innovation. You may submit your own original or published contributions subject to editorial discretion.