The third GMIF Innovation Summit (GMIF2024) was recently held at the Renaissance Shenzhen Bay Hotel. The event brought together senior executives and industry experts from renowned domestic and international organizations, including the School of Integrated Circuits at Peking University, Micron, Western Digital, Solidigm, Arm, UNISOC, Intel, iFlytek, Rockchip, Silicon Motion, BIWIN, Victory Giant, Allwinner, InnoGrit, Maxio, QUANXING, Montage, Applied Materials (AMAT), Lam Research, DISCO, Skyverse, Loongson Technology, and many others. Industry leaders discussed global storage innovation and ecosystem collaboration for shared growth in the AI era. During the summit, the GMIF2024 Annual Awards were officially unveiled, and 38 key enterprises from across the industry supply chain were honored for their outstanding contributions.
Chloe Ma, Vice President of China GTM for Arm’s IoT Line of Business, delivered a keynote at the GMIF2024 Innovation Summit titled “Empowering AI: The Crucial Role of Next-Generation Storage and Memory Solutions.” Ms. Ma delved into future trends in the AI storage industry and highlighted Arm’s role, as a leading global computing platform company, in driving the industry’s transformation.
Chloe Ma, Vice President of China GTM for IoT Line of Business at Arm
Storage Systems Are Crucial in AI Technology Evolution
Ms. Ma noted that with the arrival of ChatGPT and the rapid development of large models and generative AI, we are standing at the threshold of a transformative new era in AI, a landmark moment akin to the tech world’s “iPhone moment.” From Netflix and Twitter to TikTok and ChatGPT, user adoption of new applications has accelerated at an unprecedented pace, and AI is now the driving force behind that growth. Beyond being fluent and knowledge-rich, large models are steadily advancing in logic and reasoning. More significantly, they offer a natural, user-friendly interface that lets people effortlessly explore the possibilities of a general-purpose platform.
In the evolution of computing technology, centralized and distributed computing have alternated in dominance. The current wave of AI adoption depends heavily on centralized cloud data centers to support deep training and massive data processing, which has driven the maturation of business models such as AI cloud services and token factories. At the same time, the accelerating deployment of AI models at the edge is pushing AI compute out to edge devices. This trend first emerged on mainstream computing platforms such as smartphones and PCs, which run small language models on integrated AI-capable hardware such as CPUs, GPUs, and NPUs. Meanwhile, in emerging edge computing fields like autonomous driving and robotics, the application of large AI models is driving industries toward automation and intelligence, advancing the development of new productive forces.
Furthermore, Ms. Ma pointed out that memory has become a bottleneck for large models. Model intelligence depends not only on compute power but also on the scale of datasets and the number of parameters, both of which hinge on data storage and transfer. Memory and storage are therefore crucial for AI computing from cloud to edge: to keep AI compute engines running at full speed, memory and storage systems must feed them data efficiently, reducing AI training time and improving inference speed. As a computing platform company, Arm holds a unique position in the AI pipeline, supporting the requirements of a wide range of AI servers and storage controllers.
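To make the bandwidth argument concrete, here is a minimal back-of-envelope sketch (an illustration, not figures from the keynote): during autoregressive decoding, each generated token streams roughly the full set of model weights from memory, so token throughput is capped at memory bandwidth divided by model size. The model size, weight precision, and bandwidth below are assumed values.

```python
# Back-of-envelope estimate of memory-bandwidth-bound LLM decode speed.
# Illustrative only: the model size, precision, and bandwidth figures
# below are assumptions, not numbers cited in the keynote.

def decode_tokens_per_second(num_params: float,
                             bytes_per_param: float,
                             memory_bandwidth_gbs: float) -> float:
    """Each generated token streams roughly all model weights from memory
    once, so decode speed is bounded by bandwidth / model size."""
    model_bytes = num_params * bytes_per_param
    bandwidth_bytes_per_s = memory_bandwidth_gbs * 1e9
    return bandwidth_bytes_per_s / model_bytes

if __name__ == "__main__":
    # Assumed example: a 7B-parameter model with 8-bit weights (~7 GB)
    # served from memory sustaining ~100 GB/s of bandwidth.
    rate = decode_tokens_per_second(7e9, 1, 100)
    print(f"~{rate:.1f} tokens/s upper bound")  # roughly 14 tokens/s
```

Under these assumptions the ceiling is about 14 tokens per second regardless of how much raw compute is available, which is why faster memory and storage translate directly into faster AI inference.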
Leading IP Computing Platform Supports Emerging AI Applications and Workloads from Cloud to Edge
As a computing platform company with over 30 years of history, Arm started in Cambridge, UK, and after years of development completed its second IPO last September. Within just one year of listing on Nasdaq, its market value exceeded $150 billion. Arm technologies are rapidly emerging as a dominant foundation for AI computing and are applied virtually everywhere data flows, from cloud to edge, including servers, storage, controllers, smart NICs, and edge devices.
Regarded as the world’s most pervasive computing platform, Arm carries a wide range of emerging AI applications and workloads from cloud to edge, from Bluetooth earphones capable of multi-language translation to Ray-Ban Meta AI smart glasses, supporting high-performance real-time decision-making at low power. In cloud data centers, servers built on the Arm architecture, such as AWS Graviton, NVIDIA Grace Hopper, Google Axion, and Microsoft Cobalt, are favored for their high performance, low power consumption, and strong security.
Nearly 30 Years of Engagement in Storage Empowers Industry Partners' Innovative Development
Arm has invested continuously in the storage sector for nearly 30 years. Since entering the storage controller space with its investment in Palmchip in 1997, shipments of storage devices built on Arm architecture and platforms have reached close to 20 billion units. Arm’s Cortex-R series real-time processors, with their low interrupt latency and fast real-time response, have become the preferred choice for many storage processors; Cortex-M series embedded processors support custom instructions for back-end flash and media control; and Cortex-A series application processors deliver high-performance processing with high-throughput pipeline designs, along with ML data processing and operating system ecosystem support.
Ms. Ma also introduced Arm’s latest real-time processor, the Cortex-R82, designed specifically for the storage market: it supports a larger address space and includes an optional MMU, allowing it to run rich operating systems such as Linux. The latest Ethos-U85 NPU can efficiently support Transformer networks. Additionally, Arm system IP such as the CMN interconnect can be used to support Arm servers and disaggregated memory pools.
Ms. Ma emphasized that Arm’s success is built on the support of its partners. For example, Solidigm has launched SSD products optimized for either capacity or performance to meet AI’s varied storage needs; Silicon Motion has adopted Arm processors across its product line to achieve strong security, low power consumption, and high performance in storage control; and ScaleFlux, a leading startup in computational storage, plans to adopt the Cortex-R82 in its next-generation enterprise SSD controllers and to explore new applications in computational memory.
Arm Received the 2024 Annual "Outstanding Industry Promotion Award"
In closing, Ms. Ma warmly welcomed more emerging companies in the storage field to join Arm’s ecosystem, explore the boundless possibilities of AI storage, and build a smarter future together. The keynote not only showcased Arm’s strategy and technical strength in AI storage but also painted a hopeful picture of an AI-driven future. As Arm continues to advance technological innovation and ecosystem development, we can expect to see even more exciting achievements emerge.