4 October 2023

Digital Technology Guru


The Latest Advances in AI Hardware: Nvidia, Cadence, and Ceva Release New AI Chips


A recent report from Grand View Research predicts that the AI industry will grow at a compound annual rate of around 37% through 2030. As software advancements in AI continue to accelerate, hardware companies are under pressure to keep up with the pace of innovation. In response, several major players in the industry have recently launched new AI hardware products.

Nvidia unveiled the GH200 Grace Hopper AI Superchip at its SIGGRAPH 2023 event. The new chip is designed specifically for generative AI workloads and features a dual-configuration architecture that provides increased memory capacity and bandwidth. It is powered by Nvidia’s Grace Hopper architecture and includes 144 Arm Neoverse cores and the latest HBM3e memory technology, enabling it to run models that are 3.5x larger while achieving up to eight petaflops of AI performance.
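To put the memory angle in perspective, a rough back-of-envelope sketch shows why additional HBM capacity translates directly into larger runnable models. The parameter counts and byte sizes below are illustrative assumptions for the example, not Nvidia figures.

```python
# Illustrative estimate of how much accelerator memory a model's weights
# need at different precisions. Model sizes and byte widths are assumptions
# chosen for the example, not Nvidia specifications.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

for params_b in (7, 70, 175):  # model sizes in billions of parameters
    for precision, nbytes in (("FP16", 2), ("FP8", 1)):
        gb = weight_memory_gb(params_b * 1e9, nbytes)
        print(f"{params_b}B params @ {precision}: ~{gb:.0f} GB of weights")
```

Activations, KV caches, and other runtime state add to this footprint, which is why extra on-package memory capacity and bandwidth matter so much for generative workloads.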

Cadence has introduced the eighth generation of its Xtensa LX processor family. The new LX8 processor platform targets AI system-on-a-chip (SoC) designs and offers a balance of power and performance. With improved L2 cache, optimized branch prediction, and upgraded 3D DMA transfers, the LX8 meets the requirements of edge and automotive applications. The processor is currently shipping to early-access customers and will be generally available late in the third quarter of 2023.

Ceva has developed the NeuPro-M NPU family, doubling down on generative AI hardware. The NeuPro-M utilizes heterogeneous coprocessing to enhance parallel processing within and between its internal engines, significantly increasing the system’s overall processing capability. Additional features include orthogonal memory bandwidth reduction mechanisms and a decentralized architecture for efficient NPU management. The NPU delivers power efficiency of 350 TOPS per watt and scales from 4 TOPS up to 256 TOPS per core.
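Taking those quoted figures at face value, the implied power draw across the stated per-core range is easy to sketch. This assumes the 350 TOPS-per-watt efficiency held at every operating point, which real silicon would not guarantee; the calculation is purely illustrative.

```python
# Back-of-envelope power estimate from the quoted figures: if a core
# sustained 350 TOPS per watt (an assumption -- peak efficiency rarely
# holds across the whole operating range), its power draw would be:

EFFICIENCY_TOPS_PER_WATT = 350  # quoted figure, treated here as a constant

for tops in (4, 64, 256):  # endpoints and a midpoint of the stated per-core range
    watts = tops / EFFICIENCY_TOPS_PER_WATT
    print(f"{tops:>3} TOPS -> ~{watts * 1000:.0f} mW")
```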

These advancements in AI hardware are driven by the increasing demand for edge AI devices. According to research from ABI Research, edge AI shipments are expected to grow at a compound annual growth rate (CAGR) of 22.4% from 2023 to 2028. By 2028, an estimated 6.5 billion edge AI units will be shipped annually. Companies like Nvidia, Cadence, and Ceva are upgrading their AI processing platforms to meet this growing demand while improving performance and power efficiency.
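As a quick sanity check on those projections, compounding backwards from the 2028 figure at the stated CAGR gives the implied 2023 baseline. This is simply arithmetic on the numbers quoted above, not an ABI Research estimate.

```python
# Sanity-check the growth figures quoted above: working backwards from
# 6.5 billion units in 2028 at a 22.4% CAGR gives the implied 2023 baseline.
# Arithmetic on the quoted numbers only, not an ABI Research figure.

units_2028 = 6.5e9   # projected annual edge AI shipments in 2028
cagr = 0.224         # compound annual growth rate, 2023-2028
years = 5

implied_2023 = units_2028 / (1 + cagr) ** years
print(f"Implied 2023 shipments: ~{implied_2023 / 1e9:.2f} billion units")
# -> roughly 2.4 billion units shipped annually in 2023
```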

Sources:
– Grand View Research
– Nvidia
– Cadence
– Ceva
– ABI Research