Wednesday, June 14th
2:00-3:00 PM
B-101: Data Center Applications (Data Centers Track)
Paper Title: SmartNICs/DPUs Save Millions in Data Center Power Costs

Paper Abstract: Large data centers rack up bigger power bills every year. To meet organizations' ever-increasing needs to solve problems faster, process larger and more complex data, and deploy more AI, data centers keep adding power-hungry servers, switches, and storage. One way to both increase server throughput and reduce energy consumption is to use accelerators such as DPUs or SmartNICs aimed specifically at infrastructure tasks, including line-rate packet switching, data encryption, and feeding AI/DL models. Tests by Ericsson, NVIDIA, Red Hat, and VMware showed that SmartNICs and DPUs make data centers more energy-efficient by reducing rack space, power consumption, and cooling costs. A typical data center can save millions of dollars a year.
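The dollar figure above follows from simple arithmetic on server counts, power draw, and electricity price. The sketch below shows one way such an estimate can be built; every input value (server count, power per server, infrastructure-core fraction, PUE, DPU power, electricity price) is an illustrative assumption, not a figure from the Ericsson, NVIDIA, Red Hat, or VMware tests cited in the abstract.

```python
# Back-of-envelope estimate of annual savings from offloading infrastructure
# tasks (packet switching, encryption, storage services) to DPUs/SmartNICs.
# All numeric inputs are illustrative assumptions, not measured results.

SERVERS = 20_000                  # servers needed without DPU offload (assumed)
INFRA_CORE_FRACTION = 0.25        # share of CPU cycles spent on infrastructure work (assumed)
SERVER_POWER_W = 500              # average power draw per server, watts (assumed)
DPU_POWER_W = 75                  # added power per DPU card, watts (assumed)
PUE = 1.5                         # power usage effectiveness: cooling/facility overhead (assumed)
ELECTRICITY_USD_PER_KWH = 0.12    # electricity price (assumed)
HOURS_PER_YEAR = 24 * 365

# If DPUs absorb the infrastructure work, the freed CPU cores can run
# application workloads, so fewer servers deliver the same application throughput.
servers_saved = SERVERS * INFRA_CORE_FRACTION
servers_remaining = SERVERS - servers_saved

# Net power saved: the removed servers' draw (scaled by PUE to include cooling),
# minus the DPUs' own draw on the servers that remain (also scaled by PUE).
power_saved_w = (servers_saved * SERVER_POWER_W * PUE
                 - servers_remaining * DPU_POWER_W * PUE)

energy_saved_kwh = power_saved_w / 1000 * HOURS_PER_YEAR
annual_savings_usd = energy_saved_kwh * ELECTRICITY_USD_PER_KWH

print(f"Servers avoided:        {servers_saved:,.0f}")
print(f"Energy saved per year:  {energy_saved_kwh:,.0f} kWh")
print(f"Estimated savings/year: ${annual_savings_usd:,.0f}")
```

With these assumed inputs the estimate lands in the low millions of dollars per year; the point of the sketch is the structure of the calculation, not the specific numbers, which will vary with workload mix, hardware, and electricity rates.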

Paper Author: John Kim, Director of Storage Marketing, NVIDIA

Author Bio: John Kim is director of storage and DPU marketing in NVIDIA’s networking division. He helps customers and vendors benefit from high-performance network connections, SmartNIC offloads, and DPU acceleration, especially for storage, big data, AI and cybersecurity. A frequent blogger, conference speaker, and webcast presenter, John was chair of SNIA’s Networking Storage Forum. Before joining NVIDIA, he worked in solution marketing, product management, and alliances at NetApp and EMC. He earned a BA in economics from Harvard.