[su_note note_color="#e4e4e4" text_color="#000" class="hvr-grow"]Nvidia's invention of the GPU in 1999 sparked the growth of the PC gaming market and has redefined modern computer graphics, high-performance computing and AI. The company's pioneering work in accelerated computing and artificial intelligence is reshaping trillion-dollar industries, such as transportation, healthcare and manufacturing, and fueling the growth of many others.[/su_note]
Nvidia is one of the leading forces in machine learning and artificial intelligence. The company has been building core technologies that enable data scientists and developers to leverage AI and ML. We hosted Kevin Deierling, SVP of NVIDIA Networking, to talk about this work and to take a deep dive into the Data Processing Unit (DPU), which he believes is at the core of AI/ML. Here are some of the topics we covered in this episode:
- The work Nvidia is doing in the networking and AI/ML space.
- What role will the data center, and the ability to process data quickly and efficiently, play in the cloud-native world?
- How does Kevin define a Data Processing Unit?
- What is driving the emergence and adoption of the Data Processing Unit?
- How does Kevin define the data center in 2021, when everyone is moving to the cloud and procuring GPUs and DPUs can be a cost and time challenge?
- What does the architecture of a DPU look like?
- How is Nvidia planning to put this technology into the hands of developers so they can experiment with it?
- What kinds of use cases are there for DPUs? Who are the early adopters?
- How are DPUs leveraging cloud-native technologies like Kubernetes?