Description
Network engineers have a good grasp of how to build data center networks to support all kinds of apps, from traditional three-tier designs to applications built around containers and microservices. But what about building a network fabric to support AI?
Today on the Tech Bytes podcast, sponsored by Nokia, we talk about the special requirements for building a data center fabric for AI use cases such as training and inference. While Ethernet is still likely to be involved, an AI fabric has to be optimized to meet particular demands.
Our guest to walk us through these requirements is Clayton Wagar, Principal Consulting Engineer at Nokia.
Show Links:
Enterprise cloud networks – Nokia