Whether it’s for training or for inference, data is the bedrock of AI applications. Today, inference workloads get the most buzz thanks to the explosion of interest in chatbots, but inference is also essential for real-time decision-making in Internet of Things (IoT) devices, mobile apps, smart sensors, and more.
Powering modern AI workloads is no small feat: it requires ultra-low latency, high availability, and real-time data processing and synchronization. What’s already a tall order is made all the more challenging in distributed…