Talk | Technical | English

There’s no question that real-time capabilities are becoming increasingly important for supporting AI use cases in application stacks. At the heart of AI reference architectures is the feature store: the bridge between your data and your machine learning models.

Over the past year, companies such as Netflix, Uber, DoorDash, Airbnb, and Spotify have published their AI stack architectures, in which Redis consistently serves as the online feature store; combined with a model store and the RedisAI inference engine, it addresses the need to deliver real-time AI services. In the demo, we walk through the code showing how to use Redis as a feature store and inference engine to deploy real-time AI applications.
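
The demo code itself is shown in the session; the following is only a rough, minimal sketch of the pattern using the redis-py client. The connection settings, key layout (features:user:42), feature names, and the churn_model model key are illustrative assumptions, and the inference step assumes the RedisAI module is loaded with a model already stored under that key.

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Ingestion side: write the latest feature values for an entity as a hash.
r.hset("features:user:42", mapping={
    "avg_order_value": 27.5,
    "orders_last_7d": 3,
    "minutes_since_last_session": 12,
})

# Serving side: low-latency read of the feature vector at request time.
features = r.hgetall("features:user:42")
vector = [float(features[name]) for name in
          ("avg_order_value", "orders_last_7d", "minutes_since_last_session")]

# With RedisAI loaded, the same connection can run inference on the stored model.
r.execute_command("AI.TENSORSET", "in:42", "FLOAT", 1, 3, "VALUES", *vector)
r.execute_command("AI.MODELEXECUTE", "churn_model",
                  "INPUTS", 1, "in:42", "OUTPUTS", 1, "out:42")
score = r.execute_command("AI.TENSORGET", "out:42", "VALUES")
print(score)

The point of the pattern is that feature lookup and model execution happen in the same low-latency data layer, so the serving path avoids a separate round trip to an external model server.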
