Inference Time Scaling for Enterprises | No Math AI

In Episode 3 of No Math AI, Red Hat CEO Matt Hicks and CTO Chris Wright join hosts Akash Srivastava and Isha Puri to explore what it really takes to scale large language model inference in production. From cost concerns and platform orchestration to the launch of llm-d, they break down the transition from static models to dynamic, reasoning-heavy applications and how open source collaboration is making scalable AI a reality for enterprise teams.

RSS feed: https://feeds.simplecast.com/c1PFREqr
Spotify: https://open.spotify.com/show/7Cpcy42VZxmQehTCDQ9oBe?si=c93beaa3cd0a47ef

For more episodes of No Math AI, subscribe to @redhat

Date: June 12, 2025