![[vLLM Office Hours #37] InferenceMAX & vLLM - November 13, 2025](https://i4.ytimg.com/vi/kK6Ta4OZiJE/hqdefault.jpg)
Join us for our vLLM Office Hours on November 13, 2025, at 2:00 PM EST! These bi-weekly sessions are your chance to stay current with the vLLM ecosystem, ask questions, and hear directly from contributors and power users.
This week’s special topic: InferenceMAX – Open Source Inference Benchmarking
We’ll kick off with the regular vLLM project update from Michael Goin, followed by Kimbo Chen and Cam Quilici from SemiAnalysis, who will break down InferenceMAX, an open-source continuous benchmarking framework that sweeps popular LLMs across hardware and software stacks to track real-world throughput, latency, and cost efficiency.
Want to join the discussion live on Google Meet? Get a calendar invite by filling out this form: https://red.ht/office-hours