Welcome to Red Hat AI’s second What’s New and What’s Next, presented by the Red Hat AI Product Management team. We had some audio difficulties at the beginning, but the sound improves as the session goes on; we’re learning as we go.
Red Hat AI is a portfolio of products and services designed to accelerate the development and deployment of AI solutions across hybrid cloud environments.
What’s New and What’s Next in Red Hat AI is a quarterly video series designed to keep customers and AI enthusiasts informed about the latest advancements in the Red Hat AI portfolio. Each session covers new capabilities, technical preview features, and the strategic vision for Red Hat AI. Like and subscribe to stay up to date on the latest features and developments in Red Hat AI.
In this second product update, we explore the 2025 roadmap for Red Hat AI and the latest releases of Red Hat Enterprise Linux AI (RHEL AI) 1.5 and Red Hat OpenShift AI (RHOAI). You’ll gain insight into the vision behind the portfolio and discover powerful new features that can help you launch and scale your AI strategy, such as Red Hat AI’s third-party model validation initiative.
Whether you’re a developer, data scientist, or IT professional, this video has something for you.
Head over to redhat.com to try out Red Hat Enterprise Linux AI (https://www.redhat.com/en/products/ai/enterprise-linux-ai) and Red Hat OpenShift AI (https://www.redhat.com/en/products/ai/openshift-ai), and to learn about our newest product, Red Hat AI Inference Server (https://www.redhat.com/en/products/ai/inference-server).
00:00 Welcome
07:18 Areas of focus for 2025 and beyond
11:42 Inferencing
13:38 Aligning AI to specific enterprise use cases
17:10 Agentic AI applications: Trends and stack
20:43 Inference-time scaling, customization of reasoning models, and customization of instruct models
23:47 RHEL AI 1.5 and future releases
25:26 Llama 3.3 70B Instruct as Teacher Model
26:07 Extending multilingual capabilities
27:23 Subset selection
30:12 Red Hat AI model customization
31:48 Red Hat AI Accelerator support: NVIDIA, AMD, IBM
36:06 Red Hat OpenShift AI: What’s new and what’s next
36:46 Support for AMD GPUs
37:37 Run NVIDIA NIMs inside Red Hat OpenShift AI
39:10 Kubeflow Trainer-based multi-node and multi-GPU training capabilities
41:12 Distributed InstructLab on Red Hat OpenShift AI
41:59 Guardrails Orchestrator GA
42:57 Hardware profiles
44:33 LLM Compressor
46:54 Model catalog
48:02 Feature store
49:26 Platform-native RAG/agentic AI with Llama Stack
51:29 GPU-as-a-Service