AI is transforming the way organizations work globally. It is changing how everyone thinks about compute at every scale, from workstations to data centers and edge devices. Intel’s strategy focuses heavily on AI as the next milestone to drive innovation and to enable other organizations to move their projects beyond experimentation.
Open source solutions lower the barrier to entry and ensure portability across different platforms. Organizations are looking not only to get started with open source projects but also to run their AI initiatives in production. Furthermore, since hardware is often a limited resource, making the most of its capabilities is always a priority. Intel’s investment in open source frameworks and projects such as vLLM and OPEA is shifting the focus from hardware alone to hardware and software that are seamlessly integrated. This helps customers run their AI workloads at any scale and fit into the broader open source ecosystem, integrating with other tools such as Charmed Kubeflow or Data Science Stack.
During the talk, you will learn more about the transformation that Intel went through with the rise of AI/ML. You will discover how the company has shifted its priorities to enable users of all knowledge levels, and organizations of all sizes, to run their workloads and take their projects to production.
🎤 Speaker: Bill Pearson, VP, AI Software Engineering, Intel
Learn more at https://canonical.com/solutions/ai and https://canonical.com/data