From Vectors to Pods: Integrating AI with Cloud Native
The rise of AI is challenging long-standing assumptions about running cloud native workloads. AI demands hardware accelerators, vast amounts of data, efficient scheduling and exceptional scalability. Although Kubernetes remains the de facto choice, feedback from end users and collaboration with researchers and academia are essential to drive innovation, address gaps and integrate AI into the cloud native ecosystem.
This panel features end users, AI infra researchers and leads of the CNCF AI and Kubernetes device management working groups focussed on:
- Expanding beyond LLMs to explore AI for cloud native workload management, memory usage and debugging
- Challenges with scheduling and scaling of AI workloads from the end user perspective
- OSS Projects and innovation in AI and cloud native in the CNCF landscape
- Improving resource utilisation and performance of AI workloads
The next decade of Kubernetes will be shaped by AI. We don’t yet know what this will look like — come join us to discover it together.