Tina Tsou

Kubernetes, High-Availability, Fault Tolerance, Cloud-Native Applications

Tina Tsou, InfiniEdge AI Project Lead, is a recognized leader in open source software, cloud infrastructure, and edge computing. She chairs the Kubernetes Edge Day events under the Cloud Native Computing Foundation (CNCF) and serves as the Board Chair of LF Edge. Previously, Tina was a Domain Expert in Connectivity at Philips Lighting. An advocate for open source technology, Tina often speaks at industry events and mentors emerging tech talent.

The Case for Open Source in Autonomous AI Systems at the Edge

In the modern era of artificial intelligence, the quest to create truly autonomous systems has taken center stage. This talk will delve into the intricate realm of autonomous AI systems and the crucial role open source plays in their advancement. We will also make the case for why it is important to develop these capabilities at the edge rather than relying solely on the cloud, for reasons of safety, efficiency, and privacy.

By leveraging open-source tools and frameworks, developers can unlock a plethora of opportunities to create AI systems that make decisions, learn from experience, and evolve over time. This session will not only highlight the benefits of integrating open source into the development of autonomous edge AI but also showcase real-world examples and case studies that demonstrate the transformative impact of open source on AI innovation at the edge and beyond.

Kubernetes for Edge Computing: A Guide to Building Resilient and Scalable Applications

In this session, we will delve deep into best practices for implementing edge computing solutions on Kubernetes. As edge computing becomes increasingly important in today's IoT-driven world, mastering the techniques to deploy and manage workloads effectively at the edge is crucial. We will explore the challenges of edge computing, including network instability, hardware diversity, and geo-distribution, and demonstrate how Kubernetes can effectively manage these issues. With real-world examples and case studies, we will illustrate how Kubernetes can power robust edge computing systems and what future trends to expect in this domain. Attendees can look forward to gaining a comprehensive understanding of the best strategies and techniques for Kubernetes-based edge computing deployments and how to prepare for the future of edge computing.
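As a taste of the techniques this session covers, a Deployment can tolerate flaky edge nodes and spread replicas across geographically distributed zones. This is a minimal sketch, not material from the talk: the taint key `edge.example.com/flaky-network` and the image name are hypothetical placeholders, while `topology.kubernetes.io/zone` is a standard well-known Kubernetes node label.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-sensor-gateway
spec:
  replicas: 6
  selector:
    matchLabels:
      app: edge-sensor-gateway
  template:
    metadata:
      labels:
        app: edge-sensor-gateway
    spec:
      # Tolerate a (hypothetical) taint applied to edge nodes with unreliable
      # links, evicting only after a long grace period to ride out short outages.
      tolerations:
        - key: edge.example.com/flaky-network
          operator: Exists
          effect: NoExecute
          tolerationSeconds: 300
      # Spread replicas evenly across geographic zones.
      topologySpreadConstraints:
        - maxSkew: 1
          topologyKey: topology.kubernetes.io/zone
          whenUnsatisfiable: ScheduleAnyway
          labelSelector:
            matchLabels:
              app: edge-sensor-gateway
      containers:
        - name: gateway
          image: registry.example.com/sensor-gateway:1.0  # placeholder image
          resources:
            requests:
              cpu: "100m"
              memory: 128Mi
            limits:
              memory: 256Mi
```

The long `tolerationSeconds` and `whenUnsatisfiable: ScheduleAnyway` choices reflect a common edge trade-off: prefer keeping workloads placed through brief disconnections over aggressive rescheduling.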

Harnessing AI for Smarter Networking: Strategies and Innovations

In this session, we will explore the transformative role of AI in networking. We’ll discuss cutting-edge AI strategies that are reshaping network management, security, and efficiency. The talk will cover practical applications of AI in predictive analytics, automated troubleshooting, and network optimization. We aim to provide insights on leveraging AI for more intelligent, resilient, and efficient network infrastructures.

Gen AI at the Edge: How Cloud Native Technologies Enable the Next Wave of Intelligent Applications

Generative AI (Gen AI) is a branch of AI that can create novel and realistic content, such as text, images, audio, and video. Gen AI has many potential applications in various domains, but also poses significant challenges for edge computing. Gen AI models, especially LLMs, require high-performance computing, large memory, and massive data, which are not always available at the edge. How can cloud native technologies, such as Kubernetes, containers, and microservices, help overcome these challenges and enable Gen AI at the edge? What are the trade-offs between cloud and edge for Gen AI? And what are the benefits and opportunities for the cloud native and edge computing ecosystems?
Join us for this panel to discuss the state and future trends of Gen AI at the edge, the best practices and tools for cloud native Gen AI development and deployment, and the challenges and solutions for optimizing the performance, efficiency, and security of Gen AI models on different edge devices.
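One of the trade-offs the panel touches on can be made concrete: an edge inference pod typically pins its memory and accelerator budget explicitly, since constrained edge nodes cannot absorb overcommit the way large cloud nodes can. A hedged sketch, not material from the panel: the image and host path are placeholders, and `nvidia.com/gpu` assumes the NVIDIA device plugin is installed on the node.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: llm-inference
spec:
  containers:
    - name: llm
      image: registry.example.com/llm-server:int4  # placeholder: a quantized-model image
      resources:
        requests:
          memory: 8Gi
        limits:
          memory: 8Gi          # equal request/limit avoids OOM surprises on small nodes
          nvidia.com/gpu: 1    # extended resource exposed by the NVIDIA device plugin
      volumeMounts:
        - name: model-cache
          mountPath: /models
  volumes:
    - name: model-cache
      hostPath:
        path: /var/cache/models  # reuse model weights already present on the edge node
```

Caching weights locally via a `hostPath` volume illustrates the cloud-versus-edge tension in the abstract: large models are expensive to pull over unstable links, so edge deployments often trade cloud elasticity for local persistence.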

Towards a New Era of Edge Computing - Open Software on Open Hardware

The surge of edge computing is propelled by an increase in edge applications and devices, highlighting the benefits of low latency, enhanced security, and real-time responses. LF Edge aims to establish an open, interoperable software framework for edge computing, independent of hardware, silicon, cloud, or operating system. Meanwhile, open-source hardware platforms are gaining ground in IoT and embedded systems, furthering the expansion of edge computing.

This blend of open software and hardware can herald a new era of edge computing - Open Software on Open Hardware. While promising to expedite the adoption of edge computing, it also introduces new challenges within the edge ecosystem.

We'd like to discuss the future of this open edge platform, including its current state, benefits, challenges, and open problems. We also need to determine the direction for the edge community in embracing new technologies like WebAssembly, developing potential solutions, running edge-native applications, and empowering edge use cases like edge AI. Let's explore this exciting new era of open edge platforms together.

