Speaker

Nnenna Ndukwe

Developer Relations at Unleash

Boston, Massachusetts, United States

Nnenna Ndukwe is a Developer Advocate and Software Engineer who is enthusiastic about DevOps and AI. She has software engineering experience across medtech, fintech, and media tech. She studied Computer Science at Boston University and is a Resilient Coders alum. She is a proud member of Women Defining AI, Tech Ladies, CNCF, the Open Source Security Foundation, Cyber Women of Boston, and more. She is a Google Women Techmakers Scholar, an NSBE Engineering award winner, and an international speaker. She shares her journey with the world through content creation, volunteering, and speaking at tech events.

Area of Expertise

  • Information & Communications Technology
  • Region & Country

Topics

  • DevOps
  • Artificial Intelligence
  • Machine Learning
  • Python Programming Language
  • Infrastructure as Code
  • Security
  • Software Development
  • Software Engineering
  • python
  • DevSecOps
  • Google AI
  • LLMs
  • LLMOps

Democratizing AI: Building Resilient and Secure Open-Source LLMs for Digital Sovereignty

This presentation explores the critical role of open-source LLMs in achieving digital sovereignty. We will delve into the technical challenges and opportunities in building secure and resilient open-source LLMs, focusing on practical strategies for data privacy, model security, and community governance. We will examine case studies showcasing successful community-driven initiatives and discuss best practices for fostering collaboration and knowledge sharing within the open-source ecosystem.

When Feature Flags and AI Collide: Building Intelligent Progressive Delivery Systems

Traditional feature flags are binary decisions, but what if they could learn and adapt? In this session, we'll explore how to build an intelligent feature management system that uses machine learning to make data-driven deployment decisions. You'll learn how to combine PyTorch, Prometheus, and Kubernetes to create a system that automatically controls feature rollouts based on real-time metrics and learned patterns.
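
To make the idea concrete, here is a minimal sketch of the kind of control loop described above. It assumes a Prometheus instance at a local URL and a hypothetical metric name, uses a simple threshold where the talk would use a learned model, and leaves the flag update as a placeholder rather than any real feature-flag admin API.

# Illustrative sketch only: poll an error-rate metric from Prometheus and
# nudge a feature's rollout percentage up or down. The metric name, URL,
# and set_rollout_percentage are assumptions, not a real implementation.
import requests

PROMETHEUS_URL = "http://localhost:9090"  # assumed local Prometheus instance

def current_error_rate(feature: str) -> float:
    """Query Prometheus for the 5-minute error rate of a feature (hypothetical metric)."""
    query = f'sum(rate(http_request_errors_total{{feature="{feature}"}}[5m]))'
    resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": query})
    resp.raise_for_status()
    result = resp.json()["data"]["result"]
    return float(result[0]["value"][1]) if result else 0.0

def set_rollout_percentage(feature: str, percent: int) -> None:
    """Placeholder: call your feature-flag platform's admin API here."""
    print(f"Setting {feature} rollout to {percent}%")

def adjust_rollout(feature: str, current_percent: int) -> int:
    """Naive policy: widen the rollout while errors stay low, back off otherwise."""
    if current_error_rate(feature) < 0.01:   # healthy: expand gradually
        new_percent = min(current_percent + 10, 100)
    else:                                    # degraded: pull back hard
        new_percent = max(current_percent - 25, 0)
    set_rollout_percentage(feature, new_percent)
    return new_percent

In the full system described in the session, the fixed threshold would be replaced by a model trained on historical rollout outcomes and richer metrics.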

Through live demonstrations and practical examples, you'll discover how to evolve your feature management from simple toggles to an intelligent system that learns from your users and system metrics. By the end of this session, you'll have the knowledge to implement ML-powered progressive delivery in your own organization.

This talk is for platform engineers, DevOps practitioners, and ML engineers who want to take their feature management to the next level. Attendees should have basic familiarity with Kubernetes and Python; experience with feature flags or ML is helpful but not required. You'll leave understanding how to combine these technologies to create more intelligent deployment strategies.

Choose Your Fighter: A Pragmatic Guide to AI Mechanisms vs Automated Ops

Engineers get bombarded with industry noise about integrating AI into all software/platforms. But not every tool needs to leverage AI, so how do we think pragmatically about solid use cases for them? This talk emphasizes clear distinctions between automation and AI mechanisms to encourage implementations that truly solve problems without over-engineering. We'll explore mechanisms of both paradigms, dissect their strengths and limitations, and choose the right tool for your use case.

This talk is geared toward software architects, technical leads, and senior engineers who make strategic technology decisions. It is also a great fit for general software engineers, DevOps practitioners, and AI/ML enthusiasts who are bombarded with industry noise about integrating AI into all software and platforms. Sometimes AI isn't the solution; automated workflows are. Not every tool needs to leverage AI, so how do we pause to think more strategically and pragmatically about where it genuinely belongs? This talk draws clear delineations between automation and AI mechanisms to encourage implementations of each for the use cases they actually solve, without over-engineering.

A basic understanding of automation tools and DevOps practices, familiarity with AI/ML concepts (no deep expertise required), and experience with distributed systems and application architecture are helpful for following along with this talk.

Key Takeaways:
- Framework for evaluating automation vs. AI solutions for specific use cases
- Understanding of integration patterns for hybrid solutions
- Risk assessment strategies for each approach
- Best practices for implementation and maintenance

From DevOps to MLOps: Bridging the Gap Between Software Engineering and Machine Learning

Both DevOps and MLOps aim to streamline the development and deployment lifecycle through automation, CI/CD, and close collaboration between teams. But there are key differences in the purposes and applications of DevOps and MLOps. This talk demonstrates how your existing DevOps expertise creates a strong foundation for understanding and implementing MLOps practices. We'll explore how familiar concepts like CI/CD, monitoring, and automated testing map to ML workflows, while highlighting the key differences that make MLOps unique.

Through practical examples, we'll show how software engineers can apply their current skills to ML systems by extending DevOps practices to handle model artifacts, training pipelines, and feature engineering. You'll learn where your existing tools and practices fit in, what new tools you'll need, and how to identify when MLOps practices are necessary for your projects.
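
As a small taste of how a familiar CI gate carries over to model artifacts, here is a hypothetical validation step; the metrics file, metric names, and threshold are illustrative assumptions rather than anything prescribed by the talk.

# Illustrative sketch: a CI-style quality gate for a candidate model, analogous
# to failing a build on broken tests. File format and threshold are assumptions.
import json
import sys

def validate_model(metrics_path: str, min_accuracy: float = 0.90) -> None:
    """Fail the pipeline if the candidate model underperforms the agreed threshold."""
    with open(metrics_path) as f:
        metrics = json.load(f)  # e.g. {"accuracy": 0.93, "auc": 0.97}

    accuracy = metrics.get("accuracy", 0.0)
    if accuracy < min_accuracy:
        print(f"Model accuracy {accuracy:.3f} is below the {min_accuracy:.2f} gate; blocking promotion.")
        sys.exit(1)  # non-zero exit fails the CI job, just like a failing test suite
    print(f"Model accuracy {accuracy:.3f} passes the gate; promoting the artifact.")

if __name__ == "__main__":
    validate_model(sys.argv[1] if len(sys.argv) > 1 else "metrics.json")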

Attendees should have experience with DevOps practices and general software engineering principles. No ML or data science experience is required - we'll focus on how your existing knowledge applies to ML systems.

Prerequisites: Familiarity with CI/CD, infrastructure as code, monitoring, and automated testing. Experience with containerization (e.g., Docker) and cloud platforms is helpful but not required.

Building Local AI Agent Armies: Accelerating Developer Content Creation Through Autonomous Systems

New AI capabilities are released constantly, but it's not always clear how they can be leveraged in real-world use cases. Developer Relations teams face a unique opportunity to scale their content creation through autonomous systems. This talk explores how to build and orchestrate a fleet of specialized AI agents running on your local machine to generate high-quality technical demos, tutorials, and documentation relevant to your community and business goals. We'll dive into practical approaches for designing agent architectures, implementing effective prompting strategies, and maintaining content quality at scale.
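
As a rough sketch of the orchestration pattern, the snippet below routes content tasks to role-specialized agents that all call a locally hosted model; the endpoint URL, payload shape, model name, and agent roles are assumptions for illustration only.

# Illustrative sketch: role-specialized "agents" backed by one local model server.
# The endpoint, payload shape, and roles below are assumptions, not a real stack.
import requests

LOCAL_LLM_URL = "http://localhost:11434/api/generate"  # assumed local model server

AGENT_PROMPTS = {
    "tutorial_writer": "You write step-by-step developer tutorials. ",
    "demo_builder": "You outline runnable demo applications with setup steps. ",
    "docs_reviewer": "You review technical docs for accuracy and clarity. ",
}

def run_agent(role: str, task: str, model: str = "llama3") -> str:
    """Send the role's system prompt plus the task to the local model and return its text."""
    payload = {"model": model, "prompt": AGENT_PROMPTS[role] + task, "stream": False}
    resp = requests.post(LOCAL_LLM_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json().get("response", "")

def content_pipeline(topic: str) -> dict:
    """Chain agents: draft a tutorial, then have the reviewer critique the draft."""
    draft = run_agent("tutorial_writer", f"Draft a short tutorial on {topic}.")
    review = run_agent("docs_reviewer", f"Review this draft for accuracy:\n{draft}")
    return {"draft": draft, "review": review}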

Prerequisites:
- Basic understanding of large language models and prompt engineering
- Familiarity with container technologies (Docker, Kubernetes)
- Experience with technical content creation
- Basic understanding of CI/CD pipelines

Building with Confidence: Mastering Feature Flags in React Applications

Feature flags have become an essential tool in modern software development, enabling teams to deploy code safely, conduct A/B tests, and manage feature releases with precision. This session will take you on a journey from understanding basic feature flag implementation in React to advanced patterns used by high-performing teams. Through live coding demonstrations and real-world examples, you'll learn how to leverage feature flags to deploy confidently, experiment rapidly, and deliver value to your users continuously.

This talk is ideal for intermediate to advanced React developers, tech leads, and architects who want to implement or improve feature flag usage in their applications. Basic knowledge of React and modern JavaScript is required. Attendees will leave with a solid understanding of feature flag architecture in React applications, code templates and patterns they can implement immediately, best practices for feature flag management in production, strategies for scaling feature flags across large applications, and tools and resources for additional learning.

The New DevOps: Understanding and Implementing LLMOps

LLMOps is about applying DevOps principles to the world of large language models, with a focus on the unique challenges and opportunities presented by these powerful AI systems.

Securing the LLM Supply Chain: Research-Driven Risks and Mitigation

Developers, DevOps practitioners, and technical decision-makers are often encouraged to consider the security angle of the software supply chain. But what are the supply chain security risks specific to LLMs? Let's dive into recent research that highlights security risks across the LLM supply chain, spanning components such as data collection, model training, and deployment. You'll also learn about the upside LLMs offer in addressing software security challenges, and the mitigation strategies that will be critical to the success and safety of LLMOps.

The Economics of Open-Source LLMs: Cost-Effectiveness and Community Sustainability

This presentation explores the economic advantages of open-source LLMs, examining the cost savings associated with licensing, training, and deployment. We'll explore various funding models for sustaining open-source AI communities, discuss strategies for attracting and retaining contributors, and analyze the long-term economic benefits of promoting a thriving open-source AI ecosystem.

The presentation will cover:
- Cost comparison of open-source vs. proprietary LLMs.
- Funding models for sustainable open-source AI communities.
- Strategies for attracting and retaining contributors.
- Long-term economic benefits of a thriving open-source AI ecosystem.
- Economic impact assessments of successful open-source LLM projects.

Building a Global AI Community: Collaboration and Knowledge Sharing in Open-Source LLM Development

This presentation highlights the importance of collaboration and knowledge sharing in the open-source LLM ecosystem. We will discuss successful community-building strategies, best practices for communication and collaboration, and the crucial role of documentation, education, and mentorship in fostering a thriving and inclusive community of AI developers.

The talk will cover:
- Effective community-building strategies for open-source AI projects.
- Best practices for collaboration and knowledge sharing.
- The role of documentation, education, and mentorship.
- Strategies for fostering diversity and inclusion within the community.
- Case studies of successful community-driven open-source LLM projects.

Building Bridges, Not Just Docs: The Context-Gathering Framework for Developer Relations Excellence

This talk presents a practical framework for Developer Relations professionals based on real experience at Unleash. Drawing inspiration from industry leaders and personal insights, I'll introduce a four-pillar Context-Gathering Framework that transforms how DevRel professionals approach their role. Attendees will learn concrete strategies for product mastery, customer understanding, market positioning, and industry context—all designed to create alignment and establish meaningful feedback loops that elevate developer experience.

Developer Relations sits at the intersection of product, marketing, and engineering—making it both a challenging and rewarding discipline. In this session, I'll share my journey as a Developer Advocate at Unleash and introduce a proven Context-Gathering Framework that has transformed my approach to DevRel.
Building on Sam Julien's foundational pillars (Awareness, Education, Feedback, Community) and Kurtis Kemple's emphasis on context gathering, I'll demonstrate how a structured learning approach across four key dimensions creates exponential impact:

- Product Mastery: Moving beyond documentation to truly understand how your solution solves real-world engineering challenges
- Customer Understanding: Developing clarity about who makes decisions, who implements solutions, and what technology ecosystems they inhabit
- Market Positioning: Mapping the competitive landscape to articulate your unique value proposition effectively
- Industry Context: Connecting your product's capabilities to broader technology trends and movements

Through practical examples and lessons learned, I'll show how this framework creates alignment between technical capabilities and customer needs, facilitates more effective communication, and ultimately drives adoption. Attendees will leave with actionable strategies they can immediately apply to their DevRel practice, regardless of company size or product complexity.

Target Audience

This talk is designed for:

- Early to mid-career Developer Relations professionals looking to increase their impact
- DevRel team leads seeking frameworks to guide their strategy
- Marketing professionals transitioning into technical advocacy roles
- Engineering leaders responsible for developer experience initiatives
- Product managers interested in strengthening their connection to the developer ecosystem

Attendees will benefit most if they have some familiarity with developer marketing/relations concepts, though no specific technical expertise is required.

Women on Stage Global Conference

5 Security Best Practices for Production-Ready Containers

October 2023 Boston, Massachusetts, United States
