Speaker

Akshay Mittal

Staff Software Engineer | PhD Researcher in Cloud-Native AI/ML | Passionate About Scalable & Intelligent Solutions

Austin, Texas, United States


Akshay Mittal is an accomplished IT professional with ten years of experience as a full-stack developer and a strong interest in leadership. Currently, Akshay works at PayPal, focusing on building scalable, innovative solutions within a high-performing technical environment. He is also pursuing a part-time PhD at the University of the Cumberlands, researching cloud-native technologies utilizing AI/ML methods.

Having extensive experience as a consultant across diverse teams, Akshay quickly adapts to emerging technologies and is skilled in mastering new challenges. He holds certifications in AWS and GCP and actively mentors aspiring technologists. Akshay regularly contributes to the tech community through speaking engagements on cloud-native development and integrating AI/ML in modern software solutions. His passion lies in fostering community growth, technology leadership, and continuous learning.

Area of Expertise

  • Finance & Banking
  • Information & Communications Technology

Topics

  • AI
  • ML
  • Cloud-Native
  • Kubernetes
  • Artificial Intelligence
  • Security & Compliance
  • Software Engineering
  • DevOps
  • Cloud-Native Technologies
  • Cybersecurity
  • AI and Cybersecurity

AI-Powered Security: Strengthening Cloud-Native Applications Against Emerging Threats

As organizations rapidly adopt cloud-native technologies, security teams face evolving threats that traditional approaches struggle to mitigate. AI and machine learning are transforming application security—automating vulnerability detection, enhancing threat intelligence, and improving incident response. But how do we ensure AI itself is secure?

This session explores the intersection of AI and cloud application security, covering:

- How AI/ML-driven security tools are reshaping DevSecOps and threat detection (see the sketch after this list).
- The risks of AI-powered attacks (e.g., adversarial AI, prompt injection) and mitigation strategies.
- Real-world case studies on securing AI-assisted development pipelines.
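
As a concrete illustration of the first bullet, here is a minimal, self-contained sketch of ML-driven anomaly detection over login events using scikit-learn's IsolationForest. The feature set, thresholds, and synthetic data are illustrative assumptions, not material from the session itself.

```python
# Minimal sketch: flagging anomalous login events with an Isolation Forest.
# The features (request rate, failed-login count, distance from usual location)
# are illustrative assumptions; a real pipeline would extract them from audit
# logs or a SIEM feed.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic "normal" traffic: [requests/min, failed logins, km from usual location]
normal = np.column_stack([
    rng.normal(30, 5, 500),
    rng.poisson(1, 500),
    rng.normal(10, 3, 500),
])

# A few suspicious events: bursts of failed logins from far-away locations
suspicious = np.array([[300, 40, 8000], [250, 35, 9500]])

model = IsolationForest(contamination=0.01, random_state=42).fit(normal)

events = np.vstack([normal[:3], suspicious])
for features, label in zip(events, model.predict(events)):
    verdict = "ANOMALY" if label == -1 else "ok"
    print(verdict, features)
```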

AI-Powered DevOps in Cloud App Modernization: Automating Deployments, Monitoring, and Resilience

As organizations accelerate cloud app modernization, one of the most transformative enablers is AI-powered DevOps. From automating deployments to intelligent monitoring and self-healing infrastructure, AI is rapidly reshaping how modern cloud applications are built, deployed, and managed.

In this 30-40 minute session, we will explore:

- The evolving role of AI in DevOps pipelines to accelerate cloud app delivery and infrastructure management.
- AI-driven deployment automation, monitoring, and incident management for modern cloud applications (see the sketch after this list).
- How generative AI models are enabling code generation, auto-remediation, and intelligent CI/CD pipelines.
- Case studies from cloud platforms (AWS, Azure, GCP) showcasing AI-powered DevOps in action.
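
To make the deployment-automation and incident-management bullet concrete, here is a minimal sketch of an auto-remediation loop: compare a freshly deployed service's error rate against its recent baseline and roll it back if it spikes. The metric source, deployment name, and thresholds are hypothetical, and a real setup might replace the simple statistical check with an ML-based detector.

```python
# Illustrative sketch of auto-remediation: if the error rate of a freshly deployed
# service spikes well above its recent baseline, roll the deployment back.
# fetch_error_rate() and the deployment name "payments-api" are hypothetical
# stand-ins for a real metrics backend (e.g., Prometheus) and a real workload.
import statistics
import subprocess

def fetch_error_rate(window_minutes: int = 1) -> float:
    """Placeholder for a query against your metrics backend."""
    raise NotImplementedError("wire this to Prometheus, Datadog, etc.")

def should_roll_back(history: list[float], current: float, sigma: float = 3.0) -> bool:
    """Simple baseline check: current error rate exceeds mean + sigma * stdev."""
    if len(history) < 5:
        return False
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1e-9
    return current > mean + sigma * stdev

def remediate(deployment: str = "payments-api", namespace: str = "prod") -> None:
    # Equivalent to running: kubectl rollout undo deployment/payments-api -n prod
    subprocess.run(
        ["kubectl", "rollout", "undo", f"deployment/{deployment}", "-n", namespace],
        check=True,
    )

if __name__ == "__main__":
    baseline = [0.4, 0.5, 0.3, 0.6, 0.5, 0.4]   # % of 5xx responses, per minute
    current = 7.8                               # post-deploy spike
    if should_roll_back(baseline, current):
        remediate()
```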

AI-Powered Cloud-Native DevOps: Accelerating Automation, Security, and Resilience

Integrating Artificial Intelligence (AI) and DevOps in cloud-native environments is transforming global software engineering, enabling enterprises to build, deploy, and manage resilient, scalable applications. AI-powered automation enhances DevOps workflows by optimizing CI/CD pipelines, enabling predictive monitoring, and strengthening proactive security, which reduces downtime and operational costs.
This session will explore AI-driven approaches to cloud application modernization, including intelligent deployment automation, self-healing infrastructures, and AI-assisted cybersecurity strategies. Attendees will gain insights into practical implementations across AWS, Azure, and GCP, along with emerging trends that will define the future of AI in cloud DevOps.

Key Takeaways:
• The role of AI in accelerating cloud DevOps and infrastructure automation.
• AI-driven deployment, monitoring, and security enhancements for cloud-native applications.
• Real-world case studies showcasing AI-powered DevOps innovations in global organizations.

AI-Driven Cloud App Modernization: From Legacy to Autonomous Systems

As enterprises race to modernize their cloud applications, AI is emerging as a game-changer—not just in automating development but in rethinking how applications evolve. This session explores how AI is transforming cloud app modernization, from intelligent refactoring and automated migrations to self-optimizing architectures.

We'll dive into real-world use cases, showcasing AI-assisted DevOps, predictive scaling, and autonomous debugging. Whether you're migrating legacy systems or building AI-native applications, this talk will equip you with strategies and tools to future-proof your software in the age of AI-driven development.
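
As one hypothetical example of predictive scaling, the sketch below fits a linear trend to recent request rates and sizes replicas ahead of the projected load. The per-replica capacity, forecast horizon, and synthetic traffic are assumptions for illustration only.

```python
# Toy sketch of predictive scaling: extrapolate the recent request rate with a
# linear fit and size replicas ahead of the load instead of reacting to it.
# The per-replica capacity (200 req/s) and 10-minute horizon are assumptions.
import numpy as np

def predict_replicas(minutes: np.ndarray, req_per_s: np.ndarray,
                     horizon_min: float = 10.0, capacity_per_replica: float = 200.0,
                     min_replicas: int = 2, max_replicas: int = 50) -> int:
    slope, intercept = np.polyfit(minutes, req_per_s, deg=1)
    future_load = slope * (minutes[-1] + horizon_min) + intercept
    needed = int(np.ceil(future_load / capacity_per_replica))
    return max(min_replicas, min(max_replicas, needed))

# Last 30 minutes of observed traffic, ramping up toward a peak
minutes = np.arange(30, dtype=float)
req_per_s = 400 + 35 * minutes + np.random.default_rng(0).normal(0, 20, 30)

print("replicas to provision:", predict_replicas(minutes, req_per_s))
```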

AI-Augmented DevOps: Building Secure and Scalable Cloud-Native Pipelines with GitHub Actions and Kubernetes

Modern development teams are under constant pressure to ship faster, without compromising on security or compliance. In this session, we’ll explore how to supercharge your DevOps pipelines with AI-powered tooling, GitHub Actions, and Kubernetes to build resilient, secure, and scalable cloud-native applications. We’ll walk through a real-world DevSecOps implementation that integrates container security scanning, policy-as-code, anomaly detection with ML models, and automated remediation workflows. You’ll also see how to incorporate explainable AI into observability and incident response, improving trust and insight across engineering and security teams. Whether you’re modernizing monoliths or building greenfield microservices, this session will equip you with practical techniques to operationalize security and scalability from day one.
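
By way of illustration of the policy-as-code idea, the sketch below shows a small Python check that a GitHub Actions step could run to reject Kubernetes manifests requesting privileged or root containers. In practice this role is usually played by tools such as OPA/Gatekeeper, Conftest, or Kyverno; the directory layout and rules here are assumptions, not the session's actual implementation.

```python
# Minimal policy-as-code sketch: fail a CI step if any Kubernetes manifest in
# ./manifests requests a privileged container or does not enforce non-root.
# The directory layout and rules are illustrative assumptions.
import pathlib
import sys

import yaml  # PyYAML

def container_violations(doc: dict) -> list[str]:
    problems = []
    spec = doc.get("spec", {})
    pod_spec = spec.get("template", {}).get("spec", spec)  # Deployment or bare Pod
    for c in pod_spec.get("containers", []):
        sc = c.get("securityContext") or {}
        if sc.get("privileged"):
            problems.append(f"{c.get('name')}: privileged container")
        if sc.get("runAsNonRoot") is not True:
            problems.append(f"{c.get('name')}: runAsNonRoot not enforced")
    return problems

def main() -> int:
    failures = []
    for path in pathlib.Path("manifests").glob("**/*.yaml"):
        for doc in yaml.safe_load_all(path.read_text()):
            if isinstance(doc, dict) and doc.get("kind") in {"Pod", "Deployment"}:
                failures += [f"{path}: {p}" for p in container_violations(doc)]
    for failure in failures:
        print("POLICY VIOLATION:", failure)
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```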

Target Audience:

- DevOps Engineers
- Cloud Architects
- Full-Stack Developers
- Security Engineers
- Engineering Leaders and Tech Mentors

Level: Intermediate to Advanced

AI Software Stacks in the Cloud-Native Era: Architecting Scalable, Secure, and Intelligent Systems

Abstract:
As artificial intelligence rapidly reshapes software delivery, understanding the modern AI software stack is critical for engineers, architects, and DevOps professionals. This talk dives deep into the AI-powered software stack, covering how to design, build, and deploy scalable AI/ML workloads using cloud-native best practices.

Through real-world examples and a live demo, the session will explore how cutting-edge tools across the MLOps and DevSecOps lifecycle come together to deliver resilient, explainable, and automated AI systems at scale.

Key Topics Covered:

- Overview of the modern AI software stack: from data ingestion to model serving
- Integrating AI pipelines with Kubernetes, serverless platforms, and GitOps
- Leveraging open-source tools like MLflow, Kubeflow, Ray, LangChain, and Hugging Face
- Security and compliance in AI workloads (e.g., model explainability, SHAP, drift detection; see the sketch after this list)
- Observability, reproducibility, and automation across the AI lifecycle
- Live demo of an AI-powered cloud-native assistant ("CloudyBot")
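
As a small illustration of the drift-detection topic above, the sketch below compares a single feature's training distribution against live traffic with a two-sample Kolmogorov-Smirnov test. The synthetic data and p-value threshold are assumptions; purpose-built libraries (e.g., Evidently, Alibi Detect) handle multivariate drift.

```python
# Small sketch of input-drift detection for a deployed model: compare the live
# distribution of one feature against the training distribution with a
# two-sample Kolmogorov-Smirnov test. Data and the 0.05 threshold are assumptions.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)

train_amounts = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)  # feature at training time
live_amounts = rng.lognormal(mean=3.4, sigma=0.5, size=2_000)    # feature in production

stat, p_value = ks_2samp(train_amounts, live_amounts)
if p_value < 0.05:
    print(f"Drift suspected (KS={stat:.3f}, p={p_value:.2e}): consider alerting or retraining")
else:
    print("No significant drift detected")
```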

Audience Takeaways:

- A blueprint for building production-grade AI systems using cloud-native tools
- Actionable practices for securing and monitoring ML workflows in real time
- Insights into how organizations like PayPal are adopting AI/ML for scalable operations
- How to integrate generative AI tools into enterprise DevOps ecosystems

Target Audience:
Software engineers, MLOps/DevOps practitioners, cloud architects, technical leaders, researchers, and AI/ML professionals

ML on K8s: Running AI Workloads with KServe and Kubeflow Lite

Machine learning is increasingly moving from notebooks to production—and Kubernetes is where the action is. However, deploying models at scale with observability, versioning, and autoscaling can get complex fast.

In this session, we’ll explore how KServe, a CNCF incubating project, simplifies the process of serving ML models on Kubernetes. Using a minimal Kubeflow-lite setup, we’ll walk through a live deployment of an ML model (scikit-learn or Hugging Face) and demonstrate production-grade features like autoscaling, traffic splitting, and real-time monitoring.

This talk is aimed at developers, ML engineers, and platform teams looking to operationalize AI workloads without reinventing infrastructure.

What We’ll Cover:
- What is KServe? Where does it fit in the ML + K8s stack?
- How to deploy a lightweight ML model using YAML or CLI (see the sketch after this list)
- Autoscaling with Knative integration
- Multi-version model rollout and routing
- Metrics, logs, and basic auth options
- Live traffic simulation to trigger scale-up
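
As a hedged sketch of the "deploy using YAML or CLI" step, the snippet below submits a KServe InferenceService manifest for a scikit-learn model through the official Kubernetes Python client, which is equivalent to applying the same YAML with kubectl. The namespace and storageUri are placeholders to adjust for your cluster and model store.

```python
# Sketch: create a KServe InferenceService for a scikit-learn model by submitting
# the same manifest you would normally `kubectl apply`, via the kubernetes client.
# The namespace and storageUri are placeholders.
from kubernetes import client, config

inference_service = {
    "apiVersion": "serving.kserve.io/v1beta1",
    "kind": "InferenceService",
    "metadata": {"name": "sklearn-iris", "namespace": "kserve-demo"},
    "spec": {
        "predictor": {
            "model": {
                "modelFormat": {"name": "sklearn"},
                "storageUri": "gs://kfserving-examples/models/sklearn/1.0/model",
            }
        }
    },
}

config.load_kube_config()  # or load_incluster_config() when running inside the cluster
client.CustomObjectsApi().create_namespaced_custom_object(
    group="serving.kserve.io",
    version="v1beta1",
    namespace="kserve-demo",
    plural="inferenceservices",
    body=inference_service,
)
print("Submitted; watch it with: kubectl get inferenceservices -n kserve-demo")
```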

Key Takeaways:
1. Understand how KServe simplifies ML inference in Kubernetes
2. Learn how to deploy, scale, and monitor ML endpoints using CNCF tools
3. Gain insight into real-world production patterns for ML model serving
4. Leave with a GitHub repo you can fork to deploy your own models
5. Discover how to bring AI to your K8s cluster in a mini session

Infrastructure as Code (IaC) Workshop

The idea behind infrastructure as code (IaC) is that you write and execute code to define, deploy, and update your infrastructure. This represents an important shift in mindset where you treat all aspects of operations as software—even those aspects that represent hardware (e.g., setting up physical servers).
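
The workshop description above is tool-agnostic; as one concrete illustration of infrastructure defined and deployed as code, here is a minimal Pulumi program in Python that declares a single AWS EC2 instance (Terraform, OpenTofu, or CloudFormation express the same idea). The AMI ID and instance size are placeholders, and the AWS region comes from your Pulumi/AWS configuration.

```python
# __main__.py - minimal Pulumi sketch of infrastructure as code in Python:
# the desired infrastructure (one EC2 instance) is declared as a program, and
# `pulumi up` reconciles the cloud account to match it.
# The AMI ID and instance type are placeholders.
import pulumi
import pulumi_aws as aws

web_server = aws.ec2.Instance(
    "iac-workshop-web",
    ami="ami-0123456789abcdef0",   # placeholder; look up a current AMI for your region
    instance_type="t3.micro",
    tags={"Name": "iac-workshop-web", "ManagedBy": "pulumi"},
)

# Exported outputs are printed after `pulumi up`, e.g. the instance's public IP.
pulumi.export("public_ip", web_server.public_ip)
```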

OWASP LASCON 2025 Sessionize Event

October 2025 Austin, Texas, United States

The Commit Your Code Conference 2025! Sessionize Event

September 2025 Dallas, Texas, United States

wedoAI 2025 Sessionize Event

August 2025

Austin Kubernetes Meetup User group Sessionize Event

September 2024 Austin, Texas, United States

DevOpsDays Austin 2020 Sessionize Event

May 2020 Austin, Texas, United States

DeveloperWeek Austin 2019 Sessionize Event

November 2019 Austin, Texas, United States
