Use AI to accelerate the app release process

Modern software delivery is often throttled by manual release bottlenecks. This session demonstrates how to integrate Large Language Models (LLMs) to transform the release lifecycle—minimizing human error and shifting focus from administration to innovation. We will provide a practical roadmap for building an AI-powered pipeline that automates documentation and preempts deployment failures, significantly accelerating your time-to-market.

Key Takeaways:

Automated Release Notes: Using LLMs to generate user-ready documentation instantly.

Intelligent CI/CD: Leveraging AI log analysis to proactively identify and fix pipeline issues.

Implementation Strategy: Best practices for ensuring AI accuracy, security, and governance.
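As a rough illustration of the first takeaway, the sketch below assembles commit messages into a prompt for release-note generation. The prompt wording and the `call_llm` stub are hypothetical placeholders, not part of the session material; swap in whichever model client you actually use.

```python
# Minimal sketch: turn raw commit messages into a release-notes prompt for an LLM.
# `call_llm` is a hypothetical stand-in for your model API (OpenAI, Gemini,
# a self-hosted model, etc.) -- it is not implemented here.

def build_release_notes_prompt(commits: list[str], version: str) -> str:
    """Assemble a prompt asking the model for user-facing release notes."""
    commit_log = "\n".join(f"- {c}" for c in commits)
    return (
        f"You are a release manager. Write concise, user-facing release notes "
        f"for version {version} from these commit messages. Group changes into "
        f"Features, Fixes, and Other; omit internal refactors.\n\n{commit_log}"
    )


def call_llm(prompt: str) -> str:
    """Hypothetical model call -- replace with your provider's client."""
    raise NotImplementedError


if __name__ == "__main__":
    commits = [
        "feat: add dark mode toggle",
        "fix: crash on empty search query",
        "chore: bump CI image",
    ]
    print(build_release_notes_prompt(commits, "2.4.0"))
```

In practice the commit list would come from `git log` between two release tags, and the model's output would still be reviewed by a human before publishing.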

Harnessing the power of Large Language Models (LLMs) can significantly accelerate the release process. This presentation will demonstrate how to build a pipeline that automates key tasks such as generating release notes and analyzing CI/CD pipeline logs. The goal is to shift focus from tedious administrative work to rapid innovation.
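The log-analysis side can be sketched in a similar spirit: before sending a failed pipeline log to a model, it helps to trim it to the lines around failure markers so the prompt stays within token limits. The keywords, window size, and prompt text below are illustrative assumptions, not a prescribed implementation.

```python
import re


def extract_failure_context(log: str, window: int = 2) -> list[str]:
    """Keep only the lines near error markers to shrink the prompt."""
    lines = log.splitlines()
    keep: set[int] = set()
    for i, line in enumerate(lines):
        # Heuristic failure markers; tune for your CI system's log format.
        if re.search(r"\b(error|failed|exception)\b", line, re.IGNORECASE):
            keep.update(range(max(0, i - window), min(len(lines), i + window + 1)))
    return [lines[i] for i in sorted(keep)]


def build_triage_prompt(log: str) -> str:
    """Wrap the extracted context in a diagnosis request for the model."""
    context = "\n".join(extract_failure_context(log))
    return (
        "You are a CI/CD assistant. Diagnose the root cause of this failed "
        "pipeline run and suggest a fix:\n\n" + context
    )
```

Keeping a small window of surrounding lines (here, two on each side) preserves enough context for the model to see what step was running when the failure occurred, without paying for the full log.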

Teresa Wu

VP of Engineering at J.P. Morgan; Google Developer Expert (GDE) for Flutter/Dart

London, United Kingdom
