Call for Speakers

AI by the Bay

Event starts: 17 Nov 2025
Event ends: 19 Nov 2025
Location: Scottish Rite Center, Oakland, California, United States
Website: ai.bythebay.io/


This November, AI by the Bay is coming to Oakland, bringing together experts in AI, machine learning, and data engineering. Meet us on November 17 for workshops* and November 18-19 for talks!


What to expect:

  • Distributed Systems – Scaling, deploying, and optimizing AI-powered distributed systems.
  • Data – Architecting high-throughput, multimodal AI data pipelines.
  • Thoughtful Software – Building maintainable, efficient, and scalable software for AI.
  • Open Source AI – The role of OSS in shaping the future of AI.

Buckle up for three days of unique talks from outstanding keynote speakers, valuable networking opportunities, exclusive perks from our amazing partners, and an unforgettable view from the Oakland Scottish Rite Center!


Ready to shine a light on your experience? The first wave of our CFP opens on February 14 and wraps up on March 14. Pitch us your idea and show us what makes your perspective stand out. We can’t wait to hear from you!


Get your tickets!


Stay tuned for updates:

Our newsletter

LinkedIn

Bluesky

YouTube

Twitter


*Workshops are not included in the main event and are paid separately.


Call for Speakers
Call opens: 14 Feb 2025 at 12:00 AM
Call closes: 14 Mar 2025 at 11:59 PM

The call closes in the Pacific Daylight Time (UTC-07:00) timezone.

We’re looking for practical, engineering-driven talks from practitioners building full-stack AI solutions. Each session is 30 minutes and should focus on real-world experience, technical insights, and lessons learned.

______________________________________________________________________________________________

Tracks:

Distributed Systems

Data

Thoughtful Software 

Open Source AI

______________________________________________________________________________________________

Submission Deadlines:

First wave: March 14

Second wave: May 1

Final wave: July

______________________________________________________________________________________________

Distributed Systems

AI-powered applications are pushing the limits of distributed systems. From APIs to multiplatform clients to LLM-driven apps, modern full-stack architectures demand asynchronous programming to query LLM oracles at scale. With multimodal AI applications processing vast amounts of text, images, video, and logs, tried-and-true systems like Kafka and Spark keep data flowing—but they aren’t enough.

As AI workloads evolve, so do the challenges of testing, deployment, and monitoring in cloud environments. This track explores the engineering realities of AI infrastructure, including:

- Scaling distributed inference and model execution – Managing high-throughput LLM workloads with async execution, caching, and parallelized inference strategies.

- Deploying and orchestrating AI applications in production – Best practices for running AI workloads across Kubernetes, serverless, and hybrid cloud environments while handling failures and resource constraints.

- Optimizing system performance and efficiency – Reducing latency in AI-driven applications through better task scheduling, event-driven architectures, and adaptive resource allocation.

- Building reliable and resilient AI systems – Ensuring uptime, fault tolerance, and efficient coordination in AI-powered services using event-driven patterns and distributed task queues.
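
To make the kind of engineering detail we mean concrete, here is a minimal, illustrative sketch of the async, cached, parallelized querying pattern mentioned above. It uses only the Python standard library, and query_llm is a hypothetical stand-in for a real inference endpoint; talks in this track would dig into how such patterns hold up at production scale.

import asyncio

# Hypothetical stand-in for a real LLM call; in production this would hit an
# inference endpoint over HTTP.
async def query_llm(prompt: str) -> str:
    await asyncio.sleep(0.1)            # simulate network + inference latency
    return f"answer to: {prompt}"

async def cached_query(prompt: str, cache: dict, limit: asyncio.Semaphore) -> str:
    if prompt in cache:                 # skip the round trip on repeat prompts
        return cache[prompt]
    async with limit:                   # cap concurrent in-flight requests
        result = await query_llm(prompt)
    cache[prompt] = result
    return result

async def main() -> None:
    cache: dict[str, str] = {}          # naive in-memory response cache
    limit = asyncio.Semaphore(8)
    prompts = ["summarize log A", "summarize log B", "summarize log A"]
    # Fan prompts out concurrently instead of awaiting them one at a time.
    answers = await asyncio.gather(*(cached_query(p, cache, limit) for p in prompts))
    print(answers)

asyncio.run(main())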

______________________________________________________________________________________________

Data

Data is the foundation of AI, and as AI models grow in complexity, so do the demands on data infrastructure. From real-time streaming to batch pipelines, from vector databases to data warehouses, the modern data stack must support high-throughput, low-latency, and multimodal AI workloads.

This track explores how engineers are building scalable, efficient, and reliable data systems for AI, covering topics such as:

- Real-time and batch processing for AI-driven applications.

- Vector databases and retrieval-augmented generation (RAG) at scale.

- Data lakes vs. feature stores vs. warehouses – Choosing the right approach.

- Data pipelines for LLMs – Preprocessing, transformation, and efficient storage.
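
As a toy illustration of the retrieval step behind RAG, here is a short, hedged sketch in plain Python. The embed function is a deliberately crude character-frequency stand-in for a real embedding model, and the in-memory list of strings stands in for a vector database; talks in this track would cover how to do this at scale.

import math

# Hypothetical toy embedding: character-frequency vector, unit-normalized so
# that a dot product equals cosine similarity. A real pipeline would call an
# embedding model and store vectors in a vector database.
def embed(text: str) -> list[float]:
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    # Rank documents by similarity to the query and keep the top k.
    scored = sorted(docs, key=lambda d: sum(a * b for a, b in zip(q, embed(d))), reverse=True)
    return scored[:k]

docs = [
    "Kafka handles the streaming ingest of raw events.",
    "The feature store serves precomputed embeddings at low latency.",
    "Nightly batch jobs rebuild the vector index from the data lake.",
]
context = "\n".join(retrieve("How are embeddings served?", docs))
print(f"Answer using only this context:\n{context}\n\nQuestion: How are embeddings served?")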

______________________________________________________________________________________________

Thoughtful Software

Building great software isn’t just about writing code—it’s about writing maintainable, scalable, and efficient code that can evolve alongside AI-driven systems. This track focuses on tools, techniques, and programming paradigms that help developers build better software for AI, GPU acceleration, and distributed systems.

Topics include:

- Testing and debugging at scale – Strategies for testing and debugging multi-threaded and distributed applications.

- Developer workflows and tools – IDEs, containerized dev setups, and automation that improve productivity.

- Programming paradigms for AI and high-performance computing – Best practices in Python, Rust, Julia, and other languages shaping modern AI applications.

- Code management and maintainability – Structuring repositories for large-scale AI projects, balancing performance with readability, and adopting best practices for API design and backward compatibility.

______________________________________________________________________________________________

Open Source AI

This track is a free and flexible space for discussions on open-source AI development, including:

- The evolving OSS AI ecosystem – What’s thriving, what’s missing, and how communities are driving innovation.

- Building and maintaining OSS AI projects – Governance models, sustainability, and balancing openness with commercial interests.

- AI tooling and infrastructure – Open-source libraries, frameworks, and deployment tools that are changing the game.

______________________________________________________________________________________________


Have questions? Reach out at 📧 info@bythebay.io

