Kushaagra Goyal
Staff Software Engineer at Rubrik, ex-CTO at Gan.AI, ex-Databricks
Palo Alto, California, United States
Kushaagra Goyal is an accomplished technology leader with deep expertise in engineering and AI infrastructure. He holds a Bachelor's degree from the Indian Institute of Technology, Delhi (2016), and a Master's degree from Stanford University.
Kushaagra began his career as a Software Engineer at Rubrik, contributing to projects recognized with US patents. He later joined Databricks, where he advanced compute infrastructure for managed Spark clusters, delivering scalable, high-performance solutions for big data applications.
Subsequently, as Chief Technology Officer at Gan.AI, a pioneering GenAI startup, Kushaagra spearheaded the development of foundational text-to-speech models for Indic languages. He architected the company's AI-driven backend infrastructure and led a team of 20 engineers, fostering a culture of innovation and technical excellence. His strategic leadership accelerated product development and positioned Gan.AI at the forefront of generative AI advancements.
Currently, he leads the development of storage and compute platforms at Rubrik, focusing on large-scale data and AI applications. His work is instrumental in building resilient, high-performance systems that drive enterprise innovation.
Kushaagra is passionate about computational infrastructure for AI and combines deep technical expertise with strategic insight. He actively shares knowledge on advancements in AI model hosting and is dedicated to exploring new frontiers in scalable, impactful AI technologies.
Patents:
1. https://patents.google.com/patent/US20210279108A1/
2. https://patents.google.com/patent/US11126508B2/en
Google Scholar:
https://scholar.google.com/citations?user=vQKDnM0AAAAJ&hl=en
LinkedIn:
https://www.linkedin.com/in/kushaagra-goyal/
Email:
kushaagra@alumni.stanford.edu / kushaagragoyal7@gmail.com
Topics
Compute for Your AI Model: GPUs, LPUs, TPUs, and Beyond
In the rapidly evolving landscape of computing, Graphics Processing Units (GPUs), Language Processing Units (LPUs), and Tensor Processing Units (TPUs) play pivotal roles in accelerating complex tasks, particularly in machine learning and artificial intelligence.
GPUs are renowned for their massively parallel processing capabilities, making them ideal for rendering graphics and for the dense matrix computations at the heart of deep learning. LPUs are purpose-built for language processing workloads, targeting fast, efficient inference for large language models. TPUs, developed by Google, are tailored specifically for the training and inference of machine learning models, offering significant performance advantages for large-scale AI applications.
As we explore these technologies, we'll also look at emerging processing units designed for specific AI use cases and the future of computational advancements.
Join me to dive into the intricacies of these processing units, their applications, and what lies ahead in the world of computing technology.
WeAreDevelopers World Congress 2026 - North America Sessionize Event Upcoming
DataWeek 2025 Sessionize Event
DeveloperWeek 2025 Sessionize Event