Rohit leads the Pivotal Labs App Modernization Practice, spanning engineering, delivery, training and cross-functional enablement, tooling, scoping, sales, recruiting, and marketing through blog posts, webinars, and conference sessions. He has led multiple enterprise engagements, including ones featured in the Wall Street Journal.
Rohit focuses on designing, implementing, and consulting on enterprise software solutions for Fortune 500 companies. He has deep expertise in digital transformation, streaming and microservices architectures, modern application design, and legacy-to-modern software transformation techniques and processes.
Rohit is an expert on migrating applications to the cloud and decomposing monoliths. He blogs actively on Cloud Foundry, Kubernetes, monolith decomposition, and app modernization strategies at cloud.rohitkelapure.com. His recent webinars cover topics as diverse as middleware migration, mainframe migration, and modernizing monolithic apps. Rohit has formulated GTM, marketing, and product strategies for the Pivotal App Modernization Practice.
Organizations are shifting away from a traditional siloed three-layer architecture and batch processing to a streaming-first approach. Event-streaming paradigms, as seen in Kafka-based architectures, are becoming critical to such migrations. In this session we will give a comprehensive overview of an innovative process developed for event streaming and event modeling. One application of the process was in the retail vertical, where reducing real-time inventory data drift helped retailers save money in the supply chain by cutting excess inventory and minimizing out-of-stock events. We will cover Event Storming, Event Modeling, data landscape analysis, schema organization, and tactical patterns for event data migration such as Event Shunting. We will also explain why the Event Stream Data Engineering Modeling maturity model is critical to the success of any event-driven stream modernization.
The push model of event-streaming architecture lays bare an exercise in data modeling that is fundamentally different from traditional relational data modeling and entity mapping. And yet some of the lessons and practices of the past need to inform our approach to modernizing data system architectures to allow for resilient, maintainable, and scalable applications. In this session we will extrapolate a set of best practices from industry use cases and explain how Kafka Streams can be used to implement stateful event-streaming architectures on a cloud native platform.
We will dive deep into data modeling and architecture design to achieve data consistency and correctness while meeting the scale and resiliency needs of industry applications. The solution presented uses Spring Boot, Kafka Streams, and Cassandra. Together, the Spring Framework and Kafka provide a powerful combination of developer abstractions that enable builders to solve previously intractable problems in an elegant, succinct fashion with maximum return to the business. We will also cover the real-time challenges we have seen with Kafka Streams and KTables in archetypal solutions covering this problem domain. The takeaway for the technical audience is how best to identify problems addressed by streaming data platforms, and how to empower developers in their organization to solve them.
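To make the KTable idea concrete, here is a minimal, dependency-free Java sketch of the pattern the session builds on: folding a keyed event stream into latest-state-per-key, the way `stream.groupByKey().aggregate(...)` materializes a KTable in Kafka Streams. The `InventoryEvent` record, SKU names, and `materialize` method are illustrative assumptions, not the session's actual code or the Kafka Streams API.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch (not the Kafka Streams API): reduces a keyed event
// stream into a table of current state per key, analogous to materializing
// a KTable from a changelog of inventory events.
public class InventoryTableSketch {

    // Hypothetical inventory event: positive delta = restock, negative = sale.
    record InventoryEvent(String sku, int delta) {}

    // Fold the event stream into current stock per SKU, in arrival order.
    static Map<String, Integer> materialize(List<InventoryEvent> events) {
        Map<String, Integer> table = new LinkedHashMap<>();
        for (InventoryEvent e : events) {
            // merge() sums the delta into the existing count for this SKU.
            table.merge(e.sku(), e.delta(), Integer::sum);
        }
        return table;
    }

    public static void main(String[] args) {
        List<InventoryEvent> stream = List.of(
                new InventoryEvent("SKU-1", 10),  // restock
                new InventoryEvent("SKU-1", -3),  // sale
                new InventoryEvent("SKU-2", 5)    // restock
        );
        System.out.println(materialize(stream)); // {SKU-1=7, SKU-2=5}
    }
}
```

In a real deployment, the same fold runs continuously inside Kafka Streams with fault-tolerant state stores, which is what lets the inventory "table" stay consistent with the event stream at scale.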