Session

Changing landscapes in data integration - Kafka Connect for near real-time data moving pipelines.

In 2019 we presented “Secure Kafka at scale in true multi-tenant environment” at Kafka Summit San Francisco. Back then, Kafka was mainly used for event-driven architectures, high-throughput pub/sub use cases, and as a data plane for log aggregation and for transporting metadata and metrics. A lot has changed since then: our Kafka platform has grown to handle 400 billion incoming events per day in production alone, and we have introduced a stretch-cluster pattern alongside the Active-Active cluster replication pattern. Moreover, new use cases have emerged for Kafka in near-real-time stream processing and in data-movement pipelines across cloud environments. Moving data across systems in near real time is a hard problem to solve. Kafka Connect is a framework for reliably streaming data into and out of Kafka, and it can be used to build near-real-time data-movement pipelines. In this talk, we will present how Kafka adoption has evolved in our space over the last couple of years and take a deep dive into how we approached providing Managed Kafka Connect, the newest addition to our service portfolio.
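For context on the pipeline pattern the abstract refers to: a Kafka Connect pipeline is typically set up by registering a connector configuration with a Connect worker over its REST API. Below is a minimal Java sketch, assuming a worker running at localhost:8083 and the stock FileStreamSource connector that ships with Kafka; the connector name, file path, and topic are illustrative only.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical Connect worker endpoint; adjust for your environment.
        String connectUrl = "http://localhost:8083/connectors";

        // Minimal config for the FileStreamSource connector bundled with Kafka,
        // tailing a local file into a topic as a simple near-real-time pipeline.
        String body = """
            {
              "name": "demo-file-source",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/source.txt",
                "topic": "demo-topic"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(connectUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // The worker responds with 201 Created once it accepts the connector
        // and schedules its tasks across the Connect cluster.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}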

Ashok Kadambala

Engineering & Architecture lead - Streaming & Integration, JP Morgan Chase

New York City, New York, United States
