End to End Data Traceability Using Kafka Interceptors
Data is our most critical asset for delivering value to our customers, so we invest heavily in its quality and traceability. In a big-data context, tracing is challenging on several fronts: data volume, the number of transformations, cost, and the maturity of the available tools.
In this session, I will discuss the core concepts of Kafka interceptors and how we leveraged them to build an enterprise-wide library that traces the billions of events processed by our data pipelines. This reusable library is architected for use across all our microservices, tracing data as it flows from various sources into Kafka topics, is processed as streams, feeds the computation of various metrics, and eventually lands in persistent storage. The tracing data itself is persisted to Elastic, where it is interpreted through visual dashboards.
In particular, the solution enables us to trace the complete lifecycle of the data.
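To illustrate the idea, the sketch below models the producer-side hook that such a tracing library would plug into. The class and header names here are simplified, hypothetical stand-ins for Kafka's real `org.apache.kafka.clients.producer.ProducerInterceptor` API (whose `onSend` method is invoked before each record is serialized and sent); the actual Kafka types carry partitions, typed headers, and more.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Simplified stand-in for Kafka's ProducerRecord (hypothetical shape).
class ProducerRecord {
    final String topic;
    final String value;
    final Map<String, String> headers = new HashMap<>();
    ProducerRecord(String topic, String value) {
        this.topic = topic;
        this.value = value;
    }
}

// Mirrors the role of Kafka's ProducerInterceptor.onSend hook:
// called before a record is sent, and may enrich or mutate it.
interface ProducerInterceptor {
    ProducerRecord onSend(ProducerRecord record);
}

// A tracing interceptor stamps each outgoing record with a trace id and
// a timestamp, so downstream consumers and dashboards can correlate the
// same event end to end across topics and services.
class TracingInterceptor implements ProducerInterceptor {
    @Override
    public ProducerRecord onSend(ProducerRecord record) {
        record.headers.putIfAbsent("trace-id", UUID.randomUUID().toString());
        record.headers.put("produced-at",
                Long.toString(System.currentTimeMillis()));
        return record;
    }
}

public class TraceDemo {
    public static void main(String[] args) {
        ProducerRecord rec = new ProducerRecord("events", "{\"id\":42}");
        new TracingInterceptor().onSend(rec);
        System.out.println(rec.headers.containsKey("trace-id"));
        System.out.println(rec.headers.containsKey("produced-at"));
    }
}
```

Because the interceptor is configured on the client rather than written into each service's business logic, the same tracing behavior can be reused across every microservice that produces to Kafka.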

Raghavi Janaswamy
Sr. Principal Engineer at Optum, Research Scholar where technology meets fine arts