Speaker

Gokulnath Chidambaram

Data Architect

Gokulnath Chidambaram is a seasoned Data Architect with deep experience developing scalable analytics frameworks for production-grade analytical solutions.
He is passionate about Confluent's Kafka offerings and strongly believes in Jay Kreps's assertion that Kafka can serve as the central nervous system for data processing and data-centric applications.
His experience at industrial companies such as GE and Schneider helped him understand the Industrial Internet world, how industrial assets are managed, and their operational challenges. Combining a strong background in software development with Confluent's offerings, he proposes the Asset Management Pattern, which any industrial company can leverage to manage its assets.

Industrial Asset Management with Kafka Streams

Intent

Manage your industrial assets and their life cycle in a simple, unified, cost-effective manner with Confluent Kafka.

Motivation

Today, asset management is critical for any business: it helps reduce unplanned downtime, increase availability, minimize maintenance costs, and reduce the risk of failure. It encompasses data capture, data validation, data transformation, integration with various analytics (prediction, real-time health checks, risk-analysis models), and visualization. These capabilities require not just real-time data from assets but also asset historical data, asset properties, environmental data, and so on. Since assets are deployed for different customers who operate in different environments, the analytics solutions differ for each customer, and each customer also has different reporting and dashboard needs. For complex assets such as engines, each part of the engine is itself considered an asset, and the asset hierarchy follows a whole-part relationship in which the parts determine the life of the whole.
Every feature in asset management requires a complex technology stack, and integrating these pieces seamlessly is challenging. Consider a scenario where real-time asset data is stored in NoSQL databases, metadata and properties are stored in relational or graph databases, ETL tools handle ingestion, streaming frameworks and microservices process the data, and different algorithms power the analytics solutions.

Structure

Confluent Kafka Streams helps reduce this technical debt by handling ingestion, integration, transformation, and computation in an 'all-in-one' way. Confluent provides connectors for integrating with various data sources; the Streams framework for data processing, microservices, and algorithms; KTables for relational data elements; KSQL for querying streams and KTables; and Kubernetes integration for running microservices, streams applications, and more.
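As an illustration of the querying side of the pattern, a KSQL sketch might declare a telemetry stream, an asset-metadata table, and an enriching join. All stream, table, topic, and column names below are hypothetical:

```sql
-- Hypothetical stream of raw telemetry arriving from assets
CREATE STREAM asset_telemetry (asset_id VARCHAR KEY, temp_c DOUBLE, ts BIGINT)
  WITH (KAFKA_TOPIC='asset-telemetry', VALUE_FORMAT='JSON');

-- Hypothetical table of asset properties, e.g. populated by a source connector
CREATE TABLE asset_metadata (asset_id VARCHAR PRIMARY KEY, site VARCHAR, model VARCHAR)
  WITH (KAFKA_TOPIC='asset-metadata', VALUE_FORMAT='JSON');

-- Enrich real-time readings with asset properties for downstream analytics
CREATE STREAM enriched_telemetry AS
  SELECT t.asset_id, m.site, m.model, t.temp_c
  FROM asset_telemetry t
  JOIN asset_metadata m ON t.asset_id = m.asset_id;
```

The enriched stream can then feed health checks, prediction models, or dashboards without a separate ETL layer.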

The proposed pattern brings all of these Kafka elements together and integrates them neatly. Microservices and analytics algorithms can be developed as Kafka Streams applications and deployed in a Kubernetes environment, leveraging auto-scaling.
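Kafka Streams applications are typically written in Java against a running cluster; as a dependency-free sketch of the core idea — a KStream-KTable join that enriches telemetry events with asset metadata — here is a minimal Python illustration. All asset names and fields are hypothetical:

```python
# Dependency-free sketch of the KStream-KTable join at the heart of the
# pattern. In a real deployment this logic would live in a Kafka Streams
# (Java) application; names and fields here are illustrative only.

def enrich(telemetry_events, asset_table):
    """Join each telemetry event with asset metadata by asset_id,
    mimicking KStream.join(KTable) with inner-join semantics."""
    for event in telemetry_events:
        meta = asset_table.get(event["asset_id"])
        if meta is None:
            continue  # inner join: drop events with no matching metadata
        yield {**event, **meta}

# "KTable": latest known metadata per asset (a compacted-changelog view)
assets = {
    "turbine-1": {"site": "plant-A", "model": "T900"},
    "turbine-2": {"site": "plant-B", "model": "T800"},
}

# "KStream": unbounded flow of sensor readings (here, a finite sample)
events = [
    {"asset_id": "turbine-1", "temp_c": 81.5},
    {"asset_id": "turbine-3", "temp_c": 77.0},  # unknown asset, dropped
    {"asset_id": "turbine-2", "temp_c": 65.2},
]

enriched = list(enrich(events, assets))
```

In the real pattern this join runs continuously inside a Streams topology, so the enriched records are immediately available to downstream analytics microservices.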
