
Model inference in Flink SQL using a custom HTTP connector

Model inference in Flink SQL can be done in several ways. For example, the SQL syntax can be extended by implementing a predict function as a UDF, somewhat similar to what Google has done with BigQuery ML.
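As context (and not the approach this talk advocates), a synchronous predict UDF in Flink could look roughly like the sketch below; the scoring logic is a hard-coded placeholder standing in for a real model:

```java
import org.apache.flink.table.functions.ScalarFunction;

// Minimal sketch of a synchronous "predict" UDF. The hard-coded logistic
// scorer is a placeholder; a real UDF would load a trained model in open().
public class PredictFunction extends ScalarFunction {

    private static final double W1 = 0.4, W2 = 0.6, BIAS = -1.0;

    // eval() is invoked synchronously for every row and blocks the task
    // thread, which is the throughput limitation discussed below.
    public Double eval(Double f1, Double f2) {
        if (f1 == null || f2 == null) {
            return null;
        }
        return 1.0 / (1.0 + Math.exp(-(W1 * f1 + W2 * f2 + BIAS)));
    }
}
```

Registered with tableEnv.createTemporarySystemFunction("PREDICT", PredictFunction.class), such a function can then be called as PREDICT(f1, f2) in any query.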

However, UDFs are synchronous, so for use cases with high throughput requirements the preference is an asynchronous solution, such as calling an endpoint that serves the model.

In this talk we present our solution, which abstracts the endpoint that serves the model as a table by implementing a custom HTTP connector. This enables our users to do model inference by simply writing a SQL join.
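A minimal sketch of what that could look like, assuming a hypothetical connector name ('http') and endpoint URL, and table and column names that are not taken from the talk:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ModelInferenceSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
            TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Input stream of events (datagen used here purely for illustration).
        tEnv.executeSql(String.join("\n",
            "CREATE TABLE transactions (",
            "  account_id STRING,",
            "  amount DOUBLE,",
            "  event_time TIMESTAMP(3),",
            "  proc_time AS PROCTIME(),",
            "  WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND",
            ") WITH (",
            "  'connector' = 'datagen'",
            ")"));

        // The model-serving endpoint exposed as a lookup table through the
        // custom HTTP connector ('http' and 'url' are hypothetical options).
        tEnv.executeSql(String.join("\n",
            "CREATE TABLE model_scores (",
            "  account_id STRING,",
            "  score DOUBLE",
            ") WITH (",
            "  'connector' = 'http',",
            "  'url' = 'https://models.example.internal/score'",
            ")"));

        // Model inference becomes a lookup join; turning each probe row into
        // a request to the endpoint is the connector's concern. The join key
        // is simplified to an id here; the actual feature set is obtained as
        // described next.
        tEnv.executeSql(String.join("\n",
            "SELECT t.account_id, t.amount, m.score",
            "FROM transactions AS t",
            "JOIN model_scores FOR SYSTEM_TIME AS OF t.proc_time AS m",
            "ON t.account_id = m.account_id")).print();
    }
}
```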

To feed the model with a feature set, we perform an additional temporal join that retrieves the feature set from a retract stream backed by a compacted topic, which also enables our users to update features in real time.
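Continuing the sketch above in the same TableEnvironment, the feature set could be modelled as an upsert-kafka table over the compacted topic and attached to each event with an event-time temporal join; the topic, broker, and column names are again illustrative only:

```java
// Feature table backed by a compacted Kafka topic. upsert-kafka reads it as
// a changelog (upsert/retract) stream; the primary key plus the watermark
// make it usable as a versioned table in a temporal join.
tEnv.executeSql(String.join("\n",
    "CREATE TABLE features (",
    "  account_id STRING,",
    "  f1 DOUBLE,",
    "  f2 DOUBLE,",
    "  updated_at TIMESTAMP(3),",
    "  WATERMARK FOR updated_at AS updated_at - INTERVAL '5' SECOND,",
    "  PRIMARY KEY (account_id) NOT ENFORCED",
    ") WITH (",
    "  'connector' = 'upsert-kafka',",
    "  'topic' = 'features-compacted',",
    "  'properties.bootstrap.servers' = 'broker:9092',",
    "  'key.format' = 'json',",
    "  'value.format' = 'json'",
    ")"));

// Temporal join: each transaction picks up the feature values that were
// valid at its event time, so features updated in the compacted topic are
// reflected in real time. The enriched rows would then feed the model
// lookup join from the previous sketch.
tEnv.executeSql(String.join("\n",
    "CREATE TEMPORARY VIEW enriched_transactions AS",
    "SELECT t.account_id, t.amount, t.proc_time, f.f1, f.f2",
    "FROM transactions AS t",
    "JOIN features FOR SYSTEM_TIME AS OF t.event_time AS f",
    "ON t.account_id = f.account_id"));
```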

Erik de Nooij

Engineering lead Streaming Data Analytics

Amsterdam, The Netherlands


