Dear Cederic,
I did something similar to this a while ago as part of this work [1], but
I've always worked within the batch context. I'm also a co-author of
flink-jpmml and, since a Flink-to-PMML model-saver library doesn't
currently exist, I'd suggest a twofold strategy to tackle this problem:
One option (which I haven't tried myself) would be to somehow get the model
into PMML format, and then use https://github.com/FlinkML/flink-jpmml to
score the model. You could either use another machine learning framework to
train the model (i.e., a framework that directly supports PMML export), or
Hi Cederic,
If the model is a simple function, you can just load it and make predictions
using a map/flatMap function on the DataStream (via the StreamExecutionEnvironment).
But I'm afraid a model trained by FlinkML is meant to run as a "batch job":
its predict method takes a DataSet as a parameter and outputs another DataSet.
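To illustrate the map/flatMap suggestion above, here is a minimal sketch of the scoring logic you would wrap in a Flink map or flatMap function, assuming a linear SVM whose weight vector and intercept you exported from the batch job (the class and field names here are illustrative, not part of any Flink API):

```java
// Minimal sketch: the per-element scoring you would put inside a Flink
// map/flatMap function. Assumes a linear SVM with an exported weight
// vector and intercept (names are illustrative assumptions).
public class SvmScorer {
    private final double[] weights;
    private final double bias;

    public SvmScorer(double[] weights, double bias) {
        this.weights = weights;
        this.bias = bias;
    }

    /** Returns +1 or -1 depending on which side of the hyperplane x falls. */
    public int predict(double[] x) {
        double score = bias;
        for (int i = 0; i < weights.length; i++) {
            score += weights[i] * x[i];
        }
        return score >= 0 ? 1 : -1;
    }

    public static void main(String[] args) {
        SvmScorer scorer = new SvmScorer(new double[]{1.0, -2.0}, 0.5);
        System.out.println(scorer.predict(new double[]{3.0, 1.0})); // score 1.5 -> +1
        System.out.println(scorer.predict(new double[]{0.0, 1.0})); // score -1.5 -> -1
    }
}
```

In an actual streaming job you would construct the scorer once (e.g. in a RichMapFunction's open() method) and call predict for each incoming element.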
Hi Cederic,
I am not familiar with SVM or machine learning but I think we can work it
out together.
What problems have you run into while trying to implement this? From my
point of view, we can rebuild the model in the flatMap function and use it
to predict the input data. There are some flatMap
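As a sketch of the "rebuild the model in the flatMap function" idea, the following assumes the batch job wrote the model out as a single comma-separated line of weights with the bias last (this file format is an assumption for illustration, not FlinkML's actual export format):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: rebuild a linear model from a line the batch job wrote out
// (assumed format: comma-separated weights, bias last), then score each
// incoming element. In a real Flink job the parsing would go in
// RichFlatMapFunction.open() so it runs once per task, and scoring in
// flatMap() for each element.
public class RebuildAndScore {
    static double[] parseModel(String line) {
        String[] parts = line.split(",");
        double[] model = new double[parts.length];
        for (int i = 0; i < parts.length; i++) {
            model[i] = Double.parseDouble(parts[i].trim());
        }
        return model;
    }

    /** Scores each point; the last entry of model is the bias term. */
    static List<Integer> scoreAll(double[] model, double[][] points) {
        List<Integer> labels = new ArrayList<>();
        for (double[] x : points) {
            double score = model[model.length - 1];
            for (int i = 0; i < x.length; i++) {
                score += model[i] * x[i];
            }
            labels.add(score >= 0 ? 1 : -1);
        }
        return labels;
    }

    public static void main(String[] args) {
        double[] model = parseModel("1.0, -2.0, 0.5");
        System.out.println(scoreAll(model,
                new double[][]{{3.0, 1.0}, {0.0, 1.0}})); // -> [1, -1]
    }
}
```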
Dear
My name is Cederic Bosmans and I am a master's student at Ghent
University (Belgium).
I am currently working on my masters dissertation which involves Apache
Flink.
I want to make predictions in the streaming environment based on a model
trained in the batch environment.
I trained my SVM