Hi Nick,
Thanks for the answer. Do you think an implementation like the one in this
article would be infeasible in production for, say, hundreds of queries per
minute?
https://www.codementor.io/spark/tutorial/building-a-web-service-with-apache-spark-flask-example-app-part2.
The article uses Flask to define the web service endpoints.
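For reference, a rough sketch of that pattern, assuming Spark 2.x with a saved
spark.ml PipelineModel and one long-lived SparkSession (the model path, route,
and column names are made up):

# Hypothetical sketch: one long-lived SparkSession plus a saved spark.ml
# PipelineModel behind a Flask route. Path, route, and columns are made up.
from flask import Flask, request, jsonify
from pyspark.sql import SparkSession
from pyspark.ml import PipelineModel

app = Flask(__name__)
spark = SparkSession.builder.appName("model-service").getOrCreate()
model = PipelineModel.load("/models/my_model")  # hypothetical saved model path

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON list of feature rows, e.g. [{"x1": 1.0, "x2": 2.0}]
    rows = request.get_json()
    df = spark.createDataFrame(rows)
    preds = model.transform(df).select("prediction").collect()
    return jsonify([float(r.prediction) for r in preds])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

Hundreds of queries per minute is only a few requests per second, so the
bigger concern is usually the latency of each transform() round trip through
the JVM rather than raw throughput.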
Hi all,
How do you reliably deploy a Spark model in production? Let's say I've done
a lot of analysis and come up with a model that performs great. I have this
"model file" and I'm not sure what to do with it. I want to build some kind
of service around it that takes some inputs, converts them into the format
the model expects, and returns predictions.
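For concreteness, a minimal sketch of the "model file" round trip with
spark.ml, assuming Spark 2.x pipeline persistence (column names, values, and
paths are made up):

# Hypothetical sketch: fit a spark.ml pipeline, persist it, then load it back
# in a separate serving process. Columns, values, and paths are made up.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline, PipelineModel
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("persist-model").getOrCreate()
training = spark.createDataFrame(
    [(1.0, 2.0, 3.5), (2.0, 0.5, 2.1), (3.0, 1.0, 4.2)], ["x1", "x2", "label"])

assembler = VectorAssembler(inputCols=["x1", "x2"], outputCol="features")
lr = LinearRegression(featuresCol="features", labelCol="label", regParam=0.01)
model = Pipeline(stages=[assembler, lr]).fit(training)

# The "model file" Spark writes is really a directory of metadata + data files.
model.write().overwrite().save("/tmp/my_model")

# Later, in the serving process: load it back and score new inputs.
same_model = PipelineModel.load("/tmp/my_model")
new_inputs = spark.createDataFrame([(1.5, 1.5)], ["x1", "x2"])
same_model.transform(new_inputs).select("prediction").show()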
John wrote:
> Thanks Saurabh!
>
> That explode function looks like it is exactly what I need.
>
> We will be using MLlib quite a lot - Do I have to worry about python
> versions for that?
>
> John
>
> On Wed, Jun 22, 2016 at 4:34 PM, Saurabh Sardeshpande <
> saurabh..
Hi John,
If you can do it in Hive, you should be able to do it in Spark. Just make
sure you import HiveContext instead of SQLContext.
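Something like this, assuming Spark 1.x (the table name is made up):

# Minimal sketch, Spark 1.x style; the table name is hypothetical.
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-query")
sqlContext = HiveContext(sc)   # HiveContext instead of SQLContext
df = sqlContext.sql("SELECT * FROM my_hive_table")
df.show()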
If your intent is to explore rather than get stuff done, I'm not aware of
any RDD operations that do this for you, but there is a DataFrame operation
called 'explode' that does this.
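A quick sketch of what explode does on a DataFrame, again assuming Spark 1.x
(column names and values are made up):

# Small sketch of explode; column names and values are hypothetical.
from pyspark import SparkContext
from pyspark.sql import HiveContext
from pyspark.sql.functions import explode

sc = SparkContext(appName="explode-demo")
sqlContext = HiveContext(sc)
df = sqlContext.createDataFrame([("a", [1, 2, 3]), ("b", [4])], ["key", "values"])

# Each element of the array column becomes its own row.
df.select("key", explode(df["values"]).alias("value")).show()
# key | value
#  a  |   1
#  a  |   2
#  a  |   3
#  b  |   4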