That helps a lot.
Thanks.
Zhanfeng Huo
From: Davies Liu
Date: 2014-08-18 14:31
To: ryaminal
CC: u...@spark.incubator.apache.org
Subject: Re: application as a service
Another option is using Tachyon to cache the RDD; the cache can then
be shared by different applications. See how to use
context:
> http://apache-spark-user-list.1001560.n3.nabble.com/application-as-a-service-tp12253p12267.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
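The point of the Tachyon suggestion is that the cached data lives in an off-heap store outside any single application, so a second application can read it without recomputing. As a rough illustration of that sharing pattern in plain Python (a temp file stands in for the Tachyon store; all names here are invented for the example, not Spark or Tachyon APIs):

```python
import json
import os
import tempfile

# Illustrative sketch only: a temp file plays the role of the shared
# off-heap store. In a real deployment, application A would persist the
# RDD into Tachyon and application B would read the cached blocks back.
shared_store = os.path.join(tempfile.mkdtemp(), "daily_cache.json")

def app_a_build_and_cache():
    # Application A: compute the daily dataset once and cache it.
    data = {"total_orders": 1200, "date": "2014-08-18"}
    with open(shared_store, "w") as f:
        json.dump(data, f)
    return data

def app_b_query(key):
    # Application B: read the shared cache instead of recomputing.
    with open(shared_store) as f:
        cached = json.load(f)
    return cached[key]

app_a_build_and_cache()
print(app_b_query("total_orders"))  # the second app sees the cached value
```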
Thank you Eugen Cepoi, I will try it now.
Zhanfeng Huo
From: Eugen Cepoi
Date: 2014-08-17 23:34
To: Zhanfeng Huo
CC: user
Subject: Re: application as a service
Hi,
You can achieve this by running, for example, a spray service that has access
to the RDD in question. When starting the app, you first build your RDD and
cache it. Your spray "endpoints" then translate the HTTP requests into
operations on that RDD.
2014-08-17 17:27 GMT+02:00 Zhanfeng Huo :
Hi, All:
I need to use Spark to load business data daily and cache it as an RDD or
Spark SQL RDD, so that other users can query it in memory. In summary, the app
must run as a daemon service that lasts at least one day, and users' apps can access