Got it! Thanks a lot.
On Fri, Sep 11, 2015 at 11:10 AM, Shivaram Venkataraman <shiva...@eecs.berkeley.edu> wrote:
It's possible -- in the sense that a lot of designs are possible. But
AFAIK there are no clean interfaces for getting all the arguments /
SparkConf options from spark-submit, and it's all the trickier to
handle scenarios where the first JVM has already created a
SparkContext that you want to use.
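
(For concreteness, a rough Scala sketch of what I mean -- the object name is made up, but SparkConf, SparkContext.getOrCreate and toDebugString are real Spark APIs. spark-submit hands its --conf flags to the JVM it launches as spark.* system properties, and only that JVM's SparkConf picks them up:)

    import org.apache.spark.{SparkConf, SparkContext}

    object SubmittedApp {
      def main(args: Array[String]): Unit = {
        // new SparkConf() (loadDefaults = true) reads the spark.* system
        // properties that spark-submit set before invoking main(), so all
        // of the --conf options show up here "for free".
        val conf = new SparkConf()
        // Reuses a SparkContext if one already exists in this JVM -- the
        // scenario that is hard to reach from a second, R-embedded JVM.
        val sc = SparkContext.getOrCreate(conf)
        println(conf.toDebugString)
        sc.stop()
      }
    }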
Forgot to reply all.
I see. But what prevents, e.g., the R driver from getting those
command-line arguments from spark-submit and setting them with
SparkConf on the R driver's in-process JVM through JNI?
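
(Roughly what I have in mind, sketched on the JVM side in Scala -- the object and method names here are hypothetical: the R driver would recover key=value pairs from the spark-submit command line and push them into a SparkConf inside its embedded JVM:)

    import org.apache.spark.{SparkConf, SparkContext}

    object InProcessIdea {
      // argPairs: whatever the R process recovered from the spark-submit
      // command line, e.g. Array("spark.master=yarn", "spark.ui.port=4041")
      def buildContext(argPairs: Array[String]): SparkContext = {
        val conf = new SparkConf()
        argPairs.map(_.split("=", 2))
          .collect { case Array(k, v) => (k, v) }
          .foreach { case (k, v) => conf.set(k, v) }
        new SparkContext(conf)
      }
    }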
On Thu, Sep 10, 2015 at 9:29 PM, Shivaram Venkataraman <shiva...@eecs.berkeley.edu> wrote:
The in-process JNI only works out when the R process comes up first
and we launch a JVM inside it. In many deploy modes like YARN (or
actually in anything using spark-submit) the JVM comes up first and we
launch R after that. Using an inter-process solution helps us cover
both use cases.
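
(As a toy Scala sketch of the inter-process shape -- this is not SparkR's actual RBackend, and handing the port over when the JVM launches R is just one option:)

    import java.net.ServerSocket
    import scala.io.Source

    object ToyBackend {
      def main(args: Array[String]): Unit = {
        val server = new ServerSocket(0)   // bind to any free port
        // The JVM can pass this port to the R process it launches, or an
        // already-running R process can be told the port some other way,
        // so it no longer matters which side started first.
        println(s"backend listening on port ${server.getLocalPort}")
        val client = server.accept()       // the R process connects here
        val request =
          Source.fromInputStream(client.getInputStream, "UTF-8").getLines().next()
        // A real backend would decode the request and call into the JVM;
        // this toy just acknowledges it.
        client.getOutputStream.write(s"got: $request\n".getBytes("UTF-8"))
        client.getOutputStream.flush()
        client.close()
        server.close()
      }
    }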
Thanks
Shivaram
Why did SparkR eventually choose an inter-process socket solution on
the driver side instead of the in-process JNI approach shown in one of
its docs below (around page 20)?
https://spark-summit.org/wp-content/uploads/2014/07/SparkR-Interactive-R-Programs-at-Scale-Shivaram-Vankataraman-Zongheng-Yang.pdf