Sorry for the confusion, I am able to reproduce this issue. I think this
is a bug in SparkRInterpreter.
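Until that is fixed, one possible stop-gap (just a sketch, not an official fix; work_dir is only an illustrative name) is to keep the path in an R variable, which should survive across paragraphs in the same session, and re-apply it at the top of each paragraph:
%anaconda3.r
# Stop-gap sketch: remember the path once, then call setwd() again at the
# top of every paragraph that needs it, since only the working directory
# appears to be reset between paragraphs.
work_dir <- "/home/mansop"
setwd(work_dir)
getwd()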
On Wed, Nov 13, 2019 at 2:48 PM Manuel Sopena Ballesteros wrote:
Sorry, I got confused with the terminology (I meant paragraph instead of note).
My interpreter is configured per user + isolated --> this means the same
interpreter process (JVM process) for the same user.
First paragraph:
%anaconda3.r
setwd("/home/mansop")
getwd()
output:
[1] "/home/mansop"
Second paragraph:
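A minimal sketch of that second check, assuming it only calls getwd() again:
%anaconda3.r
# If the working directory carried over from the first paragraph, this would
# still print "/home/mansop"; with the reset it prints a different directory.
getwd()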
Ok, what should I do to be able to reuse variables across different notes?
Manuel
From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Wednesday, November 13, 2019 4:57 PM
To: users
Subject: Re: spark r interpreter resets working directory
In that case, each user uses a different interpreter.
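If I understand the per user + isolated binding correctly, notes owned by the same user share one interpreter session, so an ordinary R object assigned in one note should already be visible from another note of that user. A minimal base-R sketch (shared_path is just an illustrative name):
%anaconda3.r
# Note A (same user): assign a plain R object in the shared session.
shared_path <- "/home/mansop"
%anaconda3.r
# Note B (same user, same interpreter process): the object should still be
# there, even though the working directory itself gets reset.
print(shared_path)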
YARN cluster using impersonation (per user + isolated).
I guess that means each note uses a different interpreter?
Manuel
From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Wednesday, November 13, 2019 2:35 PM
To: users
Subject: Re: spark r interpreter resets working directory
Do your different notes share the same interpreter? I suspect you are
using per-note isolated or scoped mode.
It looks like you are in local or yarn-client mode for the first note, but in
yarn-cluster mode for the second note.
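One way to check that from inside the notes themselves, mostly with base R (the sparkR.conf() call assumes SparkR is attached with an active session, so it is wrapped in tryCatch):
%anaconda3.r
# Run this in each note and compare: the same PID and hostname means the
# notes share one interpreter process; a different hostname would suggest
# something like yarn-cluster mode for that note.
Sys.getpid()
Sys.info()[["nodename"]]
getwd()
# Optional: report the Spark master if a SparkR session is active.
tryCatch(SparkR::sparkR.conf("spark.master"), error = function(e) NA)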
On Wed, Nov 13, 2019 at 11:31 AM Manuel Sopena Ballesteros wrote:
> Dear Zeppelin community,