Hi,
You can pass a variable between %spark and %pyspark this way:
%spark
z.put("var1", myVar)
%pyspark
myVar = z.get("var1")
The other direction (pyspark -> spark) works the same way.
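Inside Zeppelin, `z` is the ZeppelinContext object injected into each paragraph, and put/get are backed by a shared resource pool that all interpreters in the note can see. As a rough mental model only (this is a plain-Python sketch, not Zeppelin's actual implementation), the pattern looks like this:

```python
# Sketch of the shared-pool idea behind z.put / z.get.
# A plain dict stands in for Zeppelin's resource pool; the real
# ZeppelinContext is only available inside a running Zeppelin paragraph.
resource_pool = {}

def z_put(name, value):
    # What the %spark paragraph does with: z.put("var1", myVar)
    resource_pool[name] = value

def z_get(name):
    # What the %pyspark paragraph does with: myVar = z.get("var1")
    return resource_pool[name]

z_put("var1", 42)
my_var = z_get("var1")
```

Because the pool is shared per note, whatever one paragraph puts under a name is visible to later paragraphs run by other interpreters.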
For SparkSQL, one possible way to use a variable is:
%spark
z.show(sqlContext.sql(s"select .... ${myVar}"))
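The snippet above uses Scala's `s"..."` string interpolation to splice `myVar` into the query text before it is handed to `sqlContext.sql`. The same idea works from %pyspark with ordinary Python string formatting; a small sketch (table and column names here are made up for illustration):

```python
# Hypothetical names: "events" table and "event_date" column are
# placeholders, not anything defined in the thread above.
my_var = "2015-07-23"

# Build the SQL text with the variable spliced in, the Python
# analogue of Scala's s"select ... ${myVar}".
query = "select * from events where event_date = '{}'".format(my_var)

# Inside Zeppelin you would then run: z.show(sqlContext.sql(query))
```

Note that this is plain string substitution, so it is only safe for values you control yourself.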
Hope this helps.
Thanks,
moon
On Thu, Jul 23, 2015 at 2:36 AM TEJA SRIVASTAV <[email protected]>
wrote:
> Hello all,
>
> Is it possible to pass variables between different interpreters
> throughout a notebook?
>
> Like I should be able to access the variables between paragraphs, use
> the data for visualization in d3 via angular or scala,
>
> and then pass those variables for processing in SQL or python
>
>
> --
> With regards,
> Teja Srivastav
> +91 9066 82 32 80
>