Scala has not yet resolved this issue. Once they fix it and release a new
version, you can upgrade the Scala version yourself.
On Tue, Nov 22, 2016 at 10:58 PM, Denis Bolshakov wrote:
Hello Zhu,

Thank you very much for such a detailed explanation and for providing a
workaround; it works fine.

But since the problem is related to a Scala issue, can we expect the fix in
Spark 2.0? Or is it not a good idea to update such an important dependency as
Scala in a minor maintenance release?

Kind regards,
Denis
The workaround is defining the imports and class together using ":paste".
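(For illustration, a minimal sketch of that workaround in spark-shell. The
SumUdaf class below is a hypothetical stand-in, not the UDAF from the
Databricks guide; the point is only that the imports and the class definition
go into one :paste block so they compile as a single unit.)

scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

// Minimal UDAF that sums a single Double column; defined in the same
// compilation unit as the imports it uses.
class SumUdaf extends UserDefinedAggregateFunction {
  def inputSchema: StructType = StructType(StructField("value", DoubleType) :: Nil)
  def bufferSchema: StructType = StructType(StructField("sum", DoubleType) :: Nil)
  def dataType: DataType = DoubleType
  def deterministic: Boolean = true
  def initialize(buffer: MutableAggregationBuffer): Unit = { buffer(0) = 0.0 }
  def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
    if (!input.isNullAt(0)) buffer(0) = buffer.getDouble(0) + input.getDouble(0)
  }
  def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
    buffer1(0) = buffer1.getDouble(0) + buffer2.getDouble(0)
  }
  def evaluate(buffer: Row): Any = buffer.getDouble(0)
}

// Press ctrl-D here; the imports and the class are then compiled together,
// instead of as separate REPL statements.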
On Tue, Nov 22, 2016 at 11:12 AM, Shixiong(Ryan) Zhu <shixi...@databricks.com> wrote:
This relates to a known issue:
https://issues.apache.org/jira/browse/SPARK-14146 and
https://issues.scala-lang.org/browse/SI-9799
On Tue, Nov 22, 2016 at 6:37 AM, dbolshak wrote:
Hello,

We have the same issue. We are using the latest release, 2.0.2; the setup
with 1.6.1 works fine.

Could somebody provide a workaround for this?

Kind regards,
Denis
On 21 November 2016 at 20:23, jggg777 wrote:
I'm simply pasting in the UDAF example from this page and getting errors
(basic EMR setup with Spark 2.0):
https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#04%20SQL,%20DataFrames%20%26%20Datasets/03%20UDF%20and%20UDAF%20-%20scala.html
The imports appear to work, but then I