Hi Stan,
I have filed an issue on JIRA for this exception, and I have tested the code
on Spark 1.3.0 (local mode) without hitting the exception.
https://issues.apache.org/jira/browse/SPARK-8368
On Sun, Jun 14, 2015 at 11:25 AM, StanZhai wrote:
> I have encountered a similar error on Spark 1.4.0.
>
> The sa
Hi all,
I encountered an error on Spark 1.4.0 and put together an example that
reproduces it, shown below. Both versions of the code run fine in
spark-shell, but the second one fails when run with spark-submit. The only
difference is that the second version uses a function literal in the map(),
while the first uses a d
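A minimal sketch of the contrast being described (the object and function
names here are illustrative, not the original code): the same map() is done
once with a separately defined function and once with a function literal,
where the latter was the variant reported to fail under spark-submit.

import org.apache.spark.{SparkConf, SparkContext}

object MapLiteralRepro {
  // Variant 1: a separately defined (named) function passed to map()
  def addOne(x: Int): Int = x + 1

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("map-literal-repro"))
    val rdd = sc.parallelize(1 to 10)

    // First code: pass the defined function -- reported to run fine
    // under both spark-shell and spark-submit
    val viaNamed = rdd.map(addOne).collect()

    // Second code: pass a function literal (an anonymous closure) --
    // the variant reported to fail under spark-submit on 1.4.0
    val viaLiteral = rdd.map(x => x + 1).collect()

    println(viaNamed.sameElements(viaLiteral)) // should print true
    sc.stop()
  }
}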
Hi all,
I just upgraded Spark from 1.2.1 to 1.3.0 and changed the "import
sqlContext.createSchemaRDD" to "import sqlContext.implicits._" in my code.
(I scanned the programming guide, and it seems this is the only change I
need to make.) But compilation now fails with the following error:
>>>
[ERRO
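For anyone hitting the same thing, the change described above looks roughly
like this (a sketch assuming an existing SparkContext named sc; the case
class and sample data are illustrative):

// Spark 1.2.x style:
//   val sqlContext = new org.apache.spark.sql.SQLContext(sc)
//   import sqlContext.createSchemaRDD

// Spark 1.3.0 style:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._   // replaces createSchemaRDD; enables .toDF()

case class Person(name: String, age: Int)
val people = sc.parallelize(Seq(Person("Ann", 30), Person("Bob", 25)))
val df = people.toDF()          // was implicitly a SchemaRDD in 1.2.x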
Hi everyone,
I am new to Spark and am trying to build spark-core with some
modifications. I am using IDEA to build spark-core_2.10 from Spark 1.1.1.
When I hit the following error, I checked the website
http://www.scalastyle.org/maven.html, and its suggested configuration is to
modify the spark
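For reference, the configuration that page suggests is roughly the
following scalastyle-maven-plugin block in the pom (the version and file
paths here are illustrative, following the page's example rather than
Spark's own pom):

<plugin>
  <groupId>org.scalastyle</groupId>
  <artifactId>scalastyle-maven-plugin</artifactId>
  <version>0.4.0</version>
  <configuration>
    <failOnViolation>true</failOnViolation>
    <includeTestSourceDirectory>true</includeTestSourceDirectory>
    <sourceDirectory>${basedir}/src/main/scala</sourceDirectory>
    <testSourceDirectory>${basedir}/src/test/scala</testSourceDirectory>
    <configLocation>${basedir}/scalastyle-config.xml</configLocation>
    <outputFile>${project.basedir}/scalastyle-output.xml</outputFile>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>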