The first run actually worked. It was the number of exceptions preceding
the result that surprised me.

I want to see if there is a way to get rid of the exceptions.

Thanks

On Wed, Dec 16, 2015 at 10:53 AM, Jakob Odersky <joder...@gmail.com> wrote:

> When you re-run the last statement a second time, does it work? Could it
> be related to https://issues.apache.org/jira/browse/SPARK-12350 ?
>
> On 16 December 2015 at 10:39, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Hi,
>> I built a recently refreshed checkout of the master branch with the
>> following command:
>>
>> ~/apache-maven-3.3.3/bin/mvn -Phive -Phive-thriftserver -Pyarn
>> -Phadoop-2.4 -Dhadoop.version=2.7.0 package -DskipTests
>>
>> I then ran a simple query in spark-shell:
>>     Seq(
>>       (83, 0, 38),
>>       (26, 0, 79),
>>       (43, 81, 24)
>>     ).toDF("a", "b", "c").registerTempTable("cachedData")
>>
>>     sqlContext.cacheTable("cachedData")
>>     sqlContext.sql("select * from cachedData").show
>>
>> However, I encountered errors of the following form:
>>
>> http://pastebin.com/QeiwJpwi
>>
>> Under the workspace, I found:
>>
>> ./sql/catalyst/target/scala-2.10/classes/org/apache/spark/sql/catalyst/expressions/codegen/GeneratedClass.class
>>
>> but no ByteOrder.class.
>>
>> Did I miss some step(s)?
>>
>> Thanks
>>
>
>
