From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Thursday, October 23, 2014 11:32 AM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error

Hi,

Please find the attached file.

my spark-default.xml
# Default system properties included when running spark-submit.
Hi,

May I know where to configure Spark to load libhadoop.so?

Regards,
Arthur
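[A common way to do this (a sketch only; the thread does not confirm it resolves this particular error) is to point the driver and executors at the directory containing libhadoop.so, either in spark-env.sh or in spark-defaults.conf. The directory below is an assumption for a typical Hadoop install; adjust it to your layout.]

```shell
# spark-env.sh — assumes the Hadoop native libs live in $HADOOP_HOME/lib/native
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"

# Or, in spark-defaults.conf (use an absolute path; env vars are not expanded there):
#   spark.driver.extraLibraryPath    /opt/hadoop/lib/native
#   spark.executor.extraLibraryPath  /opt/hadoop/lib/native
```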
On 23 Oct, 2014, at 11:31 am, arthur.hk.c...@gmail.com wrote:
From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Wednesday, October 22, 2014 8:35 PM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error

Hi,

Yes, I can always reproduce the issue.

About the workload, Spark configuration, JDK version and OS version: I ran SparkPi 1000.

java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM
…about your workload, Spark configuration, JDK version and OS version?

Thanks
Jerry
From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Friday, October 17, 2014 7:13 AM
To: user
Cc: arthur.hk.c...@gmail.com
Subject: Spark Hive Snappy Error
Hi,

When trying Spark with a Hive table, I got the "java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I" error:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(1) from q8_national_market_share …")
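[For anyone hitting this: the error means snappy-java failed to bind its JNI methods. Two common causes, neither confirmed as the culprit in this thread, are an older snappy-java jar on the classpath shadowing the one Spark bundles, and snappy-java being unable to execute the native library it extracts to the temp directory (e.g. a noexec /tmp). A diagnostic sketch, with paths and the workaround value as assumptions:]

```shell
# Check 1: look for conflicting snappy-java jars under the Spark and
# Hadoop trees — an old version shadowing Spark's copy is a common cause.
for d in "$SPARK_HOME" "$HADOOP_HOME"; do
  [ -d "$d" ] && find "$d" -name 'snappy-java-*.jar'
done

# Check 2: if /tmp is mounted noexec, snappy-java cannot run the native
# library it extracts there. Inspect the mount flags:
mount | grep -w /tmp
# Workaround (path is hypothetical): point snappy-java at an exec-able
# tempdir via spark-defaults.conf:
#   spark.executor.extraJavaOptions  -Dorg.xerial.snappy.tempdir=/var/tmp/snappy
```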