From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Thursday, October 23, 2014 11:32 AM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error
Hi,
Please find the attached file.
my spark-default.xml
# Default system properties included when running spark-submit.
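For reference, compression-related entries in a spark-defaults file usually look like the lines below; the values shown are the stock Spark 1.x defaults, not necessarily the settings in the attached file:

  spark.io.compression.codec      snappy
  spark.broadcast.compress        true
  spark.shuffle.compress          true
  spark.shuffle.spill.compress    true
  spark.rdd.compress              false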
Hi
May I know where to configure Spark to load libhadoop.so?
Regards
Arthur
On 23 Oct, 2014, at 11:31 am, arthur.hk.c...@gmail.com wrote:
> Hi,
>
> Please find the attached file.
>
> my spark-default.xml
> # Default system properties included when running spark-submit.
> # This is useful for setting default environmental settings.
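On the libhadoop.so question above, one common approach (a sketch with assumed paths, not a confirmed answer from this thread) is to put the native Hadoop library directory on the library path, either in conf/spark-env.sh:

  export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH

or via the library-path properties in the defaults file:

  spark.driver.extraLibraryPath    /usr/local/hadoop/lib/native
  spark.executor.extraLibraryPath  /usr/local/hadoop/lib/native

Here $HADOOP_HOME and /usr/local/hadoop are placeholders for wherever Hadoop is actually installed.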
From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Wednesday, October 22, 2014 8:35 PM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error
Hi,
Yes, I can always reproduce the issue:
> about your workload, Spark configuration, JDK version and OS version?
I ran SparkPi 1000
java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
cat /e
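For reference, SparkPi is usually launched with the bundled example runner or with spark-submit; a minimal sketch (master and paths are assumptions, not taken from this thread):

  $SPARK_HOME/bin/run-example SparkPi 1000

or

  $SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn-cluster $SPARK_HOME/lib/spark-examples-*.jar 1000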
Hi Arthur,
I think this is a known issue in Spark; you can check
(https://issues.apache.org/jira/browse/SPARK-3958). I’m curious about it: can
you always reproduce this issue? Is it related to some specific data sets?
Would you mind giving me some information about your workload, Spark
configuration, JDK version and OS version?