No luck.
But two updates:
1. I downloaded Spark 1.4.1 and everything works fine; I don't see any
error.
2. I added the suggested hive-site.xml to Spark 1.5.2's conf directory and
now I get the following error:

Caused by: java.lang.RuntimeException: The root scratch dir:
c:/Users/marco/tmp on HDFS should be writable. Current permissions are:
rwx---rwx
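
For reference, the relevant bit of that XML file is along these lines (a
sketch, not my exact file: hive.exec.scratchdir is the standard Hive
property for the root scratch dir, and the path matches the one in the
error above):

    <?xml version="1.0"?>
    <configuration>
      <!-- point Hive's root scratch dir at a user-writable location -->
      <property>
        <name>hive.exec.scratchdir</name>
        <value>c:/Users/marco/tmp</value>
      </property>
    </configuration>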

I will have to play around with Windows permissions to allow Spark to use
that directory.
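
One workaround I've seen suggested for this on Windows (assuming you have
winutils.exe from a Hadoop Windows build and HADOOP_HOME set) is to grant
the permissions explicitly, something like:

    REM open up the scratch dir (path as in the error above)
    %HADOOP_HOME%\bin\winutils.exe chmod -R 777 c:\Users\marco\tmp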

kr
 marco




On Sun, Dec 20, 2015 at 5:15 PM, Marco Mistroni <mmistr...@gmail.com> wrote:

> Thanks Chris, will give it a go and report back.
> Bizarrely, if I start the pyspark shell I don't see any issues
> Kr
> Marco
> On 20 Dec 2015 5:02 pm, "Chris Fregly" <ch...@fregly.com> wrote:
>
>> hopping on a plane, but check the hive-site.xml that's in your spark/conf
>> directory (or should be, anyway).  I believe you can change the root path
>> thru this mechanism.
>>
>> if not, this should give you more info to google on.
>>
>> let me know as this comes up a fair amount.
>>
>> > On Dec 19, 2015, at 4:58 PM, Marco Mistroni <mmistr...@gmail.com>
>> wrote:
>> >
>> > Hi all,
>> >  posting this again, as I was experiencing this error under 1.5.1 as
>> well.
>> > I am running Spark 1.5.2 on a Windows 10 laptop (upgraded from Windows
>> 8).
>> > When I launch spark-shell I get this exception, presumably because I
>> have no
>> > admin rights to the /tmp directory on my laptop (Windows 8/10 seems
>> very restrictive):
>> >
>> > java.lang.RuntimeException: java.lang.RuntimeException: The root
>> scratch dir: /tmp/hive on HDFS should be writable. Current permissions are:
>> ---------
>> >         at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>> >         at
>> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
>> >         at
>> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
>> >         at
>> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
>> >         at
>> org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
>> >         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >         at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >         at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >         at
>> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>> >         at $iwC$$iwC.<init>(<console>:9)
>> >         at $iwC.<init>(<console>:18)
>> >         at <init>(<console>:20)
>> >         at .<init>(<console>:24)
>> >         at .<clinit>(<console>)
>> >         at .<init>(<console>:7)
>> >         at .<clinit>(<console>)
>> >         at $print(<console>)
>> > ......
>> > Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive
>> on HDFS should be writable. Current permissions are: ---------
>> >         at
>> org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
>> >         at
>> org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>> >         at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>> >         ... 56 more
>> >
>> > <console>:10: error: not found: value sqlContext
>> >        import sqlContext.implicits._
>> >               ^
>> > <console>:10: error: not found: value sqlContext
>> >        import sqlContext.sql
>> >
>> > I was wondering how I can configure Hive to point to a different
>> directory where I have more permissions.
>> >
>> > kr
>> >  marco
>> >
>>
>
