Hi, I'm still having issues installing Spark on Windows 8. The Spark web console runs successfully and I can run the Spark Pi example; however, when I run spark-shell I get the following exception:
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

I have amended the permissions to give my account full access on Windows 8. I am trying to understand whether this exception could be a blocker if I decide to submit tasks to Spark. Given that I can run the Spark example, it does not look like a blocker, but seeing exceptions when I launch spark-shell does not make me feel comfortable, especially if I don't understand why I am getting them. Doesn't Spark like Windows 8?

Any suggestions appreciated.

kind regards
 marco

On Thu, Oct 15, 2015 at 11:40 PM, Marco Mistroni <mmistr...@gmail.com> wrote:

> Hi
> I tried to set this variable in my Windows environment variables but got the
> same result. This is the output of calling set in my command prompt;
> have I amended it in the wrong place?
>
> kr
>  marco
> ......
> USERDOMAIN=MarcoLaptop
> USERDOMAIN_ROAMINGPROFILE=MarcoLaptop
> USERNAME=marco
> USERPROFILE=C:\Users\marco
> windir=C:\Windows
> _JAVA_OPTIONS=-Djava.net.preferIPv4Stack=true
>
>
> On Thu, Oct 15, 2015 at 1:25 AM, Raghavendra Pandey <
> raghavendra.pan...@gmail.com> wrote:
>
>> Looks like you are facing an IPv6 issue. Can you try turning the
>> preferIPv4 property on?
>> On Oct 15, 2015 2:10 AM, "Steve Loughran" <ste...@hortonworks.com> wrote:
>>
>>> On 14 Oct 2015, at 20:56, Marco Mistroni <mmistr...@gmail.com> wrote:
>>>
>>> 15/10/14 20:52:35 WARN : Your hostname, MarcoLaptop resolves to a
>>> loopback/non-reachable address: fe80:0:0:0:c5ed:a66d:9d95:5caa%wlan2,
>>> but we couldn't find any external IP address!
>>> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch
>>> dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
>>>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>>>         at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
>>>         at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
>>>
>>> Now, that I haven't seen. Looks like it thinks the permissions are
>>> wrong, doesn't it?
>>>
>
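
[Editor's note: the commonly reported workaround for this particular error on Windows goes through Hadoop's winutils.exe rather than Windows Explorer permissions, because the Hive check reads the POSIX-style permissions that winutils maintains on the local scratch directory, not the NTFS ACLs. A minimal sketch, assuming HADOOP_HOME is set, %HADOOP_HOME%\bin\winutils.exe is present, and spark-shell is launched from the C: drive; the paths and the 777 mode are assumptions to verify against your setup, not something confirmed in this thread:

    REM create the local scratch dir that Hive resolves /tmp/hive to (assumed to be on C:)
    mkdir C:\tmp\hive

    REM mark it writable via winutils, which is what the SessionState permission check reads
    %HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive

    REM optionally confirm the permissions winutils now reports for the directory
    %HADOOP_HOME%\bin\winutils.exe ls \tmp\hive

After this, relaunching spark-shell should let the HiveContext initialize without the RuntimeException; if it still fails, the exception only affects HiveContext creation in the shell, which is consistent with the Spark Pi example continuing to run.]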