You should not need admin permission; just make sure all those jars
have execute permission (and read/write access).
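
A minimal sketch of one way to do that, assuming Python is available on the node. The function name and directory path are illustrative, not part of Spark; on POSIX-style filesystems it adds the owner read/execute bits to every jar under the worker directory:

```python
import os
import stat

def grant_read_execute(root):
    """Walk `root` and add owner read/execute bits to every .jar file.

    Note: on Windows, os.chmod can only toggle the read-only flag; full
    permission changes there go through ACLs (e.g. the icacls tool).
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".jar"):
                path = os.path.join(dirpath, name)
                mode = os.stat(path).st_mode
                os.chmod(path, mode | stat.S_IREAD | stat.S_IEXEC)
```

On Windows itself, access is controlled by ACLs rather than POSIX bits, so something along the lines of `icacls <worker-dir> /grant Users:(RX) /t` (grant read/execute recursively) may be the equivalent fix; treat the exact account name and path as assumptions to verify for your setup.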

Thanks
Best Regards

On Thu, Feb 19, 2015 at 11:30 AM, Judy Nash <judyn...@exchange.microsoft.com> wrote:

>  Hi,
>
>
>
> Is it possible to configure Spark to run without admin permission on
> Windows?
>
>
>
> My current setup runs master & slave successfully with admin permission.
>
> However, if I downgrade the permission level from admin to user, SparkPi
> fails with the following exception on the slave node:
>
> Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent
> failure: Lost task 0.3 in stage 0.0 (TID 9,
> workernode0.jnashsparkcurr2.d10.internal.cloudapp.net):
> java.lang.ClassNotFoundException:
> org.apache.spark.examples.SparkPi$$anonfun$1
>
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:270)
>
>
>
> Upon investigation, it appears that the SparkPi jar under
> spark_home\worker\appname\*.jar does not have execute permission set,
> causing Spark to be unable to find the class.
>
>
>
> Advice would be very much appreciated.
>
>
>
> Thanks,
>
> Judy
>
>
>
