I am running the same version of Spark on the server (master + worker) and
on the client / driver.

On the server I am using the binaries spark-1.1.0-bin-hadoop1, and on the
client I am using the same version:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-twitter_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-examples_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
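
As a quick sanity check (just a sketch, not part of the original thread), the
version of the Spark library on the client classpath can be printed locally
and compared with the version shown on the master's web UI (by default on
port 8080):

    import org.apache.spark.SparkContext

    // Runs with a local master, so it does not need to reach the cluster;
    // it only reports the version of spark-core on the client classpath.
    val sc = new SparkContext("local[*]", "VersionCheck")
    println("Client Spark version: " + sc.version)  // expected: 1.1.0
    sc.stop()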




On Wed, Nov 5, 2014 at 6:32 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> It's more like you are running different versions of Spark.
>
> Thanks
> Best Regards
>
> On Wed, Nov 5, 2014 at 3:05 AM, Saiph Kappa <saiph.ka...@gmail.com> wrote:
>
>> I set the host and port of the driver, and now the error has changed slightly:
>>
>> Using Spark's default log4j profile:
>>> org/apache/spark/log4j-defaults.properties
>>> 14/11/04 21:13:48 INFO CoarseGrainedExecutorBackend: Registered signal
>>> handlers for [TERM, HUP, INT]
>>> 14/11/04 21:13:48 INFO SecurityManager: Changing view acls to:
>>> myuser,Myuser
>>> 14/11/04 21:13:48 INFO SecurityManager: Changing modify acls to:
>>> myuser,Myuser
>>> 14/11/04 21:13:48 INFO SecurityManager: SecurityManager: authentication
>>> disabled; ui acls disabled; users with view permissions: Set(myuser,
>>> Myuser); users with modify permissions: Set(myuser, Myuser)
>>> 14/11/04 21:13:48 INFO Slf4jLogger: Slf4jLogger started
>>> 14/11/04 21:13:48 INFO Remoting: Starting remoting
>>> 14/11/04 21:13:49 INFO Remoting: Remoting started; listening on
>>> addresses :[akka.tcp://driverPropsFetcher@myserver:37456]
>>> 14/11/04 21:13:49 INFO Remoting: Remoting now listens on addresses:
>>> [akka.tcp://driverPropsFetcher@myserver:37456]
>>> 14/11/04 21:13:49 INFO Utils: Successfully started service
>>> 'driverPropsFetcher' on port 37456.
>>> 14/11/04 21:14:19 ERROR UserGroupInformation: PriviledgedActionException
>>> as:Myuser cause:java.util.concurrent.TimeoutException: Futures timed out
>>> after [30 seconds]
>>> Exception in thread "main"
>>> java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
>>>     at
>>> org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
>>>     at
>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
>>>     at
>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:156)
>>>     at
>>> org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
>>> Caused by: java.security.PrivilegedActionException:
>>> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>     ... 4 more
>>> Caused by: java.util.concurrent.TimeoutException: Futures timed out
>>> after [30 seconds]
>>>     at
>>> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>>     at
>>> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>>     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>>     at
>>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>>     at scala.concurrent.Await$.result(package.scala:107)
>>>     at
>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:125)
>>>     at
>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
>>>     at
>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
>>>     ... 7 more
>>>
>>
>> Any ideas?
>>
>> Thanks.
>>
>> On Tue, Nov 4, 2014 at 11:29 AM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> If you want to run the Spark application from a remote machine, then you
>>> have to set at least the following configuration options properly.
>>>
>>> *spark.driver.host* - the IP/hostname of the machine from which you are
>>> submitting the job (make sure it is reachable from the cluster, e.g. by ping)
>>>
>>> *spark.driver.port* - set it to a port number that is accessible from
>>> the Spark cluster.
>>>
>>> You can find more configuration options here.
>>> <http://spark.apache.org/docs/latest/configuration.html#networking>
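>>>
>>> For example, the driver-side settings could be set on the SparkConf itself
>>> (just a sketch, not from the original thread; the hostname "mylaptop" and
>>> port 51000 are placeholders for the submitting machine's address and an
>>> open port):
>>>
>>>     import org.apache.spark.SparkConf
>>>
>>>     val sparkConf = new SparkConf()
>>>       .setMaster("spark://myserver:7077")
>>>       .setAppName("MyApp")
>>>       // host/IP of the machine the job is submitted from,
>>>       // must be reachable from the cluster nodes
>>>       .set("spark.driver.host", "mylaptop")
>>>       // fixed port on the client that the cluster can connect back to
>>>       .set("spark.driver.port", "51000")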
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Tue, Nov 4, 2014 at 6:07 AM, Saiph Kappa <saiph.ka...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am trying to submit a job to a Spark cluster running on a single
>>>> machine (1 master + 1 worker) with Hadoop 1.0.4. I configure the
>>>> submission in the code: «val sparkConf = new
>>>> SparkConf().setMaster("spark://myserver:7077").setAppName("MyApp").setJars(Array("target/my-app-1.0-SNAPSHOT.jar"))».
>>>>
>>>> When I run this application on the same machine as the cluster
>>>> everything works fine.
>>>>
>>>> But when I run it from a remote machine I get the following error:
>>>>
>>>> Using Spark's default log4j profile:
>>>>> org/apache/spark/log4j-defaults.properties
>>>>> 14/11/04 00:15:38 INFO CoarseGrainedExecutorBackend: Registered signal
>>>>> handlers for [TERM, HUP, INT]
>>>>> 14/11/04 00:15:38 INFO SecurityManager: Changing view acls to:
>>>>> myuser,Myuser
>>>>> 14/11/04 00:15:38 INFO SecurityManager: Changing modify acls to:
>>>>> myuser,Myuser
>>>>> 14/11/04 00:15:38 INFO SecurityManager: SecurityManager:
>>>>> authentication disabled; ui acls disabled; users with view permissions:
>>>>> Set(myuser, Myuser); users with modify permissions: Set(myuser, Myuser)
>>>>> 14/11/04 00:15:38 INFO Slf4jLogger: Slf4jLogger started
>>>>> 14/11/04 00:15:38 INFO Remoting: Starting remoting
>>>>> 14/11/04 00:15:38 INFO Remoting: Remoting started; listening on
>>>>> addresses :[akka.tcp://driverPropsFetcher@myserver:49190]
>>>>> 14/11/04 00:15:38 INFO Remoting: Remoting now listens on addresses:
>>>>> [akka.tcp://driverPropsFetcher@myserver:49190]
>>>>> 14/11/04 00:15:38 INFO Utils: Successfully started service
>>>>> 'driverPropsFetcher' on port 49190.
>>>>> 14/11/04 00:15:38 WARN Remoting: Tried to associate with unreachable
>>>>> remote address [akka.tcp://sparkDriver@mylaptop:57418]. Address is
>>>>> now gated for 60000 ms, all messages to this address will be delivered to
>>>>> dead letters.
>>>>> 14/11/04 00:16:08 ERROR UserGroupInformation:
>>>>> PriviledgedActionException as:Myuser
>>>>> cause:java.util.concurrent.TimeoutException: Futures timed out after [30
>>>>> seconds]
>>>>> Exception in thread "main"
>>>>> java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
>>>>>     at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
>>>>>     at
>>>>> org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
>>>>>     at
>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
>>>>>     at
>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:156)
>>>>>     at
>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
>>>>> Caused by: java.security.PrivilegedActionException:
>>>>> java.util.concurrent.TimeoutException: Futures timed out after [30 
>>>>> seconds]
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>     at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>>     ... 4 more
>>>>> Caused by: java.util.concurrent.TimeoutException: Futures timed out
>>>>> after [30 seconds]
>>>>>     at
>>>>> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>>>>     at
>>>>> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>>>>     at
>>>>> scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>>>>     at
>>>>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>>>>     at scala.concurrent.Await$.result(package.scala:107)
>>>>>     at
>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:125)
>>>>>     at
>>>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
>>>>>     at
>>>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
>>>>>     ... 7 more
>>>>>
>>>>
>>>> I know this has something to do with Hadoop permissions. I have checked,
>>>> and all the necessary Hadoop ports on the server are open and accessible
>>>> from outside.
>>>>
>>>> How can I configure the right permissions?
>>>>
>>>> Thanks.
>>>>
>>>
>>>
>>
>
