Hi,
These are the settings in my spark-defaults.conf file on the worker machine
from which I am trying to reach the Spark master. I think I also need to
configure the spark-submit file first, but I do not know how. Can somebody
advise me?
# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.

# Example:
spark.master                     spark://10.229.200.250:7337
# spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
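
For context, this is roughly how I am trying to connect from R once this
conf file is in place (a minimal sketch; the SPARK_HOME path is an
assumption based on my install, the lib.loc path assumes the bundled SparkR
library layout, and the master URL is the one from the conf above):

  # Assumed local install path; adjust to wherever Spark is unpacked
  Sys.setenv(SPARK_HOME = "C:/spark-1.4.0")
  # Load the SparkR package bundled with the Spark distribution
  library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))
  # Connect to the standalone master from the conf file above
  sc <- sparkR.init(master = "spark://10.229.200.250:7337")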


Sincerely,
Ashish Dutt

On Tue, Jul 7, 2015 at 9:30 AM, Ashish Dutt <ashish.du...@gmail.com> wrote:

> Hello Shivaram,
> Thank you for your response. Being a novice at this stage, can you also
> tell me how to set the execute permission for the spark-submit file?
>
> Thank you for your time.
>
>
> Sincerely,
> Ashish Dutt
>
> On Tue, Jul 7, 2015 at 9:21 AM, Shivaram Venkataraman <
> shiva...@eecs.berkeley.edu> wrote:
>
>> When I've seen this error before, it has been due to the spark-submit file
>> (i.e. `C:\spark-1.4.0\bin/bin/spark-submit.cmd`) not having execute
>> permissions. You can try setting the execute permission and see if that
>> fixes things.
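>>
>> For example, something along these lines from R (a rough sketch: the path
>> is the one from your log minus the doubled `bin` segment, and granting
>> read/execute to Everyone is only for illustration):
>>
>>   # Invoke Windows icacls to grant read and execute (RX) rights
>>   # on the launcher script
>>   system2("icacls",
>>           c("C:\\spark-1.4.0\\bin\\spark-submit.cmd",
>>             "/grant", "Everyone:RX"))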
>>
>> Also, we have a PR open to fix a related problem at
>> https://github.com/apache/spark/pull/7025. If you can test the PR, that
>> would also be very helpful.
>>
>> Thanks
>> Shivaram
>>
>> On Mon, Jul 6, 2015 at 6:11 PM, ashishdutt <ashish.du...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I am trying to connect a worker to the master. The Spark master is
>>> managed by Cloudera Manager, and I know the master's IP address and
>>> port number. I downloaded the Spark binary for CDH4 on the worker
>>> machine, but when I invoke the command
>>> > sc <- sparkR.init(master = "spark://<ip address>:<port number>")
>>> I get the following error:
>>>
>>> > sc=sparkR.init(master="spark://10.229.200.250:7377")
>>> Launching java with spark-submit command
>>> C:\spark-1.4.0\bin/bin/spark-submit.cmd  sparkr-shell
>>> C:\Users\ASHISH~1\AppData\Local\Temp\Rtmp82kCxH\backend_port4281739d85
>>> Error in sparkR.init(master = "spark://10.229.200.250:7377") :
>>>   JVM is not ready after 10 seconds
>>> In addition: Warning message:
>>> running command '"C:\spark-1.4.0\bin/bin/spark-submit.cmd"  sparkr-shell
>>> C:\Users\ASHISH~1\AppData\Local\Temp\Rtmp82kCxH\backend_port4281739d85'
>>> had
>>> status 127
>>>
>>>
>>> I am using Windows 7 as the OS on the worker machine, and I am invoking
>>> sparkR.init() from RStudio.
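>>>
>>> In case it is relevant, here is a quick check I run from R to confirm
>>> the launcher script exists at the expected path (a small sketch; the
>>> SPARK_HOME value is an assumption based on my install):
>>>
>>>   # Point SPARK_HOME at the unpacked Spark binary (assumed location)
>>>   Sys.setenv(SPARK_HOME = "C:/spark-1.4.0")
>>>   # Verify spark-submit.cmd is where SparkR will look for it
>>>   file.exists(file.path(Sys.getenv("SPARK_HOME"), "bin", "spark-submit.cmd"))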
>>>
>>> Any help with this will be appreciated.
>>>
>>> Thank you,
>>> Ashish Dutt
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/JVM-is-not-ready-after-10-seconds-tp23658.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>
>
