Hello Shivaram,
Thank you for your response. Being a novice at this stage, could you also tell
me how to set the execute permission for the spark-submit file?

Thank you for your time.


Sincerely,
Ashish Dutt

On Tue, Jul 7, 2015 at 9:21 AM, Shivaram Venkataraman <
[email protected]> wrote:

> When I've seen this error before, it has been due to the spark-submit file
> (i.e. `C:\spark-1.4.0\bin/bin/spark-submit.cmd`) not having execute
> permissions. You can try setting the execute permission and see if that
> fixes things.
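>
> A rough sketch of what I mean, from R on Windows (untested on my side, and
> Windows ACLs may take precedence over the mode bits; the path below is a
> guess based on your install location, so adjust it as needed):
>
> # Check whether spark-submit.cmd is executable; 0 means yes, -1 means no.
> spark_submit <- "C:/spark-1.4.0/bin/spark-submit.cmd"
> file.access(spark_submit, mode = 1)
> # Try to add the execute bit with base R's Sys.chmod().
> Sys.chmod(spark_submit, mode = "0755")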
>
> Also, we have a PR open to fix a related problem at
> https://github.com/apache/spark/pull/7025. If you can test the PR, that
> would also be very helpful.
>
> Thanks
> Shivaram
>
> On Mon, Jul 6, 2015 at 6:11 PM, ashishdutt <[email protected]> wrote:
>
>> Hi,
>>
>> I am trying to connect a worker to the master. The Spark master is managed
>> by Cloudera Manager, and I know the master's IP address and port number.
>> I downloaded the Spark binary for CDH4 on the worker machine, but when I
>> try to invoke the command
>> > sc = sparkR.init(master = "ip address:port number")
>> I get the following error:
>>
>> > sc=sparkR.init(master="spark://10.229.200.250:7377")
>> Launching java with spark-submit command
>> C:\spark-1.4.0\bin/bin/spark-submit.cmd  sparkr-shell
>> C:\Users\ASHISH~1\AppData\Local\Temp\Rtmp82kCxH\backend_port4281739d85
>> Error in sparkR.init(master = "spark://10.229.200.250:7377") :
>>   JVM is not ready after 10 seconds
>> In addition: Warning message:
>> running command '"C:\spark-1.4.0\bin/bin/spark-submit.cmd"  sparkr-shell
>> C:\Users\ASHISH~1\AppData\Local\Temp\Rtmp82kCxH\backend_port4281739d85'
>> had
>> status 127
>>
>>
>> I am using Windows 7 as the OS on the worker machine, and I am invoking
>> sparkR.init() from RStudio.
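>>
>> For completeness, this is roughly how I start SparkR from RStudio (the
>> SPARK_HOME value and library path reflect my local setup; as far as I
>> understand, sparkR.init() uses SPARK_HOME to locate spark-submit.cmd):
>>
>> # Point SparkR at the unpacked Spark 1.4.0 build before initializing.
>> Sys.setenv(SPARK_HOME = "C:/spark-1.4.0")
>> # The SparkR package ships under SPARK_HOME/R/lib in the binary distribution.
>> .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
>> library(SparkR)
>> # Connect to the standalone master reported by Cloudera Manager.
>> sc <- sparkR.init(master = "spark://10.229.200.250:7377")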
>>
>> Any help in this regard will be appreciated.
>>
>> Thank you,
>> Ashish Dutt
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/JVM-is-not-ready-after-10-seconds-tp23658.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>>
>>
>
