You must specify `-Psparkr` when building from source.
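For example, a source build with the SparkR profile enabled looks roughly like 
this (a sketch only; the other profiles and flags, e.g. the Hadoop version or 
-DskipTests, depend on your setup):

    # build Spark from source, including the SparkR profile
    ./build/mvn -DskipTests -Psparkr package
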
> On May 20, 2016, at 08:09, Gayathri Murali <gayathri.m.sof...@gmail.com> 
> wrote:
> 
> That helped! Thanks. I am building from source code and I am not sure what 
> caused the issue with SparkR.
> 
> On Thu, May 19, 2016 at 4:17 PM, Xiangrui Meng <men...@gmail.com> wrote:
> We no longer have `SparkRWrappers` in Spark 2.0. So if you are testing the 
> latest branch-2.0, there could be an issue with your SparkR installation. Did 
> you try `R/install-dev.sh`?
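> For example (a rough sketch, assuming the scripts are run from the root of 
> your Spark source checkout):
> 
>     # rebuild and install the SparkR package from the current source tree
>     R/install-dev.sh
>     # then re-run the SparkR tests against that build
>     R/run-tests.sh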
> 
> On Thu, May 19, 2016 at 11:42 AM Gayathri Murali <gayathri.m.sof...@gmail.com> wrote:
> This is on Spark 2.0. I see the following in unit-tests.log when I run 
> R/run-tests.sh. This is on a single Mac laptop, on the recently rebased 
> master. R version is 3.3.0.
> 
> 16/05/19 11:28:13.863 Executor task launch worker-1 ERROR Executor: Exception 
> in task 0.0 in stage 5186.0 (TID 10370)
> 1384595 org.apache.spark.SparkException: R computation failed with
> 1384596
> 1384597 Execution halted
> 1384598
> 1384599 Execution halted
> 1384600
> 1384601 Execution halted
> 1384602     at org.apache.spark.api.r.RRunner.compute(RRunner.scala:107)
> 1384603     at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> 1384604     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:318)
> 1384605     at org.apache.spark.rdd.RDD.iterator(RDD.scala:282)
> 1384606     at 
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> 1384607     at org.apache.spark.scheduler.Task.run(Task.scala:85)
> 1384608     at 
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> 1384609     at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 1384610     at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 1384611     at java.lang.Thread.run(Thread.java:745)
> 1384612 16/05/19 11:28:13.864 Thread-1 INFO ContextHandler: Stopped 
> o.s.j.s.ServletContextHandler@22f76fa8{/jobs/json,null,UNAVAILABLE}
> 1384613 16/05/19 11:28:13.869 Thread-1 INFO ContextHandler: Stopped 
> o.s.j.s.ServletContextHandler@afe0d9f{/jobs,null,UNAVAILABLE}
> 1384614 16/05/19 11:28:13.869 Thread-1 INFO SparkUI: Stopped Spark web UI at 
> http://localhost:4040
> 1384615 16/05/19 11:28:13.871 Executor task launch worker-4 ERROR Executor: 
> Exception in task 1.0 in stage 5186.0 (TID 10371)
> 1384616 org.apache.spark.SparkException: R computation failed with
> 1384617
> 1384618 Execution halted
> 1384619
> 1384620 Execution halted
> 1384621
> 1384622 Execution halted
> 1384623     at org.apache.spark.api.r.RRunner.compute(RRunner.scala:107)
> 1384624     at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> 1384625     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:318)
> 1384626     at org.apache.spark.rdd.RDD.iterator(RDD.scala:282)
> 1384627     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> 1384628     at org.apache.spark.scheduler.Task.run(Task.scala:85)
> 1384629     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> 1384630     at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 1384631     at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 1384632     at java.lang.Thread.run(Thread.java:745)
> 1384633 16/05/19 11:28:13.874 nioEventLoopGroup-2-1 INFO DAGScheduler: Job 
> 5183 failed: collect at null:-1, took 0.211674 s
> 1384634 16/05/19 11:28:13.875 nioEventLoopGroup-2-1 ERROR RBackendHandler: 
> collect on 26345 failed
> 1384635 16/05/19 11:28:13.876 Thread-1 INFO DAGScheduler: ResultStage 5186 
> (collect at null:-1) failed in 0.210 s
> 1384636 16/05/19 11:28:13.877 Thread-1 ERROR LiveListenerBus: 
> SparkListenerBus has already stopped! Dropping event 
> SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@413da307)
> 1384637 16/05/19 11:28:13.878 Thread-1 ERROR LiveListenerBus: 
> SparkListenerBus has already stopped! Dropping event 
> SparkListenerJobEnd(5183,1463682493877,JobFailed(org.apache.spark.SparkException: Job 5183 cancelled because SparkContext was shut down))
> 1384638 16/05/19 11:28:13.880 dispatcher-event-loop-1 INFO 
> MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 1384639 16/05/19 11:28:13.904 Thread-1 INFO MemoryStore: MemoryStore cleared
> 1384640 16/05/19 11:28:13.904 Thread-1 INFO BlockManager: BlockManager stopped
> 1384641 16/05/19 11:28:13.904 Thread-1 INFO BlockManagerMaster: 
> BlockManagerMaster stopped
> 1384642 16/05/19 11:28:13.905 dispatcher-event-loop-0 INFO 
> OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
> OutputCommitCoordinator stopped!
> 1384643 16/05/19 11:28:13.909 Thread-1 INFO SparkContext: Successfully 
> stopped SparkContext
> 1384644 16/05/19 11:28:13.910 Thread-1 INFO ShutdownHookManager: Shutdown 
> hook called
> 1384645 16/05/19 11:28:13.911 Thread-1 INFO ShutdownHookManager: Deleting 
> directory 
> /private/var/folders/xy/qc35m0y55vq83dsqzg066_c40000gn/T/spark-dfafdddc-fd25-4eb4-bb1d-5659151c8231
> 
> 
> On Thu, May 19, 2016 at 8:46 AM, Xiangrui Meng <men...@gmail.com> wrote:
> Is it on 1.6.x?
> 
> 
> On Wed, May 18, 2016, 6:57 PM Sun Rui <sunrise_...@163.com> wrote:
> I saw it, but I can’t see the complete error message in it.
> I mean the part after “Error in invokeJava(…)”.
> 
>> On May 19, 2016, at 08:37, Gayathri Murali <gayathri.m.sof...@gmail.com> wrote:
>> 
>> There was a screenshot attached to my original email. If you did not get it, 
>> I am attaching it here again.
>> 
>> On Wed, May 18, 2016 at 5:27 PM, Sun Rui <sunrise_...@163.com> wrote:
>> It’s wrong behaviour that head(df) outputs no rows.
>> Could you send a screenshot displaying the whole error message?
>>> On May 19, 2016, at 08:12, Gayathri Murali <gayathri.m.sof...@gmail.com> wrote:
>>> 
>>> I am trying to run a basic example in the interactive R shell and am running 
>>> into the following error. Also note that head(df) does not display any rows. 
>>> Can someone please let me know if I am missing something?
>>> 
>>> <Screen Shot 2016-05-18 at 5.09.29 PM.png>
>>> 
>>>  Thanks
>>> Gayathri
>>> 
>> 
>> <Screen Shot 2016-05-18 at 5.09.29 PM.png>
> 
> 
> 
