Hello

I installed Spark in a folder and start bin/sparkR from the console. Then I
execute the commands below and everything works fine; I can see the data as well.

hivecontext <- sparkRHive.init(sc)
df <- loadDF(hivecontext, "/someHdfsPath", "orc")
showDF(df)


But when I run the same code in RStudio, it throws the error shown below.

RStudio code
============
Sys.setenv(SPARK_HOME="/home/myname/spark-1.6.0-bin-hadoop2.6")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)

sc <- sparkR.init(master="local")
hivecontext <- sparkRHive.init(sc)
df <- loadDF(hivecontext, "/someHdfsPath", "orc")
print("showing df now")
showDF(df)

Error thrown from RStudio
===================

log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/01/22 06:00:12 ERROR RBackendHandler: createSparkContext on org.apache.spark.api.r.RRDD failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :



What is different in RStudio compared to the sparkR shell? Should I change
any settings to make it work in RStudio?
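My guess (purely an assumption on my part; the HADOOP_CONF_DIR path below is hypothetical) is that bin/sparkR runs in a shell where SPARK_HOME and the Hadoop configuration are already exported, while an RStudio session launched from the desktop inherits neither. If so, starting RStudio from a terminal that exports them might look like this:

```shell
# Assumption: bin/sparkR sets these before launching R, but an RStudio
# session started from the desktop does not inherit them.
export SPARK_HOME=/home/myname/spark-1.6.0-bin-hadoop2.6

# Hypothetical path -- point this at your cluster's Hadoop config directory.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Launch RStudio from this same shell so its R session inherits both:
# rstudio &
```

Is something along these lines needed, or does RStudio require a different setting entirely?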
