Re: SparkR 1.4.0: read.df() function fails

2015-06-17 Thread Stensrud, Erik
Thanks to both of you! You solved the problem.

Thanks,
Erik Stensrud

Sent from my iPhone

Re: SparkR 1.4.0: read.df() function fails

2015-06-16 Thread nsalian
Hello,

Is the JSON file in HDFS or local? Is "/home/esten/ami/usaf.json" an HDFS path?

Suggestions:
1) Specify "file:/home/esten/ami/usaf.json"
2) Or move the usaf.json file into HDFS, since the application is looking for the file in HDFS.

Please let me know if that helps. Thank you.
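The two suggestions above can be sketched in SparkR as follows. This is a hedged illustration of the general pattern, not the original poster's exact session: the cluster URL, the HDFS target directory, and the surrounding setup are assumptions for the example.

```r
# Assumes a SparkR 1.4.x session where sqlContext already exists
# (e.g. started via the sparkR shell against a cluster).

# Option 1: force a local-filesystem read with the "file:" scheme,
# so Spark does not resolve the path against the default HDFS filesystem.
mydf <- read.df(sqlContext, "file:/home/esten/ami/usaf.json", source = "json")

# Option 2: first copy the file into HDFS from the command line, e.g.
#   hdfs dfs -put /home/esten/ami/usaf.json /user/esten/usaf.json
# (the HDFS destination path here is illustrative), then read it with
# a plain path, which resolves against HDFS:
mydf <- read.df(sqlContext, "/user/esten/usaf.json", source = "json")
```

Note that Option 1 only works reliably when the file is present at that local path on every node that runs tasks; copying the data into HDFS (Option 2) is usually the more robust choice on a cluster.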

Re: SparkR 1.4.0: read.df() function fails

2015-06-16 Thread Guru Medasani
Hi Esten,

Looks like your sqlContext is connected to a Hadoop/Spark cluster, but the file path you specified is local:

mydf <- read.df(sqlContext, "/home/esten/ami/usaf.json", source = "json",

The error below shows that the input path you specified does not exist on the cluster. Pointing to the righ

Re: SparkR 1.4.0: read.df() function fails

2015-06-16 Thread Shivaram Venkataraman
The error you are running into is that the input file does not exist -- you can see it from the following line:

"Input path does not exist: hdfs://smalldata13.hdp:8020/home/esten/ami/usaf.json"

Thanks
Shivaram

On Tue, Jun 16, 2015 at 1:55 AM, esten wrote:
> Hi,
> In SparkR shell, I invoke:
>