@spark.apache.org
Subject: Re: SparkR read.df failed to read file from local directory
Thanks for the comment, Felix. I tried giving
"/home/myuser/test_data/sparkR/flights.csv", but it still tried to resolve the
path in HDFS and gave these errors:
15/12/08 12:47:10 ERROR r.RBackendHandler:
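
A likely way around this (a sketch, not something confirmed in this thread): when fs.defaultFS points at HDFS, a path without a scheme is resolved against HDFS, so the local filesystem has to be named explicitly with a file:// URI. Assuming the Lustre mount is visible at the same path on every worker node:

# Sketch: force the local filesystem with an explicit file:// URI
flightsDF <- read.df(sqlContext,
                     "file:///home/myuser/test_data/sparkR/flights.csv",
                     source = "com.databricks.spark.csv", header = "true")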
_
Have you tried
flightsDF <- read.df(sqlContext, "/home/myuser/test_data/sparkR/flights.csv",
source = "com.databricks.spark.csv", header = "true")
_
From: Boyu Zhang
Sent: Tuesday, December 8, 2015 8:47 AM
Subject: SparkR read.df failed to read file from local directory
Hello everyone,
I tried to run the example data-manipulation.R, but I can't get it to read
the flights.csv file that is stored on my local filesystem. I don't want to
store big files in HDFS, so reading from a local filesystem (Lustre, in my
case) is the desired behavior for me.
I tried the following:
flightsDF <- rea