2015-06-25 14:26 GMT-07:00 Eskilson, Aleksander:

> Sure, I had a similar question that Shivaram was able to answer for me;
> the solution is implemented using a separate Databricks library. Check out
> this thread from the email archives [1], and the read.df() command [2]. CSV
> files can be a bit tricky, especially with inferring their schemas. Are you
> using just strings as your column types right now?
>
> Alek
>
> [1] -- http://apache-spark-developers-list.1001551.n3.nabble.com/CSV-Support-in-SparkR-td12559.html
> [2] -- https://spark.apache.org/docs/latest/api/R/read.df.html
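Alek's pointer to read.df() with the spark-csv library can be sketched as follows. This is a minimal sketch, not a tested recipe: it assumes Spark 1.4 launched with the spark-csv package on the classpath (e.g. `sparkR --packages com.databricks:spark-csv_2.10:1.0.3` — the version string is an assumption), an existing `sqlContext`, and a hypothetical `cars.csv` file. Without an explicit schema, spark-csv reads every column as a string, which is what Alek's question about string column types refers to.

```r
library(SparkR)

# Hypothetical file path; "source" names the spark-csv data source.
df <- read.df(sqlContext, "cars.csv",
              source = "com.databricks.spark.csv",
              header = "true")   # treat the first row as column names

printSchema(df)  # columns come back as string unless a schema is supplied
head(df)         # first rows, collected as an R data.frame
```

The read.df() documentation [2] describes the schema argument for assigning real column types instead of relying on all-string defaults.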
From: Wei Zhou
Date: Thursday, June 25, 2015 at 4:38 PM
To: Aleksander Eskilson
Cc: "shiva...@eecs.berkeley.edu", "user@spark.apache.org"
Subject: Re: sparkR could not find function "textFile"

I tried out the solution using the spark-csv package, and it worked fine now :)
Thanks. Yes, I'm playing with a fi…
From: Wei Zhou
Date: Thursday, June 25, 2015 at 4:15 PM
To: "shiva...@eecs.berkeley.edu"
Cc: Aleksander Eskilson, "user@spark.apache.org"
Subject: Re: sparkR could not find function "textFile"

Thanks Shivaram, this is exactly what I am looking for.
2015-06-25 14:22 GMT-07:00 Shivaram Venkataraman:

> You can use the Spark CSV reader to read in flat CSV files to a data
> frame. See https://gist.github.com/shivaram/d0cd4aa5c4381edd6f85 for an
> example.
>
> Shivaram
On Thu, Jun 25, 2015 at 2:15 PM, Wei Zhou wrote:

> Thanks to both Shivaram and Alek. Then if I want to create a DataFrame from
> comma-separated flat files, what would you recommend? One way I can think
> of is first reading the data as you would in R, using read.table(), and
> then creating a Spark DataFrame out of that R data frame, but it is obvi…
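The read.table() route Wei describes can be sketched like this (assuming an existing `sqlContext` and the same hypothetical `cars.csv`). The drawback implied above is that the entire file is first read into the R driver's memory, so this only works for data that fits on one machine:

```r
# Read locally with base R, then ship the resulting data.frame to Spark.
localDF <- read.table("cars.csv", sep = ",", header = TRUE,
                      stringsAsFactors = FALSE)

# Convert the local R data.frame into a distributed Spark DataFrame.
df <- createDataFrame(sqlContext, localDF)
```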
The `head` function is not supported for the RRDD that is returned by
`textFile`. You can run `take(lines, 5L)`. I should add a warning here that
the RDD API in SparkR is private, because we might not support it in
upcoming releases. So if you can use the DataFrame API for your application,
you should.
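Put together, the working version of the session quoted below looks like this (assuming an existing SparkContext `sc` and a README.md in the working directory; the `:::` is required because textFile is in SparkR's private namespace):

```r
# Private RDD API -- may disappear in later releases, as warned above.
lines <- SparkR:::textFile(sc, "./README.md")

# take() works where head() raises the 'S4' is not subsettable error.
take(lines, 5L)   # first five lines of the file, as a list
```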
Hi Alek,

Just a follow up question. This is what I did in the sparkR shell:

lines <- SparkR:::textFile(sc, "./README.md")
head(lines)

And I am getting the error:

"Error in x[seq_len(n)] : object of type 'S4' is not subsettable"

I'm wondering what I did wrong. Thanks in advance.

Wei
Hi Alek,
Thanks for the explanation, it is very helpful.
Cheers,
Wei
2015-06-25 13:40 GMT-07:00 Eskilson, Aleksander:

> Hi there,
>
> The tutorial you're reading was written before the merge of SparkR for
> Spark 1.4.0. For the merge, the RDD API (which includes the textFile()
> function) was made private, as the devs felt many of its functions were
> too low level. They focused instead on finishing the DataFrame API.