>>> 2015-06-25 14:26 GMT-07:00 Eskilson, Aleksander:
>>>
>>>
>>>> Sure, I had a similar question that Shivaram was able to answer quickly
>>>> for me; the solution is implemented using a separate Databricks library.
>>>> Check out this thread from the email archives [1], and the read.df()
>>>> command [2]. CSV files can be a bit tricky, especially with inferring
>>>> their schemas. Are you using just strings as your column types right now?
>>>>
>>>> Alek
>>>>
>>>> [1] -- http://apache-spark-developers-list.1001551.n3.nabble.com/CSV-Support-in-SparkR-td12559.html
>>>> [2] -- https://spark.apache.org/docs/latest/api/R/read.df.html
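[Editor's note: for readers following along, here is a minimal sketch of the read.df() route described above, against the Spark 1.4-era SparkR API. The file name cars.csv, the column names, and the spark-csv package version are made-up illustrations, not from the thread.]

```r
# Launch the shell with the spark-csv data source on the classpath
# (the artifact version below is illustrative):
#   ./bin/sparkR --packages com.databricks:spark-csv_2.10:1.0.3
# The sparkR shell pre-creates `sc` and `sqlContext`.

# Read a CSV through the Databricks spark-csv source; extra options
# such as header and inferSchema are passed through as strings.
df <- read.df(sqlContext, "cars.csv",
              source = "com.databricks.spark.csv",
              header = "true", inferSchema = "true")
printSchema(df)

# If schema inference guesses wrong, pin the schema explicitly instead
# of relying on inference ("model"/"mpg" are hypothetical columns):
schema <- structType(structField("model", "string"),
                     structField("mpg", "double"))
df2 <- read.df(sqlContext, "cars.csv",
               source = "com.databricks.spark.csv",
               header = "true", schema = schema)
```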
>>
>> From: Wei Zhou
>> Date: Thursday, June 25, 2015 at 4:38 PM
>> To: Aleksander Eskilson
>> Cc: "shiva...@eecs.berkeley.edu", "user@spark.apache.org"
>> Subject: Re: sparkR could not find function "textFile"
>>
>> I tried out the solution using the spark-csv package, and it worked fine
>> now :) Thanks. Yes, I'm playing with a file
>
> From: Wei Zhou
> Date: Thursday, June 25, 2015 at 4:15 PM
> To: "shiva...@eecs.berkeley.edu"
> Cc: Aleksander Eskilson, "user@spark.apache.org"
> Subject: Re: sparkR could not find function "textFile"
>
>>>>> The RDD API was made private, as the devs felt many of its functions
>>>>> were too low level. They focused instead on finishing the DataFrame
>>>>> API, which supports local, HDFS, and Hive/HBase file reads. In the
>>>>> meantime, the devs are working to determine which functions of the RDD
>>>>> API, if any, should be made public again. You can see the rationale
>>>>> behind this decision on the issue's JIRA [1].
>>>>>
>>>>> You can still make use of those now private RDD functions by
>>>>> prepending the function call with the SparkR private namespace, for
>>>>> example, you'd use SparkR:::textFile(…).
>>>>>
>>>>> Hope that helps,
>>>>> Alek
>>>>>
>>>>> [1] -- https://issues.apache.org/jira/browse/SPARK-7230
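[Editor's note: a minimal sketch of the private-namespace workaround described above, assuming the Spark 1.4-era sparkR shell where `sc` already exists; README.md is just an example path.]

```r
# textFile() and friends are no longer exported from SparkR, but R's
# ':::' operator reaches unexported objects in a package's namespace.
lines <- SparkR:::textFile(sc, "README.md")

# Other now-private RDD functions are reached the same way:
n     <- SparkR:::count(lines)    # number of lines in the file
first <- SparkR:::take(lines, 1L) # first line, as a list
```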
Hi Alek,
Just a follow-up question. This is what I did in the sparkR shell:
lines <- SparkR:::textFile(sc, "./README.md")
> From: Wei Zhou
> Date: Thursday, June 25, 2015 at 3:33 PM
> To: "user@spark.apache.org"
> Subject: sparkR could not find function "textFile"
>
> Hi all,
> I am exploring sparkR by activating the shell and following the tutorial
> here: https://amplab-extras.github.io/SparkR-pkg/
> When I tried to read in a local file with textFile(sc, "file_location"),
> it gives the error: could not find function "textFile".
> By reading through the sparkR doc for 1.4,