RE: using R with Spark

2017-09-24 Thread Adaryl Wakefield
> There are other approa…

Re: using R with Spark

2017-09-24 Thread Felix Cheung
> It is free for use, might need RStudio Server depending on which Spark master you choose.

Yeah, I think that's where my confusion is coming from. I'm looking at a cheat sheet. For connecting to a YARN cluster, the first step is: 1. I…
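The cheat-sheet step being discussed here can be sketched roughly as follows. This is a sketch under stated assumptions, not the cheat sheet's exact text: it assumes Spark is installed on the client machine, that `SPARK_HOME` is set (the path shown is a placeholder), and that the cluster runs YARN. None of this requires RStudio, only the open-source sparklyr package.

```r
# Hedged sketch: connecting sparklyr to a YARN-managed cluster from a
# plain R session. The SPARK_HOME path below is a placeholder, not a
# value from the thread.
library(sparklyr)

Sys.setenv(SPARK_HOME = "/usr/lib/spark")    # assumption: local Spark install
sc <- spark_connect(master = "yarn-client")  # YARN client mode

# ...run dplyr verbs or ML functions against the cluster via `sc`...

spark_disconnect(sc)
```

The point relevant to the thread: which `master` you pass (`"local"`, `"yarn-client"`, a standalone URL) is what determines whether anything beyond the free sparklyr package is involved.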

RE: using R with Spark

2017-09-24 Thread Adaryl Wakefield
> No. It is free for use, might need RStudio Server depending on which Spark master you choose.

Re: using R with Spark

2017-09-24 Thread Jules Damji
> …the R shell without RStudio (but you probably want an IDE)
>
> There are two packages, SparkR and sparklyr. Sparklyr seems to be the mor…

Re: using R with Spark

2017-09-24 Thread Georg Heiler
No. It is free for use, might need RStudio Server depending on which Spark master you choose.

> …(but you probably want an IDE)
>
> There are two packages, SparkR and sparklyr. Sparklyr seems to be the more useful. However, do you have to pay to use it?

Re: using R with Spark

2017-09-24 Thread Felix Cheung
Both are free to use; you can use sparklyr from the R shell without RStudio (but you probably want an IDE).

> There are two packages, SparkR and sparklyr. Sparklyr se…
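What "sparklyr from the R shell without RStudio" looks like in practice can be sketched like this, assuming only base R and the open-source sparklyr package are installed (a local Spark master is used so no cluster is needed; the table name is illustrative):

```r
# Minimal sketch: sparklyr from a plain R shell, no RStudio involved.
# sparklyr is Apache-2.0 licensed; only RStudio's commercial products
# cost money, and none are required here.
library(sparklyr)
library(dplyr)

# spark_install() can download a local copy of Spark if none is present.
sc <- spark_connect(master = "local")

# Copy a built-in data frame to Spark and query it with dplyr verbs.
mtcars_tbl <- copy_to(sc, mtcars, "mtcars_spark")
mtcars_tbl %>% group_by(cyl) %>% summarise(avg_mpg = mean(mpg))

spark_disconnect(sc)
```

The dplyr verbs are translated to Spark SQL and executed by Spark, which is the main reason sparklyr is often described as the more convenient of the two packages for dplyr users.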

using R with Spark

2017-09-24 Thread Adaryl Wakefield
There are two packages, SparkR and sparklyr. Sparklyr seems to be the more useful. However, do you have to pay to use it? Unless I'm not reading this right, it seems you have to have the paid version of RStudio to use it.

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.66…
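For comparison with the sparklyr answers above: SparkR, the other package mentioned in the question, ships inside the Spark distribution itself and is likewise free. A rough sketch, assuming `SPARK_HOME` points at a Spark 2.x installation (the `lib.loc` trick loads the bundled copy of the package):

```r
# Hedged sketch: the same kind of aggregation with SparkR, which is
# bundled with Spark and needs no RStudio product at all.
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))
sparkR.session(master = "local")

# Convert a local data frame to a Spark DataFrame and aggregate it.
df <- as.DataFrame(mtcars)
head(summarize(groupBy(df, df$cyl), avg_mpg = mean(df$mpg)))

sparkR.session.stop()
```

So the cost question in this message has the same answer for both packages: neither requires a paid RStudio license.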