Well,

I have a personal project where I want to build a *spreadsheet* on top of
Spark.
I have a version of my app running on PostgreSQL, which does not scale, and
would like to move data processing to Spark.
You can import, explore, analyze, and visualize data ...
You don't need to be an advanced technical user to use it.
I believe it would be much easier to use Spark than PostgreSQL for this
kind of dynamic data exploration.
For instance, a spreadsheet formula can be implemented with a simple .map
call on an RDD.
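
For example, a rough sketch (assuming each spreadsheet row is held as a
tuple in an RDD, and sc is an existing SparkContext):

  # hypothetical layout: one (a, b) tuple per spreadsheet row
  rows = sc.parallelize([(1, 2), (3, 4), (5, 6)])
  # formula "C = A * B": append a computed column to every row
  with_c = rows.map(lambda row: row + (row[0] * row[1],))
  print(with_c.collect())  # [(1, 2, 2), (3, 4, 12), (5, 6, 30)]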

However, I need some kind of live connection to Spark to work with RDDs.
This is what makes me wonder how you can talk to Spark from a web app.

On Sun, Feb 28, 2016 at 11:36 PM, ayan guha <guha.a...@gmail.com> wrote:

> I believe you are looking for something like Spark Jobserver for running
> jobs and the JDBC server for accessing data? I am curious to know more
> about this; any further discussion would be very helpful.
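>
> For context, a rough sketch of what driving Spark Jobserver from Python
> over its REST API might look like (endpoints and default port are from the
> spark-jobserver README; host, jar and class names are placeholders):
>
>   import requests
>
>   base = "http://jobserver-host:8090"  # assumed Jobserver address
>   # upload the application jar under the app name "myapp"
>   with open("myapp.jar", "rb") as f:
>       requests.post(base + "/jars/myapp", data=f.read())
>   # submit a job from a class packaged in that jar
>   r = requests.post(base + "/jobs",
>                     params={"appName": "myapp",
>                             "classPath": "com.example.MyJob"})
>   print(r.json())  # job id and status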
>
> On Mon, Feb 29, 2016 at 6:06 AM, Luciano Resende <luckbr1...@gmail.com>
> wrote:
>
>> One option we have used in the past is to expose Spark application
>> functionality via REST; this would enable Python, or any other client
>> capable of making an HTTP request, to integrate with your Spark application.
>>
>> To get you started, this might be a useful reference
>>
>>
>> http://blog.michaelhamrah.com/2013/06/scala-web-apis-up-and-running-with-spray-and-akka/
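>>
>> That post is Scala/Spray, but the same pattern works from Python. A
>> minimal sketch (assuming Flask for the HTTP layer, with the web process
>> acting as the Spark driver; the master URL is a placeholder):
>>
>>   from flask import Flask, jsonify
>>   from pyspark import SparkConf, SparkContext
>>
>>   conf = (SparkConf().setAppName("rest-demo")
>>           .setMaster("spark://master-host:7077"))
>>   sc = SparkContext(conf=conf)  # this web process becomes the driver
>>   app = Flask(__name__)
>>
>>   @app.route("/sum/<int:n>")
>>   def sum_to_n(n):
>>       # run a small job on the cluster and return the result over HTTP
>>       return jsonify(result=sc.parallelize(range(1, n + 1)).sum())
>>
>>   if __name__ == "__main__":
>>       app.run(port=8080)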
>>
>>
>> On Sun, Feb 28, 2016 at 10:38 AM, moshir mikael <moshir.mik...@gmail.com>
>> wrote:
>>
>>> Ok,
>>> but what do I need for the program to run?
>>> In Python, sc = SparkContext(conf=conf) only works when you have
>>> Spark installed locally.
>>> AFAIK there is no *pyspark* package for Python that you can install
>>> with pip install pyspark.
>>> You actually need to install Spark to get it running (e.g.:
>>> https://github.com/KristianHolsheimer/pyspark-setup-guide).
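>>>
>>> For what it's worth, the setup guides seem to boil down to something
>>> like this sketch (the install path is a placeholder, and findspark is a
>>> third-party helper package, not part of Spark):
>>>
>>>   import findspark
>>>   findspark.init("/path/to/spark")  # point Python at SPARK_HOME
>>>
>>>   from pyspark import SparkContext  # now importable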
>>>
>>> Does it mean you need to install Spark on the box your application runs
>>> on to benefit from pyspark, and that this is required to connect to
>>> another remote Spark cluster?
>>> Am I missing something obvious?
>>>
>>>
>>> On Sun, Feb 28, 2016 at 7:01 PM, Todd Nist <tsind...@gmail.com> wrote:
>>>
>>>> Define your SparkConf to set the master:
>>>>
>>>>   val conf = new SparkConf().setAppName(AppName)
>>>>     .setMaster(SparkMaster)
>>>>     .set(....)
>>>>
>>>> Where SparkMaster = "spark://SparkServerHost:7077". So if your Spark
>>>> server's hostname is "RADTech" then it would be "spark://RADTech:7077".
>>>>
>>>> Then when you create the SparkContext, pass the SparkConf to it:
>>>>
>>>>     val sparkContext = new SparkContext(conf)
>>>>
>>>> Then use the sparkContext to interact with the Spark master / cluster.
>>>> Your program basically becomes the driver.
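>>>>
>>>> Since your app is Python, the PySpark equivalent is a few lines (a
>>>> minimal sketch; host and port are placeholders):
>>>>
>>>>   from pyspark import SparkConf, SparkContext
>>>>
>>>>   conf = (SparkConf().setAppName("MyApp")
>>>>           .setMaster("spark://SparkServerHost:7077"))
>>>>   sc = SparkContext(conf=conf)
>>>>   print(sc.parallelize([1, 2, 3]).count())  # quick sanity check: 3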
>>>>
>>>> HTH.
>>>>
>>>> -Todd
>>>>
>>>> On Sun, Feb 28, 2016 at 9:25 AM, mms <moshir.mik...@gmail.com> wrote:
>>>>
>>>>> Hi, I cannot find a simple example showing how a typical application
>>>>> can 'connect' to a remote Spark cluster and interact with it. Let's say I
>>>>> have a Python web application hosted somewhere *outside* a Spark
>>>>> cluster, with just Python installed on it. How can I talk to Spark
>>>>> without using a notebook, or using ssh to connect to a cluster master
>>>>> node? I know of spark-submit and spark-shell; however, forking a process
>>>>> on a remote host to execute a shell script seems like a lot of effort.
>>>>> What are the recommended ways to connect to and query Spark from a
>>>>> remote client? Thanks!
>>>>>
>>>>
>>>>
>>
>>
>> --
>> Luciano Resende
>> http://people.apache.org/~lresende
>> http://twitter.com/lresende1975
>> http://lresende.blogspot.com/
>>
>
>
>
> --
> Best Regards,
> Ayan Guha
>
