Hi Alex,
Thanks for the link, I'll check it out.
Does anyone know of a more streamlined approach?
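
In the meantime, for anyone else following this thread, below is a rough sketch of what calling that REST submission endpoint from Python could look like. It is only a guess based on the RestSubmissionServer source Alex linked: it assumes a standalone master with the REST gateway listening on port 6066, and the host name, jar path, main class, Spark version string and JSON field names are placeholders that should be checked against your own cluster, since the API is undocumented.

# Sketch only: submit an application to a standalone master through the REST
# submission gateway (the server implemented in RestSubmissionServer.scala).
# Assumptions: REST endpoint enabled on port 6066, the jar is reachable by the
# cluster (e.g. on HDFS), and the JSON fields below match what this Spark
# version expects -- the API is undocumented, so verify against the source.
import json
import urllib.request

MASTER_REST_URL = "http://spark-master:6066"  # placeholder host

payload = {
    "action": "CreateSubmissionRequest",
    "appResource": "hdfs:///apps/my-app.jar",   # placeholder jar path
    "mainClass": "com.example.MyApp",           # placeholder main class
    "appArgs": ["arg1", "arg2"],
    "clientSparkVersion": "1.6.0",              # placeholder version string
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "sparkProperties": {
        "spark.app.name": "MyApp",
        "spark.master": "spark://spark-master:6066",
        "spark.jars": "hdfs:///apps/my-app.jar",
        "spark.submit.deployMode": "cluster",
    },
}

req = urllib.request.Request(
    MASTER_REST_URL + "/v1/submissions/create",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json;charset=UTF-8"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read().decode("utf-8"))

print(result)  # on success this should contain a submissionId

# The driver state can then presumably be polled (again, unverified):
#   GET {MASTER_REST_URL}/v1/submissions/status/<submissionId>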




On Mon, Feb 29, 2016 at 10:28 AM, Alex Dzhagriev <dzh...@gmail.com> wrote:

> Hi Moshir,
>
> I think you can use the REST API provided with Spark:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionServer.scala
>
> Unfortunately, I haven't found any documentation, but it looks fine.
> Thanks, Alex.
>
> On Sun, Feb 28, 2016 at 3:25 PM, mms <moshir.mik...@gmail.com> wrote:
>
>> Hi, I cannot find a simple example showing how a typical application can
>> 'connect' to a remote Spark cluster and interact with it. Let's say I have
>> a Python web application hosted somewhere *outside* a Spark cluster,
>> with just Python installed on it. How can I talk to Spark without using a
>> notebook, or using ssh to connect to a cluster master node? I know of
>> spark-submit and spark-shell; however, forking a process on a remote host
>> to execute a shell script seems like a lot of effort. What are the
>> recommended ways to connect to and query Spark from a remote client? Thanks!
>>
>
>
