IronPython shares only its syntax with Python, at best. It is a scripting
language for the .NET framework; many applications embed it to script the
application itself. That won't work for you here: PySpark drives a regular
Python interpreter, not the .NET runtime. You can use pipes, or write your
Spark jobs in Java/Scala/R and submit them from your .NET application.
Alternatively, you create Java Spark jobs as a skeleton and call your .NET
libraries via JNI. Which option fits depends a little bit on your use case.
From an architecture point of view you should be careful about memory
management in these setups...
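
For the pipe approach, here is a minimal sketch in Scala. It assumes a
self-contained .NET console executable (hypothetically
/opt/tools/my-dotnet-tool) installed on every worker node, reading records
from stdin and writing one result per line to stdout; the HDFS paths are
placeholders as well:

    import org.apache.spark.{SparkConf, SparkContext}

    object PipeToDotNet {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("pipe-to-dotnet"))

        // Placeholder input path.
        val input = sc.textFile("hdfs:///data/input")

        // Streams each partition's records through the external .NET process.
        // The executable is hypothetical and must exist on all workers.
        val results = input.pipe("/opt/tools/my-dotnet-tool")

        // Placeholder output path.
        results.saveAsTextFile("hdfs:///data/output")
        sc.stop()
      }
    }

You would package this as a jar and run it with spark-submit; the .NET side
stays a plain console application, which is also where the memory-management
caveat above comes in, since the external processes live outside the JVM heap.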

On Thu, Jul 2, 2015 at 10:33, Zwits <daniel.van...@ortec-finance.com>
wrote:

> I'm currently looking into a way to run a program/code (a DAG) written in
> .NET on a cluster using Spark. However, I ran into problems concerning the
> coding language: Spark has no .NET API.
> I tried looking into IronPython because Spark does have a Python API, but I
> couldn't find a way to use it.
>
> Is there a way to implement a DAG of jobs on a cluster using a .NET
> programming language?
