Thanks for the clarification.
Matei Zaharia [via Apache Spark Developers List] <
ml+s1001551n21526...@n3.nabble.com> wrote on Mon., May 8, 2017 at 03:51:
More specifically, many user applications that link to Spark also linked to
Akka as a library (e.g. say you want to write a service that receives requests
from Akka and runs them on Spark). In that case, you'd have two conflicting
versions of the Akka library in the same JVM.
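For illustration, a minimal build.sbt where that clash shows up (the
version numbers here are only examples):

    // build.sbt -- illustrative versions only
    libraryDependencies ++= Seq(
      // pre-2.0 Spark transitively pulls in its own Akka (2.3.x)
      "org.apache.spark" %% "spark-core" % "1.6.3",
      // while the application's service layer wants a newer Akka
      "com.typesafe.akka" %% "akka-actor" % "2.4.17"
    )
    // Only one akka-actor can end up on the classpath, and Akka 2.3
    // and 2.4 are not binary compatible, so whichever side expected
    // the other version fails at runtime.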
Matei
The point is that Spark's prior usage of Akka was limited enough that it
could fairly easily be removed entirely instead of forcing particular
architectural decisions on Spark's users.
On Sun, May 7, 2017 at 1:14 PM, geoHeil wrote:
Thank you!
In the issue they outline that hard-wired dependencies were the problem.
But wouldn't one want to avoid accepting messages directly from an actor and
instead use Kafka as a failsafe intermediary?
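Roughly what I have in mind (a sketch only; the topic name and servers are
placeholders, and it assumes the spark-sql-kafka-0-10 package is on the
classpath):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("kafka-intermediary").getOrCreate()

    // Actors publish to Kafka; Spark consumes from Kafka instead of
    // accepting messages from the actor system directly.
    val messages = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // placeholder
      .option("subscribe", "events")                        // hypothetical topic
      .load()
      .selectExpr("CAST(value AS STRING) AS message")

    // For the sketch, just echo the stream to the console.
    val query = messages.writeStream.format("console").start()
    query.awaitTermination()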
zero323 [via Apache Spark Developers List] <
ml+s1001551n21523...@n3.nabble.com> wrote on Sun., May 7, 2017:
https://issues.apache.org/jira/browse/SPARK-5293
On 05/07/2017 08:59 PM, geoHeil wrote:
Hi,
I am curious why Spark (completely, as of 2.0) removed all Akka dependencies
for RPC and switched entirely to (as far as I know) Netty.
Regards,
Georg