This is a general question about whether Spark Streaming can be interactive
like batch Spark jobs. I've read plenty of threads and done a fair bit of
experimentation, and I'm thinking the answer is NO, but it does not hurt to
ask.
More specifically, I would like to be able to do:
1. Add/Remove st
Note that what TD was referring to above is already in 1.0.0:
http://spark.apache.org/docs/1.0.0/streaming-custom-receivers.html
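For reference, a minimal sketch of a custom receiver along the lines of that guide (host/port and the socket source are illustrative; the real guide uses a similar socket example):

```scala
import java.io.{BufferedReader, InputStreamReader}
import java.net.Socket
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// A simple receiver that reads lines of text from a socket and
// hands them to Spark Streaming via store().
class CustomReceiver(host: String, port: Int)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  def onStart(): Unit = {
    // Start a thread that receives data; onStart must not block.
    new Thread("Custom Socket Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  def onStop(): Unit = {
    // The receive loop checks isStopped(), so nothing extra to do here.
  }

  private def receive(): Unit = {
    try {
      val socket = new Socket(host, port)
      val reader = new BufferedReader(new InputStreamReader(socket.getInputStream))
      var line = reader.readLine()
      while (!isStopped && line != null) {
        store(line) // push each line into Spark's block manager
        line = reader.readLine()
      }
      reader.close()
      socket.close()
      restart("Trying to connect again")
    } catch {
      case e: java.net.ConnectException => restart("Could not connect to " + host, e)
      case t: Throwable                 => restart("Error receiving data", t)
    }
  }
}
```

You then plug it in with `ssc.receiverStream(new CustomReceiver("localhost", 9999))`.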
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/custom-receiver-in-java-tp3575p6962.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I'm running a 1.0.0 standalone cluster based on amplab/dockerscripts with 3
workers. I'm testing out spark-submit and I'm getting errors when using
*--deploy-mode cluster* with an http:// URL for my JAR. I get the
following error back.
Sending launch command to spark://master:7077
Driver succ
Are there any workarounds for this? Seems to be a dead end so far.
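For context, a sketch of the kind of invocation being described (the class name and JAR URL are hypothetical placeholders, not from the original post):

```shell
# Submit to a standalone master in cluster deploy mode,
# pointing at a JAR served over HTTP rather than a local file.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  http://some-host/path/to/my-app.jar
```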
---
---
I'm running on Spark 1.0.0 and I see a similar problem when using the
socketTextStream receiver. The ReceiverTracker task sticks around after an
ssc.stop(false).
---
Sorry about the screenshot… but that is what I have handy right now. You can
see that we get a WARN and it ultimately says that it stopped successfully.
When looking at the application in the Spark UI, it still shows the
ReceiverTracker task running.
It is easy to recreate. On the spark repl we are r
I've seen this exact same problem too and I've been ignoring it, but I wonder
if I'm losing data. Can anyone at least comment on this?
---
Not sure if this problem reached the Spark guys, because Nabble shows
that "This post has NOT been accepted by the mailing list yet".
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFound-for-user-class-in-uber-jar-td10613.html#a11902
I'm resubmitting.
Greetings,
I'm currentl
This was posted on the Dev list, but it is very relevant to the user list as
well…
--
We are happy to announce a developer preview of the Spark Kernel which
enables remote applications to dynamically interact