Okie doke. Thanks for the confirmation, Burak and Tathagata.
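
For the archives, here is roughly what that looks like in the shell. This is
just a sketch: the 2-second batch interval and the socket source are
placeholders, and the point is only that the new StreamingContext is built on
the still-running SparkContext (sc):

import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build a fresh StreamingContext on top of the shell's live SparkContext.
val ssc2 = new StreamingContext(sc, Seconds(2))

// Re-declare the DStream graph before starting (placeholder source/output).
val lines = ssc2.socketTextStream("localhost", 9999)
lines.print()

ssc2.start()
// ... and later, stop it again while keeping the SparkContext alive:
ssc2.stop(stopSparkContext = false, stopGracefully = true)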

On Thu, Jul 10, 2014 at 2:23 AM, Tathagata Das <tathagata.das1...@gmail.com>
wrote:

> I confirm that is indeed the case. It is designed this way because it
> keeps things simpler - fewer chances of issues related to cleanup when
> stop() is called. It also keeps things consistent with the SparkContext -
> once a SparkContext is stopped, it cannot be used anymore.
>
> You can create a new StreamingContext object, set it up, and use it.
>
> TD
> On Jul 9, 2014 10:40 PM, "Burak Yavuz" <bya...@stanford.edu> wrote:
>
>> Someone can correct me if I'm wrong, but unfortunately for now, once a
>> streaming context is stopped, it can't be restarted.
>>
>> ----- Original Message -----
>> From: "Nick Chammas" <nicholas.cham...@gmail.com>
>> To: u...@spark.incubator.apache.org
>> Sent: Wednesday, July 9, 2014 6:11:51 PM
>> Subject: Restarting a Streaming Context
>>
>> So I do this from the Spark shell:
>>
>> // set things up
>> // <snipped>
>>
>> ssc.start()
>> // let things happen for a few minutes
>>
>> ssc.stop(stopSparkContext = false, stopGracefully = true)
>>
>> Then I want to restart the Streaming Context:
>>
>> ssc.start() // still in the shell; Spark Context is still alive
>>
>> Which yields:
>>
>> org.apache.spark.SparkException: StreamingContext has already been stopped
>>
>> How come? Is there any way in the interactive shell to restart a Streaming
>> Context once it is stopped?
>>
>> Nick
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Restarting-a-Streaming-Context-tp9256.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
