You could try using the Akka actor system with Apache Spark if you are
intending to use it in an online / interactive job execution scenario.

On Sat, Nov 14, 2015, 08:19 Sabarish Sasidharan <
sabarish.sasidha...@manthan.com> wrote:

> You are probably trying to access the Spring context from the executors
> after initializing it at the driver, and running into serialization issues.
>
> You could instead use mapPartitions() and initialize the Spring context
> from within it, so it is created on each executor rather than serialized.
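[Editor's note: a minimal sketch of the idea above, in plain Java since no Spark code was posted in the thread. The non-serializable object (standing in for a Spring ApplicationContext) lives in a static holder and is built lazily inside each executor JVM, instead of being captured in a closure and serialized from the driver. The class and method names here are illustrative, not from the original thread.]

```java
// Sketch: per-JVM lazy holder. A 'static' field is never serialized with a
// closure; each executor JVM builds its own copy on first use.
public class ContextHolder {

    // One instance per JVM; 'volatile' for safe double-checked locking.
    private static volatile Object context;

    public static Object get() {
        if (context == null) {
            synchronized (ContextHolder.class) {
                if (context == null) {
                    // In real code this would be something like:
                    //   new ClassPathXmlApplicationContext("beans.xml")
                    context = buildContext();
                }
            }
        }
        return context;
    }

    private static Object buildContext() {
        return new Object(); // placeholder for the Spring context
    }
}
```

Inside mapPartitions() you would call ContextHolder.get() once per partition and reuse the returned context for every record in that partition, so the context is created at most once per executor JVM and never crosses the wire.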
>
> That said, I don't think that will solve all of your issues, because you
> won't be able to use the other rich transformations in Spark.
>
> I am afraid these two don't gel that well, unless all your context
> lookups for beans happen in the driver.
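[Editor's note: a small sketch of the driver-side-lookup pattern suggested above, again with illustrative names. The point is that a closure may safely capture a plain serializable value read from a bean at the driver, as long as the bean and the BeanFactory themselves stay out of the closure.]

```java
import java.io.Serializable;
import java.util.function.Function;

public class DriverSideLookup {

    // 'greeting' stands in for a value read from a Spring bean at the driver.
    // The lambda captures only this String (serializable), not the bean or
    // the BeanFactory, so Spark could ship it to executors without error.
    static Function<String, String> greeter(String greeting) {
        return (Function<String, String> & Serializable) name -> greeting + name;
    }

    public static void main(String[] args) {
        System.out.println(greeter("hello, ").apply("spark"));
    }
}
```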
>
> Regards
> Sab
> On 13-Nov-2015 4:17 pm, "Netai Biswas" <mail2efo...@gmail.com> wrote:
>
>> Hi,
>>
>> I am facing an issue while integrating Spark with Spring.
>>
>> I am getting "java.lang.IllegalStateException: Cannot deserialize
>> BeanFactory with id" errors for all beans. I have tried a few solutions
>> available on the web. Please help me solve this issue.
>>
>> Few details:
>>
>> Java : 8
>> Spark : 1.5.1
>> Spring : 3.2.9.RELEASE
>>
>> Please let me know if you need more information or any sample code.
>>
>> Thanks,
>> Netai
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-and-Spring-Integrations-tp25375.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
