I took a look at the spring-spark project. I still don't see how the bean
factory would be available on the executor when the deserializer needs it.
Unless I am horribly wrong, you would need to modify the SpringBuilder
class in the spring-spark project and initialize the Spring context if the
static BEAN_FACTORY is null.
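
Roughly something like this (only the lazy null check is my point; the
exact fields, types and configuration in SpringBuilder are my guesses
here, not the project's actual code, and AppConfig is a placeholder):

import org.springframework.beans.factory.BeanFactory;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class SpringBuilder {

    private static volatile BeanFactory BEAN_FACTORY;

    public static BeanFactory getBeanFactory() {
        if (BEAN_FACTORY == null) { // lazily initialize on the executor JVM
            synchronized (SpringBuilder.class) {
                if (BEAN_FACTORY == null) {
                    // AppConfig stands in for whatever configuration
                    // the project actually loads
                    BEAN_FACTORY = new AnnotationConfigApplicationContext(AppConfig.class);
                }
            }
        }
        return BEAN_FACTORY;
    }
}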

Alternatively, you could use a simpler approach, as shown below:

rdd.mapPartitions(records -> {
    // initialize the Spring context if not initialized already
    // and store it in a static variable
    ApplicationContext context = SingletonApplicationContext.getSpringContext();
    // note that all dependencies of the bean will be autowired
    SpringBean springBean = context.getBean("springBean", SpringBean.class);
    List<String> results = new ArrayList<>();
    while (records.hasNext()) {
        results.add(springBean.doSomething(records.next()));
    }
    return results; // Spark 1.x mapPartitions expects an Iterable
});

You would need to come up with the SingletonApplicationContext class, which
creates a singleton Spring context in a thread-safe way.
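
A minimal sketch of what that class could look like, assuming an
annotation-based configuration (again, AppConfig is a placeholder for your
own @Configuration class):

import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public final class SingletonApplicationContext {

    private static volatile ApplicationContext context;

    private SingletonApplicationContext() {}

    public static ApplicationContext getSpringContext() {
        // double-checked locking: the context is created at most once per JVM
        if (context == null) {
            synchronized (SingletonApplicationContext.class) {
                if (context == null) {
                    context = new AnnotationConfigApplicationContext(AppConfig.class);
                }
            }
        }
        return context;
    }
}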

Because executors are long-lived JVMs, your Spring context will only be
created once per JVM, irrespective of how many transformations/actions you
call on your RDDs. So, unlike what I said before, you could use
mapPartitions() or map() or really any transformation/action. But all your
functions would have to repeat the first two lines (i.e. initialize the
context if not present and acquire the Spring bean). You could think of
encapsulating that somehow so it is easier to do; one possibility is
sketched below.
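
For example, a small helper (SpringBeans is a hypothetical name, not an
existing class) could hide both steps behind one call:

public final class SpringBeans {
    // look the bean up from the lazily created singleton context
    public static <T> T get(String name, Class<T> type) {
        return SingletonApplicationContext.getSpringContext().getBean(name, type);
    }
}

Inside any function, the two lines then collapse to one:

    SpringBean springBean = SpringBeans.get("springBean", SpringBean.class);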

Regards
Sab

On Sun, Nov 15, 2015 at 11:50 AM, Netai Biswas <mail2efo...@gmail.com>
wrote:

> Hi,
>
> Thanks for your response. I would like to explain what exactly we are
> trying to achieve. I am not sure if we can use mapPartitions() here
> or not.
>
> Sample Code:
>
> @Autowired
> private SpringBean springBean;
>
> public void test() throws Exception {
>     SparkConf conf = new SparkConf().setAppName("APP").setMaster(masterURL);
>     conf.set("spark.serializer",
>             "de.paraplu.springspark.serialization.SpringAwareSerializer");
>     sc = new JavaSparkContext(conf);
>
>     sc.parallelize(list).foreach(new VoidFunction<String>() {
>         private static final long serialVersionUID = 1L;
>
>         @Override
>         public void call(String t) throws Exception {
>             springBean.someAPI(t); // here we will have a db transaction as well
>         }
>     });
> }
>
> Thanks,
> Netai
>
> On Sat, Nov 14, 2015 at 9:49 PM, Sabarish Sasidharan <
> sabarish.sasidha...@manthan.com> wrote:
>
>> You are probably trying to access the Spring context from the executors
>> after initializing it at the driver, and running into serialization issues.
>>
>> You could instead use mapPartitions() and initialize the spring context
>> from within that.
>>
>> That said, I don't think that will solve all of your issues, because you
>> won't be able to use the other rich transformations in Spark.
>>
>> I am afraid these two don't gel that well, unless all your bean lookups
>> from the context happen in the driver.
>>
>> Regards
>> Sab
>> On 13-Nov-2015 4:17 pm, "Netai Biswas" <mail2efo...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am facing an issue while integrating Spark with Spring.
>>>
>>> I am getting "java.lang.IllegalStateException: Cannot deserialize
>>> BeanFactory with id" errors for all beans. I have tried a few solutions
>>> available on the web. Please help me out with this issue.
>>>
>>> A few details:
>>>
>>> Java : 8
>>> Spark : 1.5.1
>>> Spring : 3.2.9.RELEASE
>>>
>>> Please let me know if you need more information or any sample code.
>>>
>>> Thanks,
>>> Netai
>>>
>>>
>>>
>


-- 

Architect - Big Data
Ph: +91 99805 99458

Manthan Systems | *Company of the year - Analytics (2014 Frost and Sullivan
India ICT)*