Hi

Not sure about Spring Boot, but if you try to use DI libraries you'll run
into serialization issues. I've had luck using an old version of Scaldi.
More recently, though, I've been passing the class types as arguments with
default values, and instantiating them inside the Spark code. So you're
basically passing and serializing a class name instead of the object
itself.
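Roughly like this (a minimal sketch; the trait and class names here are
made up for illustration, not from any real project):

  import org.apache.spark.sql.SparkSession

  // The pluggable dependency, defined against an interface.
  trait Enricher { def enrich(line: String): String }
  class DefaultEnricher extends Enricher {
    def enrich(line: String): String = line.toUpperCase
  }

  object MyJob {
    // The class name is an ordinary String argument with a default,
    // so it crosses into Spark closures as a serialized String.
    def run(spark: SparkSession,
            enricherClass: String = "DefaultEnricher"): Unit = {
      val result =
        spark.sparkContext.textFile("input.txt").mapPartitions { it =>
          // Instantiated on the executor from the class name; the
          // object itself is never serialized, only its name.
          val enricher = Class.forName(enricherClass)
            .getDeclaredConstructor()
            .newInstance()
            .asInstanceOf[Enricher]
          it.map(enricher.enrich)
        }
      result.saveAsTextFile("output")
    }
  }

A test can then pass the name of a stub implementation instead of
touching any wiring.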

> On Dec 28, 2016, at 1:55 PM, Lars Albertsson <la...@mapflat.com> wrote:
> 
> Do you really need dependency injection?
> 
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
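> 
> For example (a minimal sketch; the function and data below are made up
> to illustrate the point): keep the core logic as a pure function, and a
> test can call it directly on in-memory input, with no injection
> framework involved.
> 
>   // Pure transformation: the unit under test needs no DI container.
>   def latestEventPerUser(events: Seq[(String, Long)]): Map[String, Long] =
>     events.groupBy(_._1).map { case (user, es) =>
>       user -> es.map(_._2).max
>     }
> 
>   // A test just builds input and asserts on the output:
>   assert(latestEventPerUser(Seq(("u1", 1L), ("u1", 5L))) ==
>     Map("u1" -> 5L))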
> 
> Or do you want to use DI for other reasons?
> 
> 
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
> 
> 
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensou...@gmail.com> wrote:
>> Hello Community,
>> 
>> My current approach for Spark job development is Scala + SBT, building an
>> uber jar, with a YAML properties file to pass configuration parameters.
>> But if I would like to use dependency injection and microservice
>> development, like the Spring Boot features, in Scala, what would be the
>> standard approach?
>> 
>> Thanks
>> 
>> Chetan
