Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread darren
Lars, Thank you, I want to use DI for configuring all the properties (wiring) for the architectural approach below. Oracle -> Kafka Batch (Event Queuing) -> Spark Jobs (Incremental load from HBase -> Hive with Trans…

Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread Jiří Syrový
Hi, another nice approach is to use a Reader monad instead, together with a framework that supports this style (e.g. Grafter - https://github.com/zalando/grafter). It's lightweight and helps a bit with dependency issues. 2016-12-28 22:55 GMT+01:00 Lars Albertsson : > Do you really need dependency…
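The Reader-monad style Jiří mentions can be sketched without any framework (Grafter offers a richer, batteries-included version). This is a minimal illustration; `JobConfig` and its fields are hypothetical names, not part of any library:

```scala
// A minimal Reader monad: a computation that needs an environment Env
// to produce an A. Dependencies are threaded implicitly through flatMap.
case class Reader[Env, A](run: Env => A) {
  def map[B](f: A => B): Reader[Env, B] = Reader(env => f(run(env)))
  def flatMap[B](f: A => Reader[Env, B]): Reader[Env, B] =
    Reader(env => f(run(env)).run(env))
}

// Hypothetical configuration the jobs depend on.
case class JobConfig(kafkaBrokers: String, hiveDb: String)

object ReaderDemo {
  val brokers: Reader[JobConfig, String] = Reader(_.kafkaBrokers)
  val hiveDb: Reader[JobConfig, String]  = Reader(_.hiveDb)

  // Compose dependent steps; the config is supplied once, at the edge.
  val description: Reader[JobConfig, String] =
    for {
      b  <- brokers
      db <- hiveDb
    } yield s"reading from $b into $db"

  def main(args: Array[String]): Unit =
    println(description.run(JobConfig("broker:9092", "analytics")))
}
```

The key property for Spark is that `Reader` values are just functions, so nothing non-serializable has to be captured until `run` is called at the driver.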

Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread Chetan Khatri
Lars, Thank you, I want to use DI for configuring all the properties (wiring) for the architectural approach below. Oracle -> Kafka Batch (Event Queuing) -> Spark Jobs (Incremental load from HBase -> Hive with Transformation) -> Spark Transformation -> PostgreSQL Thanks. On Thu, Dec 29, 2016 at 3:2…
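The wiring Chetan asks about can also be done with plain constructor injection, keeping all configuration assembly in one place (which is exactly what a DI framework would automate). A hedged sketch; the stage and config names here are illustrative, not real APIs from the pipeline:

```scala
// Illustrative configs for two ends of the pipeline.
case class KafkaConf(brokers: String)
case class PgConf(url: String)

// Each stage receives its dependencies explicitly via the constructor.
class IncrementalLoad(kafka: KafkaConf) {
  def describe: String = s"load via ${kafka.brokers}"
}
class Sink(pg: PgConf) {
  def describe: String = s"write to ${pg.url}"
}

object PipelineWiring {
  // All "wiring" happens in one composition root.
  def wire(): (IncrementalLoad, Sink) = (
    new IncrementalLoad(KafkaConf("broker:9092")),
    new Sink(PgConf("jdbc:postgresql://db/warehouse"))
  )

  def main(args: Array[String]): Unit = {
    val (load, sink) = wire()
    println(load.describe + "; " + sink.describe)
  }
}
```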

Re: Dependency Injection and Microservice development with Spark

2016-12-28 Thread Miguel Morales
Hi, Not sure about Spring Boot, but if you try to use DI libraries you'll run into serialization issues. I've had luck using an old version of Scaldi. Recently, though, I've been passing the class types as arguments with default values; then in the Spark code they get instantiated. So you're basic…
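Miguel's pattern of passing types as arguments with defaults can be sketched as follows. Instead of a DI container, the job takes a factory parameter with a production default, and the dependency is instantiated inside the job so nothing non-serializable is captured in a closure. `Store` and its implementations are hypothetical:

```scala
// Hypothetical dependency the job needs.
trait Store { def write(s: String): String }
class PostgresStore extends Store { def write(s: String) = s"pg:$s" }
class InMemoryStore extends Store { def write(s: String) = s"mem:$s" }

object JobDemo {
  // The default supplies the production implementation; tests override it.
  // Passing a factory (not an instance) keeps the closure serializable.
  def runJob(records: Seq[String],
             mkStore: () => Store = () => new PostgresStore): Seq[String] = {
    val store = mkStore() // instantiated inside the job, not captured
    records.map(store.write)
  }

  def main(args: Array[String]): Unit = {
    println(runJob(Seq("a", "b")).mkString(","))                    // production path
    println(runJob(Seq("a"), () => new InMemoryStore).mkString(",")) // test path
  }
}
```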

Dependency Injection and Microservice development with Spark

2016-12-23 Thread Chetan Khatri
Hello Community, The current approach I am using for Spark job development is Scala + SBT and an uber JAR, with a yml properties file to pass configuration parameters. But if I would like to use Dependency Injection and microservice development (like Spring Boot's features) in Scala, then what would be the stan…
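The "yml properties file" starting point Chetan describes can be sketched as below. This hand-rolled parser only handles flat `key: value` pairs and exists purely for illustration; a real project would use SnakeYAML or Typesafe Config, and `AppConfig` and its fields are hypothetical:

```scala
// Illustrative typed view of the job's configuration.
case class AppConfig(kafkaBrokers: String, outputTable: String)

object ConfigDemo {
  // Parse flat "key: value" lines into a Map (illustration only).
  def parse(yml: String): Map[String, String] =
    yml.split("\n")
      .map(_.trim)
      .filter(l => l.nonEmpty && l.contains(":"))
      .map { l =>
        val Array(k, v) = l.split(":", 2) // split on first ':' only
        k.trim -> v.trim
      }.toMap

  def main(args: Array[String]): Unit = {
    val raw = "kafkaBrokers: broker:9092\noutputTable: public.events"
    val m   = parse(raw)
    val cfg = AppConfig(m("kafkaBrokers"), m("outputTable"))
    println(cfg.kafkaBrokers + " -> " + cfg.outputTable)
  }
}
```

DI frameworks essentially replace the manual `AppConfig(...)` assembly step with automated wiring, which is the gap the rest of the thread discusses.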