Hi,

I am using the Java API of Spark.

I wanted to know if there is a way to run some code in a manner that is
like the setup() and cleanup() methods of Hadoop Map/Reduce.

The reason I need it is that I want to read something from the DB for
each record I scan in my Function, and I would like to open the DB
connection once rather than once per record.
You can take a look at how JdbcRDD handles this (it opens a connection
when it starts computing a partition and closes it when the task
completes):
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala
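In the Java API the closest equivalent is mapPartitions: code before the
record loop acts like setup() and a finally block acts like cleanup(),
each running once per partition rather than once per record. A minimal
sketch, assuming a JavaRDD<String> of userIds called records, with a
placeholder JDBC URL and query (Spark 1.x Java API, where
FlatMapFunction returns an Iterable):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.FlatMapFunction;

JavaRDD<String> enriched = records.mapPartitions(
    new FlatMapFunction<Iterator<String>, String>() {
      @Override
      public Iterable<String> call(Iterator<String> userIds) throws Exception {
        // "setup()": runs once per partition, not once per record
        Connection conn =
            DriverManager.getConnection("jdbc:mysql://host/db"); // placeholder URL
        PreparedStatement stmt =
            conn.prepareStatement("SELECT info FROM users WHERE id = ?");
        List<String> out = new ArrayList<String>();
        try {
          while (userIds.hasNext()) {
            String userId = userIds.next();
            stmt.setString(1, userId);
            ResultSet rs = stmt.executeQuery();
            out.add(userId + "\t" + (rs.next() ? rs.getString(1) : ""));
            rs.close();
          }
        } finally {
          // "cleanup()": runs once per partition
          stmt.close();
          conn.close();
        }
        return out;
      }
    });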
2014-07-24 18:32 GMT+08:00 Yosi Botzer :

> Hi,
>
> I am using the Java API of Spark.
>
> I wanted to know if there is a way to run some code in a manner that is
> like the setup() and cleanup() methods of Hadoop Map/Reduce.
This article covers the topic:
http://www.mapr.com/developercentral/code/loading-hbase-tables-spark
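If you are hitting HBase rather than a JDBC source, the same
mapPartitions pattern applies. A rough sketch along those lines; the
"users" table, the "info"/"name" column, and the records RDD of userIds
are made-up placeholders, using the HBase client API of that era:

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.FlatMapFunction;

JavaRDD<String> enriched = records.mapPartitions(
    new FlatMapFunction<Iterator<String>, String>() {
      @Override
      public Iterable<String> call(Iterator<String> userIds) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "users"); // setup: one handle per partition
        List<String> out = new ArrayList<String>();
        try {
          while (userIds.hasNext()) {
            String userId = userIds.next();
            Result r = table.get(new Get(Bytes.toBytes(userId)));
            byte[] v = r.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
            // append the looked-up user info to the record
            out.add(userId + "\t" + (v == null ? "" : Bytes.toString(v)));
          }
        } finally {
          table.close(); // cleanup: close the handle once per partition
        }
        return out;
      }
    });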
2014-07-24 22:32 GMT+08:00 Yosi Botzer :

> In my case I want to reach HBase. For every record with userId I want to
> get some extra information about the user and add it to the record.