Thanks, that's what I was thinking.
But how do I set up a connection per worker?
On Monday, 25 July 2016, ayan guha wrote:
In order to use an existing pg UDF, you may create a view in pg and expose the
view to Hive.
The Spark-to-database connection happens from each executor, so you must have
a connection, or a pool of connections, per worker. Executors on the same
worker can share a connection pool.
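The "one pool per worker" advice is usually implemented with a lazily initialized singleton: the pool is created on first use inside the function passed to `mapPartitions`, so every task handled by that worker process reuses it instead of opening a connection per record. A minimal sketch of the pattern in plain Python (no Spark dependency; `make_pool` is a hypothetical stand-in for a real pool such as `psycopg2.pool.SimpleConnectionPool`):

```python
# Sketch: per-process singleton connection pool, the shape of what you
# would run inside rdd.mapPartitions on each Spark worker. The pool is
# created lazily on first use and reused by every later partition.

_pool = None  # module-level singleton: one per worker process


def make_pool():
    # Hypothetical stand-in for a real pg pool; counts pool creations
    # so the sketch can show that only one pool is ever built.
    make_pool.calls += 1
    return {"id": make_pool.calls}
make_pool.calls = 0


def get_pool():
    """Create the pool on first call, then hand back the same one."""
    global _pool
    if _pool is None:
        _pool = make_pool()
    return _pool


def process_partition(rows):
    """What you would pass to mapPartitions: fetch the shared pool once
    per partition, not one connection per row."""
    pool = get_pool()
    for row in rows:
        yield (row, pool["id"])


# Two "partitions" processed by the same worker share one pool:
out1 = list(process_partition([1, 2]))
out2 = list(process_partition([3]))
assert make_pool.calls == 1  # pool built once, then reused
```

The same idea in Scala is typically a `lazy val` inside an `object`, since an `object` is instantiated once per executor JVM.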
Best
Ayan
On 25 Jul 2016 16:
Hi all!
Among other use cases, I want to use Spark as a distributed SQL engine
via the Thrift server.
I have some tables in Postgres and Cassandra: I need to expose them via
Hive for custom reporting.
The basic implementation is simple and works, but I have some concerns and open
questions:
- is there a be