Hi guys,
I'm still trying to solve the issue with saving Hibernate entities from
Spark. After several attempts to redesign my own code, I ended up with a
HelloWorld example which clearly demonstrates that the problem is not the
complexity of my code or sessions being mixed across threads.
The code given below illustrates this.
I have a GenericDAO class which is initialized for each partition. This
class uses SessionFactory.openSession() to open a new session in its
constructor. As per my understanding, this means that each partition has a
different session, but they all use the same SessionFactory to open it.
I agree with Igor - I would either make sure the session is ThreadLocal
or, more simply, create the session at the start of the saveInBatch
method and close it at the end. Creating a SessionFactory is an expensive
operation, but creating a Session is a relatively cheap one.
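To sketch what that suggestion could look like: below is a minimal, hypothetical version of a GenericDAO whose saveInBatch opens and closes its own Session (the entity type, the batching threshold of 50, and the class shape are all my assumptions, not code from the thread):

```java
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Hypothetical DAO: holds only the thread-safe SessionFactory as state.
// A Session is opened per saveInBatch call and is never shared.
public class GenericDAO<T> {

    private final SessionFactory sessionFactory;

    public GenericDAO(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public void saveInBatch(List<T> entities) {
        Session session = sessionFactory.openSession(); // cheap, per call
        Transaction tx = session.beginTransaction();
        try {
            int i = 0;
            for (T entity : entities) {
                session.save(entity);
                if (++i % 50 == 0) {   // flush and clear periodically to
                    session.flush();   // keep the first-level cache small
                    session.clear();
                }
            }
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();           // always release the session
        }
    }
}
```

Since the Session never escapes the method, no two threads can ever share one, whichever thread or partition calls saveInBatch.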
On 6 Sep 2015 07:27,
How do you create your session? Do you reuse it across threads? How do you
create/close the session manager?
Look for the problem in session creation; probably something is
deadlocked. As far as I remember, a Hibernate Session should be created
per thread.
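To illustrate the per-thread rule, here is a small self-contained sketch of the ThreadLocal pattern. The Session and Factory classes are stand-ins (the real org.hibernate.Session needs a database), so only the scoping is demonstrated:

```java
import java.util.concurrent.ConcurrentHashMap;

public class ThreadLocalSessionDemo {

    // Stand-in for org.hibernate.Session; the real one needs a database.
    static class Session {}

    // Stand-in for the thread-safe, application-wide SessionFactory.
    static class Factory {
        Session openSession() { return new Session(); }
    }

    static final Factory FACTORY = new Factory();

    // Each thread lazily gets its own Session on first access.
    static final ThreadLocal<Session> SESSION =
            ThreadLocal.withInitial(FACTORY::openSession);

    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, Session> seen = new ConcurrentHashMap<>();
        Runnable task =
                () -> seen.put(Thread.currentThread().getName(), SESSION.get());

        Thread t1 = new Thread(task, "t1");
        Thread t2 = new Thread(task, "t2");
        t1.start(); t2.start();
        t1.join(); t2.join();

        // Two threads must end up with two distinct Session instances.
        if (seen.get("t1") == seen.get("t2")) {
            throw new AssertionError("sessions were shared across threads");
        }
        System.out.println("distinct sessions per thread: true");
    }
}
```

With real Hibernate you would also need to close each thread's session when the thread is done with it, which is why opening and closing inside the save method is the simpler option.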
On 6 September 2015 at 07:11, Zoran Jeremic wrote:
Hi,
I'm developing a long-running process that should find the RSS feeds that
all users in the system have registered to follow, parse these feeds,
extract new entries, and store them back to the database as Hibernate
entities so users can retrieve them. I want to use Apache Spark to enable
parallel processing.
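For the Spark side, a common pattern is to construct the DAO (and hence any Session it uses) inside foreachPartition, so it is created on the executor and never serialized or shared across threads. The sketch below assumes hypothetical FeedEntry, GenericDAO, and HibernateUtil.getSessionFactory() classes, which are my placeholders rather than code from this thread:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;

// Sketch only: FeedEntry, GenericDAO and HibernateUtil are assumed
// application classes, not part of Spark or Hibernate.
public class FeedSaveJob {

    static void saveEntries(JavaRDD<FeedEntry> entries) {
        entries.foreachPartition(iter -> {
            // Constructed on the executor, inside the partition:
            // one DAO per partition, never shared between threads.
            GenericDAO<FeedEntry> dao =
                    new GenericDAO<>(HibernateUtil.getSessionFactory());
            List<FeedEntry> batch = new ArrayList<>();
            while (iter.hasNext()) {
                batch.add(iter.next());
            }
            // Assuming saveInBatch opens and closes its own Session,
            // as suggested earlier in the thread.
            dao.saveInBatch(batch);
        });
    }
}
```

Anything captured by the lambda has to be serializable, which is another reason not to hold a Session (or SessionFactory built on the driver) as a field of a class that Spark ships to executors.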