I don't think you can share DBI connections across different instances of R.
Just have each of your helper functions open a local connection. Or, alternatively, load a package on each instance which keeps a DBI connection open. And make sure you bump up your allowed number of connections (max_connections in postgresql.conf) if you need to.

-Whit

On Mon, Jun 6, 2011 at 12:40 PM, Florian Endel <flor...@endel.at> wrote:
> Dear expeRts,
>
> I'm currently trying to get data from a PostgreSQL database _in parallel_.
> I tried two methods:
> * passing the DBI connection object to the cluster nodes
> * connecting to the DB on each node
>
> (1)
> The execution of the first method looks like this:
>> result = sfClusterApplyLB(input, fun, dbiCon)
> and produces an "expired PostgreSQLConnection" error.
> (Of course the passed connection object is usable at that moment and
> afterwards!)
>
> (2)
> For the creation of DB connections on every node, a function handling
> the whole connection is sourced into every node.
> This function works perfectly without snowfall.
> Calling it with
>> sfClusterEval(dbConnect())
> again, only expired connection objects are produced. Even if I create
> the connection 'a line above' the code which connects to the DB,
> it doesn't work...
>
> Is there a possibility to connect to PostgreSQL using snowfall?
>
> --
> with kind regards
> Florian
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
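P.S. An untested sketch of the "one connection per node" pattern described above. The dbname, host, user, table name, and query are placeholders I made up, not details from the original post:

```r
## Sketch: open a worker-local connection on every node instead of
## passing the DBI connection object (connections do not survive
## serialization, hence the "expired PostgreSQLConnection" error).
library(snowfall)
library(RPostgreSQL)

sfInit(parallel = TRUE, cpus = 4)
sfLibrary(RPostgreSQL)   # load the driver on every node

## Each node creates 'con' in its own global environment;
## nothing is shipped between master and workers.
sfClusterEval(con <- dbConnect(PostgreSQL(),
                               dbname = "mydb",       # placeholder
                               host   = "localhost",  # placeholder
                               user   = "florian"))   # placeholder

fun <- function(id) {
  ## 'con' resolves to the node-local connection created above
  dbGetQuery(con, paste("SELECT * FROM mytable WHERE id =", id))
}

input  <- 1:100   # placeholder work items
result <- sfClusterApplyLB(input, fun)

## clean up: close each node's connection, then stop the cluster
sfClusterEval(dbDisconnect(con))
sfStop()
```

Note that fun only refers to con by name, so on each worker the lookup finds that worker's own connection.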