So here's the funny thing: I've now run my function five times, and each
time R crashes after I've got around 20,000 distances. It could be Google,
but then as soon as I quit and relaunch R, I manage to get another
20,000 distances. So maybe it does have something to do with memory
usage. Do you think that adding

> gc(reset=TRUE)

at the end of each loop iteration would help?
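In case it helps, here is a minimal sketch of what I mean, where `get_distance()` is just a dummy stand-in for the real Google lookup and the file name is made up:

```r
# Append each result to disk as it arrives, then gc() after each
# outer iteration. get_distance() is a placeholder for the real call.
get_distance <- function(i, j) (i - j)^2  # dummy placeholder

out_file <- "distances.csv"
for (i in 1:10) {
  for (j in 1:10) {
    d <- get_distance(i, j)
    # Write the header only on the very first write.
    write.table(data.frame(i = i, j = j, dist = d),
                file = out_file, append = TRUE,
                col.names = !file.exists(out_file),
                row.names = FALSE, sep = ",")
  }
  gc(reset = TRUE)  # reclaim memory after each outer iteration
}
```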

Thanks again,
Dimitri



2009/11/15 Barry Rowlingson <b.rowling...@lancaster.ac.uk>

> On Sun, Nov 15, 2009 at 11:57 AM, Dimitri Szerman <dimitri...@gmail.com>
> wrote:
>
> >
> > Thanks. The reason I didn't want to do something like that is because, in
> > the event of a crash, I'll lose everything that was done. That's why I
> > thought of appending the results often.
>
>  Oops yes, I missed the 'append=TRUE' flag. That's a good idea.
>
>  Last time I did something similar to this I used a relational
> database for saving. I created a table of all the i,j pairs with
> columns i,j,distance and 'ok'. 'ok' was set to False initially. Then
> I'd query the db for a row with 'ok=False', and go about getting the
> distance. If I got a good distance back I set 'ok=True' and never
> bothered getting that again.
>
>  This was in Python with SQLite as the database engine, but you can
> do something similar in R. With a distributed database you could
> easily split the queries between as many servers as you can get your
> hands on.
>
>  Barry
>
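For the archives: Barry's bookkeeping scheme translates to R fairly directly with the DBI/RSQLite packages. This is only a sketch under my own assumptions; `get_distance()` is again a dummy placeholder for the real lookup, and the database file name is made up:

```r
# Sketch of the 'ok' flag scheme in R, using DBI + RSQLite.
# Safe to re-run after a crash: only rows with ok = 0 are retried.
library(DBI)

get_distance <- function(i, j) abs(i - j)  # placeholder

con <- dbConnect(RSQLite::SQLite(), "distances.sqlite")
dbExecute(con, "CREATE TABLE IF NOT EXISTS dist
                (i INTEGER, j INTEGER, distance REAL, ok INTEGER DEFAULT 0)")

# Populate the work list once (here a tiny 5 x 5 grid for illustration).
if (dbGetQuery(con, "SELECT COUNT(*) AS n FROM dist")$n == 0) {
  grid <- expand.grid(i = 1:5, j = 1:5)
  dbWriteTable(con, "dist",
               data.frame(grid, distance = NA_real_, ok = 0L),
               append = TRUE)
}

# Work through pending pairs, marking each one done as we go.
repeat {
  row <- dbGetQuery(con, "SELECT i, j FROM dist WHERE ok = 0 LIMIT 1")
  if (nrow(row) == 0) break
  d <- get_distance(row$i, row$j)
  dbExecute(con, "UPDATE dist SET distance = ?, ok = 1 WHERE i = ? AND j = ?",
            params = list(d, row$i, row$j))
}
dbDisconnect(con)
```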


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.