I'm a long-time user and recently started lurking on the -devel list. This problem may belong on -user, but it seems like a problem for developers rather than for my fellow users.

I have a set of books in a SQLite3 file and I'm trying to save it to PostgreSQL. The save is still running as I write this. Here are a couple of lines from top:

    PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM TIME+ COMMAND
  42022 ingram    20   0 1600828 796256  93648 R 100.0   9.9 125:28.52 gnucash

It keeps one CPU close to pegged, and I can't tell what it's doing. The Save As dialog is still visible, with the Save As button looking like it's been pressed. I don't see any relevant activity in the system logs or the GnuCash logs. As far as I can tell, it has never connected to the PostgreSQL server, and it hasn't created any tables. I created the database after aborting a previous attempt in which the database didn't exist yet.
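
In case it helps anyone reproduce this, here's one way to check on the server side. It's just a quick sketch: it assumes psycopg2 is installed, and the database name, user, and host are placeholders for my setup rather than anything GnuCash-specific.

    # Check whether anything has connected to the database or created tables.
    # "gnucash", "ingram", and "localhost" are placeholders for my setup.
    import psycopg2

    conn = psycopg2.connect(dbname="gnucash", user="ingram", host="localhost")
    cur = conn.cursor()

    # Any client sessions on this database?
    cur.execute(
        "SELECT pid, usename, state, query FROM pg_stat_activity "
        "WHERE datname = %s", ("gnucash",))
    print("connections:", cur.fetchall())

    # Any tables created so far?
    cur.execute("SELECT tablename FROM pg_tables WHERE schemaname = 'public'")
    print("tables:", [row[0] for row in cur.fetchall()])

    conn.close()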

I'm using GnuCash 3.8, Build ID: 3.8b+(2019-12-29), as distributed with Kubuntu Focal. I've also tried a 3.8 Flatpak on a machine running an older version of Kubuntu.

It's a pretty big data set: a 112 MB SQLite3 file with roughly 10K accounts, 50K transactions, and 160K splits.
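
If it helps, those rough counts can be pulled straight from the SQLite3 file with something like the sketch below. It assumes the GnuCash SQL table names accounts, transactions, and splits; the file name is just a placeholder.

    # Count rows in the main GnuCash tables of the SQLite3 book file.
    # "books.gnucash" is a placeholder for the real path.
    import sqlite3

    con = sqlite3.connect("books.gnucash")
    cur = con.cursor()
    for table in ("accounts", "transactions", "splits"):
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        print(table, cur.fetchone()[0])
    con.close()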

Is there hope it'll start writing to the database?

I searched for information related to what I'm seeing, and most of what I found seemed to be about GnuCash 2.8 and earlier. There was some discussion about revamping GnuCash to take better advantage of SQL, and a note that, at the time, it still read the entire database into memory.

Are things different now? Is there a performance gain to be had with a SQL back end? I switched from XML to SQLite3 because it seemed like the program was bogging down, and that's also why I'm now looking to try PostgreSQL.

I've run into what may be a similar problem where I can no longer import transactions, or maybe I just wasn't willing to wait long enough. Something I read back then suggested that part of matching transactions to accounts involved a sort of exponential growth in the work being done. That's probably not clear, but whatever it was led me to conclude that I had too much data for the program to handle. I work around the issue by importing into an almost empty set of accounts and then cutting and pasting into my official books.

Thanks for all your work,

- Greg

