Mark Johnson wrote:
This would certainly cause very slow performance. Here are some sample table sizes to give you an idea of how much data I have in gnucash. I don't think it is especially large. I have seen some users post who have quite a few more years of data in gnucash than I.
I suspect this is more a UI problem than a backend problem: the backend is slow, and isn't going to (practically) get any faster, meaning the user must be given some kind of cue that this a) might take a long time, and b) will only take a long time the first time.
Perhaps some kind "export wizard" idea might work, in other words, if you loaded your data using backend X, under normal circumstances, gnucash will save to backend X.
If you loaded your data with backend X, but wanted to save it as backend Y, you would need to select a separate save option (called "export"?), which would have a progress bar, and would otherwise signal to the user that this is once off and may take a while, and most importantly, gnucash hasn't hung.
This would also reduce potential confusion from the user being presented with lots of options when they click on save.
Running further with the idea, "save" might mean "commit any unsaved changes". In the XML world, this means the XML file will be saved in full. In the database world, a long-running transaction might be committed. Obviously, in the database world, if long-running transactions are a bad idea or aren't supported, save could just be greyed out.
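To make the distinction concrete, here is a minimal sketch of how that split could look. This is not GnuCash's actual API; all names (book_t, book_save, book_export, the backend enum) are hypothetical and only illustrate the idea of "save" staying a fast same-backend commit while "export" is the explicit, slow, one-off copy to a different backend:

#include <stdbool.h>
#include <stdio.h>

typedef enum { BACKEND_XML, BACKEND_SQL } backend_t;

typedef struct {
    backend_t loaded_with;   /* backend the data was loaded from */
    bool      dirty;         /* unsaved changes pending          */
} book_t;

/* "Save" only ever targets the backend the book was loaded with. */
static void book_save(book_t *book)
{
    if (!book->dirty)
        return;                          /* nothing to commit; could be greyed out */

    if (book->loaded_with == BACKEND_XML)
        printf("Writing full XML file...\n");
    else
        printf("Committing pending changes to the database...\n");

    book->dirty = false;
}

/* "Export" is the explicit, potentially slow, one-off copy to a
 * different backend; the UI would wrap this in a progress dialog. */
static void book_export(const book_t *book, backend_t target)
{
    (void)book;
    printf("Exporting to backend %d; this may take a while...\n", target);
    /* ... iterate over accounts/transactions, reporting progress ... */
}

int main(void)
{
    book_t book = { BACKEND_XML, true };
    book_save(&book);                    /* fast, same-backend commit  */
    book_export(&book, BACKEND_SQL);     /* slow, explicit one-off copy */
    return 0;
}

The point of the split is that the slow path is never hidden behind the ordinary "save" button: it only runs when the user has explicitly asked for it and is watching a progress bar.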
Regards, Graham --