1. Not much we can do about that. Presumably that is why the import match editor was created, so that fat-fingered tokens can be located and deleted. I still have mixed feelings about pruning the table: it is OK in the case where you have a known wrong entry as above, but I am less sure that taking connector tokens out will not adversely affect the ability to score higher on a phrase that is used consistently in a description/memo field, for example.
2. I suspect doing this on the fly would create too much of a performance hit, as some/many people have large files with thousands of transactions because GnuCash does not require new file creation annually. I would build a procedure that can be run on an account whenever desired to recreate the frequency table data from the existing transactions' transfer accounts and replace the existing data. Users would need to select the account to run it for, and a date range from which to use transactions to construct the table, to handle cases where five years ago someone used a different account structure. It should not be too hard, as the processes for tokenizing transactions already exist in the matcher code. If it can be run as a standalone, it can also be tested to see what effect it would have if it were run on the fly during import.

3. Not sure on that. I think it is likely that only the transaction data is moved to the new account, but I'm not certain. The data may all be read into memory initially, so it shouldn't be too hard to write it to a merged account. On the other hand, if a standalone procedure as in 2 is created, all that is needed is to execute it after the merge and Bob's your uncle.

David

-----
David Cousens
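The standalone rebuild described in point 2 could be sketched roughly as follows. This is an illustrative Python sketch, not GnuCash's actual C matcher API; the `tokenize` and `rebuild_frequency_table` names, and the tuple layout of the transaction data, are hypothetical assumptions for the example.

```python
from collections import defaultdict
from datetime import date

def tokenize(description):
    """Split a description/memo into lower-case tokens, the same basic
    idea the import matcher uses when scoring transfer accounts."""
    return [tok for tok in description.lower().split() if tok]

def rebuild_frequency_table(transactions, start_date, end_date):
    """Recreate a token -> {transfer account: count} table from existing
    transactions within the user-chosen date range.

    `transactions` is assumed to be an iterable of
    (posted_date, description, transfer_account) tuples; the real
    GnuCash data structures differ."""
    table = defaultdict(lambda: defaultdict(int))
    for posted, desc, xfer_acct in transactions:
        if not (start_date <= posted <= end_date):
            continue  # skip transactions outside the chosen range
        for token in tokenize(desc):
            table[token][xfer_acct] += 1
    return table

# Example: the 2015 transaction uses an older account structure and is
# excluded by the date range, so its account never enters the new table.
txns = [
    (date(2020, 3, 1), "Coles supermarket", "Expenses:Groceries"),
    (date(2020, 4, 2), "Coles fuel", "Expenses:Car:Fuel"),
    (date(2015, 1, 5), "Coles supermarket", "Expenses:Food"),
]
table = rebuild_frequency_table(txns, date(2019, 1, 1), date(2021, 1, 1))
```

Running the same routine after an account merge, as suggested in point 3, would simply regenerate the table against the merged account's transactions.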