Do not use an FPT from one table with another.

I am a little surprised that the PACK MEMO did not recover space. Are you sure you refreshed the disk folder where the .fpt is stored?

Also, since you had to have the table open exclusively to do a PACK MEMO, why not just do a complete PACK? If there are tons of deleted records in there, PACK MEMO may assume it cannot get rid of the associated memo data.
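
If you go that route, it's quick once the table is open exclusively ("bigtable" below is just a placeholder name):

USE bigtable EXCLUSIVE
PACK        && removes deleted records and rewrites both the .dbf and the .fpt
USE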

As a very tangential side point, you may want to specifically limit the VFP memory - SYS(3050, 1, xxx), SYS(3050, 2, xxx). I usually use 384000000 as xxx (384 MB). Others have posted a lot more detail about SYS(3050, ...), but for me, I've found better stability by specifically setting these. And don't let the 384 MB fool you - I've seen things get worse when I put a very large value in for xxx. VFP does fine in smaller memory spaces - it just writes more to its temp files. Anyway, you could try that memory limit and see if the PACK goes through fine (PACK rather than PACK MEMO). I've had many DBF files grow over 1 GB (close to the 2 GB limit) that I PACK with no problems.
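
Something like this, early in your startup code (384000000 is just the value I happen to use - tune it for your environment):

=SYS(3050, 1, 384000000)    && foreground memory cap (app has focus), ~384 MB
=SYS(3050, 2, 384000000)    && background memory cap (app doesn't have focus), ~384 MB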

Oh, and if the application is not multi-user, I would recommend opening the table in EXCLUSIVE mode all the time. Doing that allows memo field 'space' to be reused (if the new data is no larger than the previous data). In SHARED mode, when a record changes (and is saved), I think the memo data is duplicated into a new entry in the .fpt. It definitely is if the memo data itself changed - even if the new memo data is smaller than the original.
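
In code that's simply (table name again a placeholder):

USE bigtable EXCLUSIVE
* or, for the whole app/session:
SET EXCLUSIVE ON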

If your design really requires storing large amounts of data in the memo fields, and if it's mainly text, consider compression - compress the data when storing and decompress when reading. I think Dynazip may still be out there somewhere - and there are others.
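
A minimal sketch of the idea - CompressText() and UncompressText() here are hypothetical wrappers around whatever compression library you end up using (Dynazip, a zlib FLL, etc.), not built-in functions, and the table/field names are made up:

* saving: compress the plain text before it goes into the memo field
REPLACE notes WITH CompressText(lcPlainText) IN bigtable

* reading: decompress on the way out
lcPlainText = UncompressText(bigtable.notes)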

Or, create additional tables with one memo field each (assuming your 8 memo fields hold roughly equivalent amounts of data), then link them back to the main table via its primary key. Of course, this would affect your queries when pulling data, and if you really need all 8 memo fields every time, this approach will probably hurt performance (8 joins - very fast, but not as fast as pulling from 1 table).
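
Roughly like this (all names made up; one child table per memo field, keyed on the main table's primary key):

CREATE TABLE notes1 (parent_id I, memo1 M)
INDEX ON parent_id TAG parent_id

* pull the main row plus whichever memos you actually need
SELECT m.*, n1.memo1 ;
    FROM maintable m ;
    LEFT JOIN notes1 n1 ON n1.parent_id = m.id ;
    INTO CURSOR csResult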

-Charlie

On 10/5/2020 11:08 AM, Frank Cazabon wrote:
I "think" the way to do this is to copy the data to a new table and then delete the old table and rename the new one to the old name, but it's been a really long time since I worked with DBFs so caveat emptor!

Be careful if the table is part of a database and you have specific configuration set for the table in the dbc (long column names, autoinc, triggers).

If that is going to be a problem, maybe copy it to a new table, then ZAP the old table and APPEND FROM the new table - but you'd still have to look out for autoinc.
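
A hedged sketch of that second route (as noted, autoinc columns will get freshly generated values on the APPEND, so this is only safe if you don't rely on the old ones):

USE bigtable EXCLUSIVE
COPY TO temp_copy               && snapshot of the data and memos
ZAP                             && empties bigtable and truncates its .fpt
APPEND FROM temp_copy           && reload; memos are rewritten without the bloat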

Frank.

Frank Cazabon

On 05/10/2020 10:37 am, Desmond Lloyd wrote:
Fascinating. Original size of the DBF: 710448
Original size of the FPT: 1276848

Selecting * from the table:
DBF = 710448
FPT = 923354

A drop in the size of the FPT of 353492. Could I rename the "selected" FPT and use it? I tried a PACK MEMO and there was no change in the size of the FPT.

Regards,
Desmond


On Mon, 5 Oct 2020 at 09:06, Tracy Pearson <tr...@powerchurch.com> wrote:

When you SELECT * to a temp table, what is the size of the files?
You'll know quickly if you have bloat.
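
e.g. something like (path and names are placeholders):

SELECT * FROM bigtable INTO TABLE c:\temp\sizecheck
* then compare sizecheck.dbf / sizecheck.fpt against the original files on disk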

HTH,
Tracy

-----Original Message-----
From: ProfoxTech [mailto:profoxtech-boun...@leafe.com] On Behalf Of Desmond Lloyd
Sent: Monday, October 05, 2020 9:49 AM
To: profoxt...@leafe.com
Subject: My memo field(s) Are Bloated......

Good Morning!

Have a very wide DBF with about 8 memo fields. It started acting strange: if I tried to re-index or PACK MEMO, it would seem to go so far and then freeze!

Found a utility that seemed to repair the file. The only issue is that the DBF is 710 MB and the associated memo file is 1.2 GB! Yikes. I have run a PACK MEMO (with no issues) and there is no change in the size of the memo file, which I am assuming is bloated.

1.)  Is the memo file really bloated?
2.)  Going to look at the individual memo fields and see if any one of the records has an unusually large memo field (a quick sizing query is sketched below).
3.)  Open to any other suggestions that you wise people could recommend....
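
For item 2, a quick way to rank records by memo size (field and table names are placeholders):

SELECT RECNO() AS nRec, LEN(memo1) AS nLen ;
    FROM bigtable ;
    ORDER BY 2 DESC ;
    INTO CURSOR csBigMemos
BROWSE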

TIA/Regards,
Desmond

