Peter,

Please accept my apology for the slow reply, but my access to this List can be sporadic.
I'm sure you are wiser than I am in this regard, as I've gleaned many SQL tidbits from your posts to this List. As for SQLite being ubiquitous on mobile devices, in my own opinion it's due more to its price, like MySQL and Gmail. If people had to pay $5 a month for the privilege, I think the numbers would suddenly shift. Just because a billion people use it doesn't mean it's the best.

Why is my phone noticeably slower than when it was new (after start-up, no apps running, just basic navigation), even though it's just under 30% full? Is it because the dbs that keep track of all that stuff have grown considerably in size? Is it because the data output is much larger: instead of a page of 8 icons, it's now always pages of 16 icons?

I have three particular third-party desktop apps that use SQLite; two store GPS data, the third is a home inventory program. I love the home inventory program, it has a lot of cool features, like looking up products based on a bar code it sees via my iSight camera. But as I add more and more photos, and the data gets larger and larger, the program gets slower and slower. Probably bad code on the vendor's part that could be tweaked, but that would be evidence that you can get it wrong.

Earlier this year I took 130 million data points from the GPS database and fed them into my own SQLite and Valentina dbs. Basically I needed to find the gradient that represented the optimum rate of climb. To do this I needed to locate all hills, sift out those that were downhill rather than uphill, remove any instance where drafting was a factor, and then compare by season and time of day to extract trends. The SQL statements were not that complex, just a LOT of toing and froing between LC and the db (a rough sketch of one of those round trips is below). I soon abandoned SQLite as it was clear that Valentina was getting the answers quicker. SQLite still did an excellent job, it was just slower. Maybe it has more to do with the speed at which LC talks to each db: I used the LC db commands for SQLite, whilst I've migrated across to using API calls for Valentina.

You are undoubtedly correct that the Mac has used SQLite for years, and I guess the reason the major upgrades to iTunes and iPhoto require complete rebuilds of the db, with no turning back, is that these dbs are continually being tweaked for performance - and they do a brilliant job. But it must be easier to update a db when you know exactly what data you are dealing with, i.e. as cameras have grown from 5 megapixels to 20 megapixels and added GPS position. But what about Spotlight, where the data is presented in an unknown, vast assortment? I can do a search on 'Borrower' and 'Latitude', because Spotlight can look into the home inventory and GPS dbs. How could Apple possibly have known that such fields would be added, and whether a field would hold a constant 2 chars, or 0-2K chars, or a blob of binary?

As far as I know there is a limit of 62 dbs which SQLite can attach to (a tiny example of attaching is below). How close Spotlight is to reaching that limit I've no clue, but Versions has just added one more, and as anything approaches its limit, it slows.

As for "industrial strength", I'd go with tens of millions of records and terabytes of data. My desktop Mac has 9 TB of internal storage running at 60% full and 8 TB of external, mainly for backups. When Spotlight chokes (even pre-Lion) I assume they didn't envisage it would be indexing tens of millions of lats, longs and altitudes hidden amongst terabytes of data.
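For what it's worth, a single round trip looked roughly like this. It's only a sketch - the file path, table name and column names are invented for illustration, not my real schema - but it shows the pattern of the LC db commands I was using against SQLite:

  -- open the ride database and pull back raw track points
  put revOpenDatabase("sqlite", "/path/to/rides.sqlite", , , ) into tDbID
  put "SELECT ts, lat, lon, alt FROM trackpoints ORDER BY ts" into tSQL
  put revDataFromQuery(tab, return, tDbID, tSQL) into tRows
  repeat for each line tRow in tRows
     -- compare each point with the previous one: rise over run gives the gradient,
     -- then descents and drafting sections get filtered out before the next query
  end repeat
  revCloseDatabase tDbID

Multiply that by every hill, every season and every time-of-day bucket and you can see where all the toing and froing comes from.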
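And for anyone who hasn't used attaching: a single SQLite connection can pull further database files into the same session with plain SQL, something like the following (file and table names made up):

  ATTACH DATABASE '/path/to/inventory.db' AS inv;
  SELECT Borrower FROM inv.items;   -- queried alongside the main db
  DETACH DATABASE inv;

If Spotlight works anything like that internally, every extra db it has to consult is one more attach towards that ceiling.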
In Lion the number of dbs has increased, the data Versions deals with is unknown, and what with Auto Save and Versions, Spotlight is being triggered to index more often. Also, the definition of the Mac user seems to have narrowed. I think many on this List use their Macs in vastly different ways from the average Mac user, maybe outside the optimum way the various dbs have been set up to track their every move and keep them within the walls. It won't be the No. 1 factor, but I feel it is a factor. But I'm often wrong.

And I see why: once all these dbs are set up, indexed and connected, there should be negligible processing required to just keep updating the data. Yet why, on my wife's Lion machine (I don't run Lion yet), which has been set up since Christmas, have I been prevented on a couple of occasions now from using Spotlight because it's in the middle of indexing***? Is it some kind of catch-22: the system will be fully responsive once indexing stops, but it keeps indexing because Versions keeps changing things?

*** Oh, and I've just discovered another change in Lion to add to the catch-22. Prior to Lion, setting up a Time Machine HD would automatically exclude it from Spotlight indexing. In Lion that doesn't happen, and if you try to exclude the HD you are met with a dialog telling you that Spotlight will CONTINUE to index your backups, but anything else on the disk (which Apple recommends you don't keep there anyway) will not be. So now Spotlight indexes when Lion Auto Saves, when Versions saves, when Time Machine backs up, plus any user-induced saves. Previously it was only during user-induced saves.

I'm assured that Linux is a zippier, resource-dieted OS. Does it have a variety of SQLite dbs, all interconnected, keeping track of your every move? Granted, OS X's iCandy is more at fault here.