I'm designing a web site that will display MARC authority files onscreen. I
use a Perl hash that's tied to a (read-only) Berkeley DB file via DB_File, and
it works nicely. How practical is this approach if there's going to be
moderate traffic on the site?
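For reference, here's a minimal sketch of the tied-hash approach I mean. The filename and the sample key/value are made up for illustration (the real file is the 200MB authority database); the script builds a tiny database first so it can run standalone:

```perl
use strict;
use warnings;
use DB_File;
use Fcntl qw(O_RDONLY O_RDWR O_CREAT);

my $dbfile = "authority.db";   # hypothetical filename

# Build a tiny sample database (stand-in for the real authority file).
{
    tie my %write, 'DB_File', $dbfile, O_RDWR | O_CREAT, 0644, $DB_HASH
        or die "tie (write): $!";
    $write{'n79021164'} = 'Twain, Mark, 1835-1910';   # made-up sample record
    untie %write;
}

# Read-only tie, as on the site: Berkeley DB pages in only the parts
# of the file it actually touches, so the 200MB never loads wholesale.
tie my %auth, 'DB_File', $dbfile, O_RDONLY, 0444, $DB_HASH
    or die "tie (read): $!";
my $record = $auth{'n79021164'};
print "$record\n";
untie %auth;
unlink $dbfile;   # clean up the sample file
```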

My database file is about 200MB, but of course DB_File brings only small
pieces of it into memory at any one time. Would the site bog down if people
were accessing records at the rate of, say, one every few seconds? Should I
consider MySQL instead? I'd prefer to stick with DB_File, since it's so easy
and elegant -- and I can easily create complex data structures.

What if one of my data files were significantly bigger (say, a GB or two of
MARC book records)? I don't have a feel for the pros and cons of the various
approaches to accessing large databases using Perl, but tied hashes are
pretty fast! In any case, I know I'll have to lock the file during each
read, via "flock" or the like -- I haven't tried implementing that yet.
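What I have in mind for the locking is something like the following sketch: a shared lock (LOCK_SH) around each read, so multiple readers can coexist while a writer holding an exclusive lock would block them. The filename is hypothetical, and the script creates a stand-in file so it runs on its own:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $dbfile = "authority.db";   # hypothetical filename

# Stand-in file so this sketch is self-contained.
open my $fh, '>', $dbfile or die "create: $!";
close $fh;

# Shared lock around the read: readers don't block each other, but a
# writer taking LOCK_EX would wait until all shared locks are released.
open my $lock, '<', $dbfile or die "open: $!";
my $got_lock = flock($lock, LOCK_SH);
die "flock: $!" unless $got_lock;

# ... tie %auth, 'DB_File', ... and look up the record here ...

flock($lock, LOCK_UN);
close $lock;
unlink $dbfile;   # clean up the stand-in file
```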

Does anyone have any ideas about this? Are there other Perl forums I should
investigate regarding this topic?

Many thanks!

- Chris Morgan
