On 1/30/06, Chris Knipe <[EMAIL PROTECTED]> wrote:
> >> Last time I checked, Perl's threads weren't very popular to use.  Now
> >> that's a discussion on its own, I guess, and not the intention of this
> >> email to get into.  I'm planning to develop a rather large Perl
> >> application.  Due to complexity, I plan to run multiple processes, each
> >> process being spawned from a single main process...  Is there any way
> >> that I can share data between them?
> >
> > Lots of ways. Here's one that's worked for me. The shared data live in
> > some convenient location, like a directory, a file, or a database,
> > depending upon your needs. A little glue code in a module handles
> > access to the data in a consistent way, maybe using something from the
> > Tie::* hierarchy to provide a simple interface.
>
> Hi,
>
> Thanks for the suggestion.  This has been recommended to me by someone off
> the list as well (or something relatively close to it), but unfortunately it
> is not going to be very efficient.  It would kill the system as far as disk
> I/O is concerned.  I'm talking about 200+ variables here, about half of
> which will change approximately every 10ms (some even faster).  Doing
> 100-odd disk writes/reads every 10ms, plus more than likely searching
> through open files for a specific variable, and closing each one in time so
> that it can be written again... I don't think it will be feasible.
>
> I'll need to do this in memory, I'm afraid... :(
>
> --
> Chris

You might consider the following modules:

Cache::Memcached - client library for memcached (a memory cache daemon)
POE - portable multitasking and networking framework for Perl
IPC::Shareable - share Perl variables between processes
Cache::RamDisk - sharing of Perl objects between processes on several RAM drives
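For the purely in-memory requirement described above, IPC::Shareable is
probably the most direct fit, since it keeps the shared variables in SysV
shared memory with no disk I/O.  A minimal sketch, assuming the module is
installed; the 'stat' glue key and the %stats hash are illustrative names
only, not from the original code:

```perl
use strict;
use warnings;
use IPC::Shareable;
use POSIX ();

# Tie a hash to a SysV shared-memory segment.  create => 1 makes the
# segment if it does not exist; destroy => 1 removes it when the tying
# process exits.
my $knot = tie my %stats, 'IPC::Shareable',
    { key => 'stat', create => 1, destroy => 1 }
    or die "Cannot tie shared hash: $!";

$stats{counter} = 0;

my $pid = fork;
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: update the shared counter under a lock, then _exit so the
    # child's cleanup code does not tear down the parent's segment.
    $knot->shlock;
    $stats{counter}++;
    $knot->shunlock;
    POSIX::_exit(0);
}

waitpid $pid, 0;
print "counter = $stats{counter}\n";   # the child's update is visible here
```

The same tie/shlock/shunlock pattern extends to any number of child
processes, which may be closer to the 200-variable scenario than a
file-backed Tie::* approach.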

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>