>Hello list,
>
>I want to get all the files in some web dir. For example:
>
>http://www.foo.com/bar/
>
>But that dir has a default page "index.htm", so when I accessed the URL
>I only got the default page.
>
>Can you tell me whether there is a way to fetch all the files in that
>dir? Thanks a lot.
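The usual answer: a client cannot enumerate files on a remote web server; it can only follow links from pages it can fetch, so this works only if the files are linked from the index page (or the server allows directory listings). A minimal sketch under that assumption, using LWP::UserAgent and HTML::LinkExtor; the URL is the one from the question and the "same directory, no subdirectories" filter is illustrative:

#!/usr/bin/perl
# Sketch, not the poster's code: fetch the directory's index page and
# download every file it links to in that same directory.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;
use File::Basename;

my $base = 'http://www.foo.com/bar/';
my $ua   = LWP::UserAgent->new;

my $resp = $ua->get($base);
die 'GET failed: ', $resp->status_line, "\n" unless $resp->is_success;

my @links;
my $extor = HTML::LinkExtor->new(
    sub {
        my ($tag, %attr) = @_;
        push @links, $attr{href} if $tag eq 'a' && defined $attr{href};
    },
    $base,                                      # resolves relative hrefs
);
$extor->parse($resp->decoded_content);

for my $link (@links) {
    my $url = "$link";                          # absolute URL as a string
    next unless $url =~ m{^\Q$base\E[^/?#]+$};  # same dir, not a subdir
    my $file = basename( URI->new($url)->path );
    my $r = $ua->get($url, ':content_file' => $file);
    print $r->is_success ? "saved $file\n" : "failed $url\n";
}

If nothing links to the files and directory indexing is switched off, there is no client-side way to discover them; recursive tools such as wget -r have the same limitation.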
Hi,
As
On 2007-02-13 14:35:40 -0800, [EMAIL PROTECTED] (Nathan Vander Wilt) said:
I have a list of data that I want to operate on
concurrently, with as many threads as I have processors.
So I wrote a bit of code that uses a semaphore to halt
spawning of new threads once that many are already
running.
I hav
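The message is cut off above, but the pattern it describes is a common one. A minimal sketch of it with threads and Thread::Semaphore; the worker body, the work list and the processor count (4) are placeholders, not the poster's code:

#!/usr/bin/perl
# Cap the number of live worker threads with a counting semaphore.
use strict;
use warnings;
use threads;
use Thread::Semaphore;

my $max_threads = 4;                      # assume 4 processors
my $slots = Thread::Semaphore->new($max_threads);

my @data = (1 .. 20);                     # placeholder work items

sub worker {
    my ($item) = @_;
    # ... do the real per-item work here ...
    sleep 1;
    $slots->up;                           # free a slot when done
    return $item * 2;
}

my @threads;
for my $item (@data) {
    $slots->down;                         # blocks while $max_threads are running
    push @threads, threads->create(\&worker, $item);
}

my @results = map { $_->join } @threads;
print "@results\n";

This still creates one thread per item over the run; a fixed pool of workers reading from a Thread::Queue avoids that, at the cost of a little more setup.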
I have memory problems programming with perl: "out of memory!"
I have to process a lot of xml files which are in different directories
(more than 2 files in 110 directories). The files are quite small
(almost all of them are smaller than 100KB).
Here is some code:
code
#
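The code snippet is cut off in the archive, so this is only a guess at the usual cause: data from every file accumulating instead of being released. A sketch that walks the directories and parses each XML file on its own, using XML::Twig (which may not be what the original used); the <record> element name and the directory glob are placeholders:

#!/usr/bin/perl
# Parse each file independently and purge the tree as you go,
# so memory stays near the size of the largest single file.
use strict;
use warnings;
use File::Find;
use XML::Twig;

my @dirs = glob('data/*');                # placeholder for the 110 directories

my $count = 0;
my $twig = XML::Twig->new(
    twig_handlers => {
        record => sub {                   # handle one <record>, then drop it
            my ($t, $elt) = @_;
            $count++;
            # ... extract what you need from $elt here ...
            $t->purge;                    # free everything parsed so far
        },
    },
);

find(
    sub {
        return unless /\.xml$/i;
        $twig->parsefile($File::Find::name);
        $twig->purge;                     # make sure nothing lingers per file
    },
    @dirs,
);

print "processed $count records\n";

Whatever parser is used, the point is the same: do not keep every parsed document (or every slurped file) in a long-lived array or hash across the whole run.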
On Wed, 2007-02-14 at 12:37 +0100, Arantxa Otegi wrote:
> I have memory problems programming with perl: "out of memory!"
>
> I have to process a lot of xml files which are in different directories
> (more than 2 files in 110 directories). The files are quite small
> (almost all of them are s
Arantxa Otegi wrote on Wednesday, 14 February 2007 at 12:37:
> I have memory problems programming with perl: "out of memory!"
>
> I have to process a lot of xml files which are in different directories
> (more than 2 files in 110 directories). The files are quite small
> (almost all of them are smaller t
Ken Foskey wrote:
On Wed, 2007-02-14 at 12:37 +0100, Arantxa Otegi wrote:
I have memory problems programming with perl: "out of memory!"
I have to process a lot of xml files which are in different directories
(more than 2 files in 110 directories). The files are quite small
(almost all of
Arantxa Otegi wrote:
I have memory problems programming with perl: "out of memory!"
I have to process a lot of xml files which are in different directories
(more than 2 files in 110 directories). The files are quite small
(almost all of them are smaller than 100KB).
Here is some code:
##
Arantxa Otegi wrote:
> I have memory problems programming with perl: "out of memory!"
>
> I have to process a lot of xml files which are in different directories
> (more than 2 files in 110 directories). The files are quite small
> (almost all of them are smaller than 100KB).
> Here is some co
Yesterday it was slow, today I can't even load any page completely.
Is it me or is there a problem with the site?
Jenda
= [EMAIL PROTECTED] === http://Jenda.Krynicky.cz =
When it comes to wine, women and song, wizards are allowed
to get drunk and croon as much as they like.
-- Te
Must be you; I just checked and it is still there...
It might be a little slow, as the few people that use Perl seem to need
a lot of help on how to use it, causing a huge load on the servers :-)
Regards,
Rob
On 2/14/07, Jenda Krynicky <[EMAIL PROTECTED]> wrote:
Yesterday it was slow,
On 2/14/07, Rob Coops <[EMAIL PROTECTED]> wrote:
It might be a little slow, as the few people that use Perl seem to need
a lot of help on how to use it, causing a huge load on the servers :-)
Ouch! lol, I haven't been on there in months =)
--
WC (Bill) Jones -- http://youve-reache
I am working on a script to back trace debian package (program) dependencies.
Here is how the system works:
Package A needs packages D, E & F installed before it will install
Package D needs package G, H & I installed before it will install
Package G needs package C, J & K installed before i
On 02/14/2007 08:35 PM, Tony Heal wrote:
I am working on a script to back trace debian package (program) dependencies.
Here is how the system works:
Package A needs packages D, E & F installed before it will install
Package D needs package G, H & I installed before it will install
Package
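A minimal sketch of the back-trace described above: given a map of "package => packages it needs first", walk the chain recursively and emit an install order with every dependency before the package that needs it. The %needs hash below just mirrors the example packages; a real script would fill it from `apt-cache depends` output or the Packages index rather than by hand:

#!/usr/bin/perl
# Recursive dependency trace with protection against cycles.
use strict;
use warnings;

my %needs = (
    A => [qw(D E F)],
    D => [qw(G H I)],
    G => [qw(C J K)],
);

my %seen;       # guards against loops and repeated visits
my @order;      # dependencies end up before the packages that need them

sub trace {
    my ($pkg) = @_;
    return if $seen{$pkg}++;
    trace($_) for @{ $needs{$pkg} || [] };
    push @order, $pkg;
}

trace('A');
print "install order: @order\n";   # prints: install order: C J K G H I D E F A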