It is actually two RHL 7.1 CDs, because the i386 tree contains the files for
both CDs and it is too difficult to tell which files belong to CD1 and which
to CD2. So it is easier to download the two files (seawolf-i386-disc1.iso,
~673MB, and seawolf-i386-disc2.iso, ~669MB) from any of the FTP sites (you
will also find seawolf-i386-powertools.iso, ~642MB).

Then, in the CD-burning program, look for an option that supports the .iso
format; it will read the image file and write it out to a CD (the CD will be
bootable by itself). I used Easy CD Creator by Adaptec, and it is compatible
with most CD-R drives.

Fahad,

-----Original Message-----
From: Osazemoya Uhumuavbi [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 27, 2001 12:20 PM
To: '[EMAIL PROTECTED]'
Subject: CD burning on windows2000 HOW-TO!!!


Hi all,

Can someone please tell me how to burn the Red Hat Linux 7.1 CD in a Windows 2000
environment? I am having trouble burning the CD. I have a 650MB CD-R, and I do
not know which of the rpm packages to leave out, since the whole i386 folder is
much larger than the 650MB capacity of the CD-R. I have successfully downloaded
Red Hat 7.1 /i386 and PowerTools; the i386 folder is 1.02GB. Is there a rule
for deciding which rpm packages would not be needed during the initial
installation of Linux, and if there is, what is it?

Thank you.

Osaz.

-----Original Message-----
From: Vilius Puidokas [mailto:[EMAIL PROTECTED]]
Sent: Thursday, April 26, 2001 3:08 AM
To: [EMAIL PROTECTED]
Subject: Re: Speed problem in scanning a directory with 500,000 files


If none of the previously suggested ideas for bringing that number down help,
I'd take a look at a different filesystem or check the ext2 source. At least
I've seen some documentation mentioning that ext2 takes a pretty serious
performance hit once you have more than 20,000 files in one directory.

At work we had a similar problem (~40k directories with unique names, so
fitting them into a hierarchy was a pain). We just used our concept of a
virtual server and stacked several virtuals on one physical box. But that's
just another way of getting away from lots of files in one dir.
v

On Tue, 24 Apr 2001, Min Yuan wrote:

> Hello,
> 
> We have a directory on Red Hat 6.2 with 500,000 files. In our code we open
> and read the directory, and for each entry we use lstat() to check for some
> information. The whole scan takes more than eight hours, which is terribly
> long.
> 
> Is there any way we could reduce this time? If the answer is no, is there
> any official documentation about it, and where can we find it?
> 
> Thank you!
> 
> Min Yuan 
> VytalNet, Inc.
> (905)844-4453 Ext. 241
> 
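For reference, the scan loop described in the quoted message might look like the sketch below in C. One optimization (not mentioned in the thread, and an assumption on my part) is that on filesystems which fill in the dirent d_type field, the per-entry lstat() call can often be skipped entirely, saving one syscall per file; the function name and fallback path here are illustrative only.

```c
#include <dirent.h>
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/stat.h>

/* Count regular files in `path`. Uses d_type when the filesystem
 * provides it, and falls back to lstat() only when d_type is
 * DT_UNKNOWN, avoiding one stat syscall per entry where possible. */
long count_regular_files(const char *path)
{
    DIR *dir = opendir(path);
    if (!dir)
        return -1;

    long count = 0;
    struct dirent *de;
    char buf[4096];

    while ((de = readdir(dir)) != NULL) {
        /* Skip the "." and ".." entries. */
        if (strcmp(de->d_name, ".") == 0 || strcmp(de->d_name, "..") == 0)
            continue;
#ifdef DT_REG
        if (de->d_type != DT_UNKNOWN) {
            /* Fast path: the filesystem told us the file type already. */
            if (de->d_type == DT_REG)
                count++;
            continue;
        }
#endif
        /* Slow path: ask the kernel with lstat(), one syscall per entry. */
        struct stat st;
        snprintf(buf, sizeof buf, "%s/%s", path, de->d_name);
        if (lstat(buf, &st) == 0 && S_ISREG(st.st_mode))
            count++;
    }
    closedir(dir);
    return count;
}
```

Even with this, half a million entries in one directory will still be slow on ext2, since directory lookups there are linear scans; splitting the files into a hashed subdirectory hierarchy is the usual workaround.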



_______________________________________________
Redhat-devel-list mailing list
[EMAIL PROTECTED]
https://listman.redhat.com/mailman/listinfo/redhat-devel-list


