On Wed, 29 May 2002 06:18, Cameron Moore wrote:
> * [EMAIL PROTECTED] (Patrick Hsieh) [2002.05.28 22:28]:
> > How does Linux support Xeon CPU currently?
> > I am considering to use dual P-III 1G or single Xeon 2.2G architecture.
>
> Consider the following pages:
>
> http://www.intel.com/eBusines
Patrick Hsieh wrote:
> Hello Glenn Hocking <[EMAIL PROTECTED]>,
>
> I am planning to run MySQL on either dual P3-1G or single Xeon 2.2G.
> The main board is an SMP board, so I may add another Xeon CPU
> in the future.
>
> I think MySQL has no problem on SMP architecture, right?
> Just
Hello,
Is there any utility for HTTP session benchmarking? Say, I'd like to
emulate 150,000 HTTP sessions to benchmark the firewall or load balancer.
Any utility recommended?
--
Patrick Hsieh <[EMAIL PROTECTED]>
GPG public key http://pahud.net/pubkeys/pahudatpahud.gpg
--
Hello,
Apache comes with 'ab', which means 'Apache Benchmark'.
From the manpage:
"ab is a tool for benchmarking your Apache HyperText Transfer Protocol
(HTTP) server. It is designed to give you an impression of how your current
Apache installation performs. This especially shows you how many requests
per second your Apache installation is capable of serving."
Hello,
I'd like to raise MaxClients to 1500 in Apache on Debian.
Am I supposed to rebuild the .deb package? And how can I keep the higher limit
when I use apt-get to upgrade my apache package?
--
Patrick Hsieh <[EMAIL PROTECTED]>
GPG public key http://pahud.net/pubkeys/pahudatpahud.gpg
--
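A minimal sketch of what this usually involves, assuming the woody apache 1.3
package and its stock /etc/apache/httpd.conf (the paths and packaging details
are assumptions, not checked):

  # /etc/apache/httpd.conf is a dpkg conffile, so a local edit is kept
  # (or at least prompted about) when apt-get upgrades the package.
  MaxClients 1500

  # Apache 1.3 caps MaxClients at the compiled-in HARD_SERVER_LIMIT
  # (256 in the upstream source), so 1500 may also need a rebuilt package:
  apt-get source apache && apt-get build-dep apache
  # add -DHARD_SERVER_LIMIT=1500 to the build CFLAGS, then
  dpkg-buildpackage -rfakeroot -uc -b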
Hello "Pim Effting" <[EMAIL PROTECTED]>,
Thanks. I'd like to test not only the HTTP requests but also the TCP sessions.
An HTTP request for an HTML page can cause many TCP sessions, and I want
to test the TCP session limit on the firewall or server load balancer.
Basically, an HTTP request could generate
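A crude way to watch the session count on the device under test while a
benchmark runs (a sketch; the second form assumes a Linux 2.4 netfilter
firewall with connection tracking loaded):

  # count established TCP sessions; repeat with watch(1) for a live view
  netstat -tn | grep -c ESTABLISHED
  # on a netfilter firewall, the size of the conntrack table:
  wc -l /proc/net/ip_conntrack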
Hi,
AB is still your friend. You can check remote hosts.
Example: I want to check the speed of the company's intranet webpage.
I'm going to request the index.php page 1000 times, with 5 concurrent
connections.
'ab -n 1000 -c 5 http://192.168.1.200/index.php'
It will output connection delays (ms),
Hi
> Example: I want to check the speed of the company's intranet webpage.
> I'm going to request the index.php page 1000 times, with 5 concurrent
> connections.
> 'ab -n 1000 -c 5 http://192.168.1.200/index.php'
This is quite a good tool. I didn't know about it before. But how to protect one
server for
Hi,
What do you use for a complete backup and recovery solution? I was looking at
mondo, but thought I would ask you guys.
thanks for the input.
bernie
--
Heyas, you tough hackers you.
Okay, I'm deploying phpGroupWare to about 400 users on top of an old
box and Debian woody.
Phpgw is a very heavy app and the machine is quite small (dual PII-400
Xeon, 1.5 GB), but I guess it will do for now.
Of course I didn't go out on a limb and put all 400
At 07:09 PM 5/29/2002 +0800, Patrick Hsieh wrote:
>Hello,
>
>Is there any utility for http session benchmark? Say, I'd like to
>emulate 150,000 http sessions to benchmark the firewall or load-balancer,
>any utility recommended?
httperf
on woody, apt-get install httperf
This tool is very useful t
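A minimal httperf sketch for generating a large number of sessions (the
target address and the numbers are only illustrative):

  # open 1000 TCP connections at 100 per second, one GET request each
  httperf --server 192.168.1.200 --port 80 --uri /index.php \
          --num-conns 1000 --rate 100 --num-calls 1 --hog
  # --hog lets httperf use the full local port range, which matters
  # once you push towards very large connection counts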
On Wed, May 29, 2002 at 09:04:53AM -0500, Bernie Berg wrote:
> Hi,
>
> What do you use for a complete backup and recovery solution? I was looking at
>mondo, but thought I would ask you guys.
>
> thanks for the input.
Tar without compression. Very standard and very fast to recover from
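A minimal sketch of that approach (the paths are assumptions):

  # full, uncompressed backup of the trees worth keeping
  tar -cf /backup/full-$(date +%Y%m%d).tar /etc /home /var/lib/mysql
  # quick sanity check that the archive reads back cleanly
  tar -tf /backup/full-$(date +%Y%m%d).tar > /dev/null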
On Wed, May 29, 2002 at 03:36:26PM -0500, Bernie Berg wrote:
> > Other software has issues, like you may need 1 week to
> > recover the loss of
> > the data.
> >
>
> What about open files (is this an issue)? Also, is there a standard way to restore
> (or do you just use a rescue disk and unt
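The rescue-disk restore is roughly this (a sketch; the device name and the
archive location are assumptions):

  # boot a rescue disk, rebuild the filesystem, unpack the archive
  mke2fs /dev/hda1
  mount /dev/hda1 /mnt
  tar -xpf /backup/full-20020529.tar -C /mnt   # archive from the backup media
  # then reinstall the boot loader (lilo on woody) before rebooting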
Hi Bernie.
I used to use a lot of tapes, which requires a lot of handling, tape head
cleaning, tape changing, and usually on-site access.
These days, with big cheap hard drives, I set up on-site and off-site backup
boxes and copy the data over the network to these backup servers nightly or
more
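A minimal sketch of that kind of nightly network copy (the hostnames, paths
and cron file name are assumptions; it also assumes passwordless ssh keys for
root are already set up):

  # /etc/cron.d/nightly-backup
  # mirror the data tree to the off-site box at 02:30 via rsync over ssh
  30 2 * * *  root  rsync -a --delete -e ssh /data/ backup1.example.com:/backups/web1/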
One thing: I would not go with a Xeon; only if I were planning a quad,
then perhaps.
When you run many concurrent processes, cache utilization goes down
since you do so much context switching. Granted, recent kernels have
improved a lot, but since you still don't have a fully associative cache