[EMAIL PROTECTED] Only the first request = 'Invalid direct reference to form login page' but not the rest.

2008-07-23 Thread Rob Stewart
Hi.
 I've set up a pair of Apache Tomcat servers (6.0.16) (using
simpleTcpCluster for session replication) being reverse proxied by an
Apache HTTPD server (2.2.8) all on the same machine.

 I have a java servlet that requires authentication. The servlet when
accessed directly on either one of the Tomcats works fine - normal
operation being...

User goes to servlet URL.
Server redirects to login page.
User sends j_security parameters (Yes, this is FORM authentication).
If authenticates to valid user server redirects to originally requested URL.

 However, when I access the same servlet through the reverse proxy
after restarting the servers (this only happens with the VERY FIRST
request on the servers) I get this...

User goes to servlet URL.
Server redirects to login page.
User sends j_security parameters.
Server reports "Invalid direct reference to form login page".

 This error happens once then all works well even following the exact
same steps from another instance of the same browser (after closing
the browser to clear any session cookies). It happens even if you use
WGET to perform the same operations, so I believe this rules out
browser quirks too. It happens in IE and Firefox - all on only the
first request to the proxy.

 I've also discovered that if I perform the steps described above
direct to either of the Tomcat instances before going through the
proxy then the problem then does not occur at all. Which makes me
think it's a proxy/HTTPD problem.

 I've checked the Apache v2.2.9 changelog and I can't see anything
that may be relevant, if this is a bug. However, if anyone knows
different let me know.

 Any help much appreciated. Let me know if you need more details or if
I should be talking to the tomcat group.

-- 
Rob ([EMAIL PROTECTED])

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] Apache 2.2.8

2008-07-23 Thread Silvio Siefke

Hello,

I have the Apache 2.2.8 today on my Root Server (Debian etch) compiled. 
No problems with the Apache by compile.



./configure --prefix=/usr/local/apache --enable-modules=all --enable-ssl
--with-ssl=/usr/local/ssl --enable-so --enable-rewrite --enable-suexec
--with-suexec-caller=www-data
--with-suexec-docroot=/usr/local/apache/htdocs --with-suexec-uidmin=1000
--with-suexec-gidmin=1000
--with-suexec-logfile=/usr/local/apache/logs/suexec_log



Unfortunately, he refuses strictly CGI scripts. User rights and CHMOD 
set but always 403 no other entry in the logs. He also refuses to 
settings from the extra directory to. Signatur Off or the virtuell 
Hosts. The signing off, he not even in httpd.conf.


Have your advice? I have three days to the server and can not continue. 
Would I really look.



Greetings
Silvio


-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [EMAIL PROTECTED] Apache 2.2.8

2008-07-23 Thread Eric Covener
On Wed, Jul 23, 2008 at 8:15 AM, Silvio Siefke <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I have the Apache 2.2.8 today on my Root Server (Debian etch) compiled. No
> problems with the Apache by compile.
>
> 
> ./configure --prefix=/usr/local/apache --enable-modules=all --enable-ssl
> --with-ssl=/usr/local/ssl --enable-so --enable-rewrite --enable-suexec
> --with-suexec-caller=www-data
> --with-suexec-docroot=/usr/local/apache/htdocs --with-suexec-uidmin=1000
> --with-suexec-gidmin=1000
> --with-suexec-logfile=/usr/local/apache/logs/suexec_log
> 
>
>
> Unfortunately, he refuses strictly CGI scripts. User rights and CHMOD set
> but always 403 no other entry in the logs. He also refuses to settings from
> the extra directory to. Signatur Off or the virtuell Hosts. The signing off,
> he not even in httpd.conf.
>
> Have your advice? I have three days to the server and can not continue.
> Would I really look.

Configuration, failing URL, and error log would be a better start.

-- 
Eric Covener
[EMAIL PROTECTED]

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [EMAIL PROTECTED] different kinds of proxies

2008-07-23 Thread André Warnier

[EMAIL PROTECTED] wrote:

On 7/22/08, Rich Schumacher <[EMAIL PROTECTED]> wrote:

Solprovider,

While I agree with your sentiment that forward proxies can be very
dangerous, I think you are jumping the gun with your statement doubting they
have "any legitimate use today."

Here is a a real-world example that I use at my current job.  My employer
operates a series of websites that are hosted in servers all around the
country.  A couple of these servers are located in Canada and run a site
specifically geared towards Canadian customers.  As such, they have Canadian
IP addresses.  A while back we wanted to inform our Canadian customers who
visited our non-Canadian site that we have a site specifically for them.  We
easily accomplished this using the MaxMind geoIP database and could display
whatever information we wanted when we detected a Canadian IP.  The quickest
way to QA this was for us to setup a proxy (Squid, in this case) and point
our browsers at it.  The server was already locked down tight with iptables,
so all we had to do was open a (nonstandard) port to our specific gateway
and we were all set. Doing this we can now masquerade as a Canadian customer
and QA can make sure it works as planned.

Forward proxies can also be used as another layer of cache that can greatly
speed up web requests.

Hope that clears the air a little bit as I feel there are several good
examples where forward proxies can be useful.

Cheers,
Rich


Thank you.  I was wondering if anybody noticed the question at the end
of my post.  I am truly interested in the answer.

How would you have handled this if forward proxies did not exist?
Your answer was the forward proxy helped testing, not production.  QA
could test:
- using real Canadian addresses.
- using a network with specialized routing to fake Canadian and
non-Canadian addresses.
- faking the database response so specific addresses appear Canadian.
Did the production system require using a forward proxy?

I discourage using IP Addresses to determine geographical locations.
Slashdot recently had an article about the inaccuracies of the
databases.  (IIRC, an area of Arizona is listed as Canadian, which
might affect your system.)  I checked the IP Addresses of Web spam to
discover that recent submits were from:
- Moscow, Russia (or London, UK in one database)
- Taipei or Hsinchu, Taiwan
- Apache Junction, AZ.
Some databases place my IP Address in the next State south.  "Choose
your country" links are popular on the websites of global companies.
(I dislike websites that force the country choice before showing
anything useful.  If the website is .com, assume the visitor reads
English and provide links to other languages and country-specific
information.)

I believe cache does not depend on forward proxy.  Any Web server with
cache enabled should serve static pages from the cache without
configuring a proxy.  Specific scenarios with a front-end server
specifically for cache seem more likely to use a reverse proxy.  While
this is how a recent project for a major website handled cache, I do
not have good information about general practices.

Am I missing something?  Other ideas?

solprovider



Hi. Me again butting in, because I am confused again.
When users workstations within a company's local network have browsers 
configured to use an internal "http proxy" in order to access Internet 
HTTP servers, is this internal proxy system a "forward" or a "reverse" 
proxy ?
I am not talking here about a generic IP Internet router doing NAT, I am 
talking specifically about a "web proxy".  This HTTP proxy may also do 
NAT of course, but its main function I believe is to cache pages from 
external servers for the benefit of internal workstations, no ?
If this is a forward proxy, then I do not understand the comment of 
Solprovider that seems to indicate that such things are obsolete and/or 
dangerous.  At any rate, they are in use in most corporate networks I am 
aware of.


André


-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] force compressed chunked encoding as response from apache.

2008-07-23 Thread Michal
Hallo group members

I need apache sent an HTML page to my browser in compressed form as chunked 
encoding. So far I managed to compress HTML page content with following line in 
/etc/httpd/conf/httpd.conf:

AddOutputFilterByType DEFLATE text/html text/plain text/xml

...but content-length tag is used here of course.

My question is how to force apache to use chunked encoding?

regards,
Michal

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] Profiling apache 2.0.55 using Gprof

2008-07-23 Thread Paras Fadte
Hi,

I have been trying to profile apache 2.0.55 using Gprof by compiling
it with "-g -pg" option.The mpm used is worker. MaxRequestsPerChild is
changed to 1 in httpd.conf and I Start apache with "httpd -X" option
and make a single and then stop it.The gmon.out file produced does
show some details but when I make multiple requests say , 10 thousand
requests , although the number of calls are recorded it doesn't show
up time used by those function calls. It shows up as 0.00 . If the
server takes lets say , 10 minutes to serve/process the ten thousand
requests , shouldn't it be reflected in the gprof output also ? could
it be multithreading issue because of  worker mpm? or does gprof only
shows time for a single function call rather than a cumulative amount
of time used by all the calls made to a function ?

Any help will be appreciated and thanks in advance.


-Paras

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [EMAIL PROTECTED] IP based virtual hosting and security

2008-07-23 Thread Scott Gifford
"César Leonardo Blum Silveira" <[EMAIL PROTECTED]> writes:

[...]

> So, my question is: how safe is it to let the other interface listen,
> even if it will not respond correctly to any request? What is the
> potential for security vulnerabilities in the 8080 port of the other
> interface?

There actual threat from doing this is very small if both IP addresses
are publicly accessible.  Connecting to an address with no site
configured for it will probably exercise a different code path than
connecting to an address with a site, but it's likely to be small and
not very risky.

However, if your Web server is not public, or you are doing any kind
of IP address-based access control (perhaps at a firewall), you would
want to be careful to ensure that the same access rules applied to
both of your IP addresses.  Any public Web server represents some
risk, and if that alternate IP address bypasses your access control
and makes your otherwise private Web server public, it could be a bit
risky.

Bottom line: It's probably very slighly safer to avoid listening on
that IP address at all, but only very slightly.

Hope this helps,

Scott.

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] Help on Syn flood with Apache

2008-07-23 Thread Arnab Ganguly
Hi All,
I am using Apache 2.2 with mpm model as worker in RedHat 3.0.
When I do a dmesg from the command prompt I get lot of the below message

possible SYN flooding on port 84. Sending cookies.
possible SYN flooding on port 82. Sending cookies.
possible SYN flooding on port 81. Sending cookies.
possible SYN flooding on port 84. Sending cookies.

Those are the listening the ports of the Apache.I am having 4 different
instances are running.When I do a netstat of lsof on a particular port I see
SYN_RCV is taking 50 % of the connections, which may have caused the  kernel
to throw the "SYN_FLOOD_ATTACK", correct me if I am wrong.
I have the following configuration  net.ipv4.tcp_max_syn_backlog = 1024 and
net.ipv4.tcp_syncookies = 1 and net.ipv4.tcp_keepalive_time = 7200

So what would be the workaround for the above scenario, and what is the main
issue it is causing the above behavior.Any help would be very much
appreciated.
Thanks in advance
Regards
Arnab


Re: [EMAIL PROTECTED] Apache 2.2.8

2008-07-23 Thread Silvio Siefke

Eric Covener schrieb:

Configuration, failing URL, and error log would be a better start.


I'm sorry, here come the config and Logs:

URI:
http://www.silvio-siefke.de

CGI:
http://www.silvio-siefke.de/cgi-bin/mailer/mailer.cgi

httpd.conf
http://www.silvio-siefke.de/trading/httpd.html

extra/httpd-default.conf
http://www.silvio-siefke.de/trading/httpd-default.html

access_log
http://www.silvio-siefke.de/trading/access.html

error_log
http://www.silvio-siefke.de/trading/error.html


The Apache ignored the entries in the default. The  Include command is 
available in the httpd.conf.
In the log are no entries in the log suexec is not recorded. The rights 
are similar.



Thanks for help.


Greetings
Silvio



-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] Location Problems

2008-07-23 Thread Chris Howell
I am using apache, to run Trac and things have been working well. What I 
am trying to do is to bring another web app online using trac.


What I want my users todo is to access track by:

http://MachineName:9000/WorldView/

And the other by

http://MachineName:9000/ReviewBoard/

The location in my httpd.conf file looks like this.


   PythonPath "sys.path + ['/django/trunk/django'] + 
['/reviewboardsrc/reviewboard/djblets']"




   Satisfy all
   SetHandler mod_python

  
   SetEnv DJANGO_SETTINGS_MODULE reviewboard.settings
  
   PythonHandler django.core.handlers.modpython

   PythonAutoReload Off
   PythonDebug Off
   # If you run multiple mod_python sites in the same apache
   # instonce, uncomment this line:
  # PythonInterpreter reviewboard

   AuthType Basic
 AuthName "ReviewBoard"
 AuthUserFile "C:/Program Files/Apache Software 
Foundation/Apache2.2/conf/passwd"

 Require valid-user
 Order deny,allow

   # Serve static media without running it through mod_python
   # (overrides the above)


Every time I try and load app I get a 500. Does my basic configuration 
look right or is there some glaring error ?


The other thing that is useful is that The ReviewBoard pages are at:

C:\SomeDirectory\ReviewBoard and not directly under the Htdocs sub 
directory ? Do they need to be?  Or have I missed something or is my 
problem else where?


Cheers
Chris

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] .htaccess and PHP

2008-07-23 Thread Skip Evans

Hey all,

I'm new to the list and am having some issues with 
a RewriteRule I've applied in an .htaccess file. 
Or perhaps not the rule, but with using .htaccess 
in general.


What I wanted to do was allow users to enter a URL 
like the following:


http://varsitybeat.com/wi/madison

and then have my PHP/MySQL application receive 
this URL in the index.php file, and then get the 
wi and madison values from the $_GET array.


To do this I have the following in the .htaccess file.

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^([^/]+)/([^/]+) 
/index.php?st=$1&sc=$2 [NC]


And this is successful in accomplishing the goal.

In the index.php file I can use

$_GET['st'] to get 'wi', and $_GET['sc'] to get 
madison, if someone enters the URL


http://varsitybeat.com/wi/madison

into their browser. The problem I have now, 
though, and that really surprises me, is that if 
this .htaccess file is in place, the application 
no longer picks up its style.css (cascading style 
sheet), or the JavaScript AJAX files, which are 
included in a header.html file that index.php 
reads in.


How exactly the style sheet and JS files are read 
in is not anything unusual, just the regular 
syntax in the  section of an html file.


But the main point is that when the .htaccess file 
is in place, they are not accessed, and when it is 
not they are.


Can anyone direct me where to begin researching 
this kind of issue? I'm at a bit of a loss where 
to begin.


Thanks!

--
Skip Evans
Big Sky Penguin, LLC
503 S Baldwin St, #1
Madison, WI 53703
608-250-2720
http://bigskypenguin.com
=-=-=-=-=-=-=-=-=-=
Check out PHPenguin, a lightweight and versatile
PHP/MySQL, AJAX & DHTML development framework.
http://phpenguin.bigskypenguin.com/

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [EMAIL PROTECTED] .htaccess and PHP

2008-07-23 Thread Gene LeDuc

Hi Skip,

I'm not an expert, but I've been messing with mod-rewrite a bit 
recently.  My guess is that you need a rewrite condition like this before 
your RewriteRule:

  RewriteCond %{REQUEST_METHOD} ^GET

Regards,
Gene

At 02:48 PM 7/23/2008, Skip Evans wrote:

Hey all,

I'm new to the list and am having some issues with a RewriteRule I've 
applied in an .htaccess file. Or perhaps not the rule, but with using 
.htaccess in general.


What I wanted to do was allow users to enter a URL like the following:

http://varsitybeat.com/wi/madison

and then have my PHP/MySQL application receive this URL in the index.php 
file, and then get the wi and madison values from the $_GET array.


To do this I have the following in the .htaccess file.

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^([^/]+)/([^/]+) /index.php?st=$1&sc=$2 [NC]

And this is successful in accomplishing the goal.

In the index.php file I can use

$_GET['st'] to get 'wi', and $_GET['sc'] to get madison, if someone enters 
the URL


http://varsitybeat.com/wi/madison

into their browser. The problem I have now, though, and that really 
surprises me, is that if this .htaccess file is in place, the application 
no longer picks up its style.css (cascading style sheet), or the 
JavaScript AJAX files, which are included in a header.html file that 
index.php reads in.


How exactly the style sheet and JS files are read in is not anything 
unusual, just the regular syntax in the  section of an html file.


But the main point is that when the .htaccess file is in place, they are 
not accessed, and when it is not they are.


Can anyone direct me where to begin researching this kind of issue? I'm at 
a bit of a loss where to begin.


Thanks!

--
Skip Evans
Big Sky Penguin, LLC
503 S Baldwin St, #1
Madison, WI 53703
608-250-2720
http://bigskypenguin.com
=-=-=-=-=-=-=-=-=-=
Check out PHPenguin, a lightweight and versatile
PHP/MySQL, AJAX & DHTML development framework.
http://phpenguin.bigskypenguin.com/

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



--
Gene LeDuc, GSEC
Security Analyst
San Diego State University 



-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [EMAIL PROTECTED] .htaccess and PHP

2008-07-23 Thread Matt
if the "header file is read in by php" means that it is an include,
that doesnt matter
it is the form of the URL that the user_agent requests that matters

so say the user_agent requests index.php, then that php file includes
header.html
and that the resulting HTML is something like




the user_agent will make a GET request to the server of

http://2ndlevel.example.com/styles/stuff.css
http://2ndlevel.example.com/scripts/stuff.css

which will be picked up by your rewrite rule and will become

http://2ndlevel.example.com/index.php?st=styles&sc=stuff.css

so either your index.php must know how to send the appropriate
content-type header (and other headers: caching, etag, etc...)
or you must adjust the conditions under which the rewrite rule will
fire to prevent such content from  being handled by your script.

Usually you only want to redirect non-existent-directories and
non-existent-files to your index,php handler, so you can do this using

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^/]+)/([^/]+) /index.php?st=$1&sc=$2 [NC]


or by adjusting your regular expression to be more specific, either to
only include certain URLs,  or to exclude certain URLs, the choice is
yours, but at present your ([^/]+) is insufficent, as it only looks at
structure of the URL, not whether the specific resource should be
passed via the script, so for instance it would redirect

http://2ndlevel.example.com/blah/'%20OR1=1
to
http://2ndlevel.example.com/index.php?st=blah&sc='%20OR1=1

which might not be what you are expecting.

I would certainlu concentrate on whitelisting in your URL rewriterule,
being quite specific (more specific than just checking for
nonexistence) and then be double sure your php file only handles
legitimate types of request, because now you are shortcircuiting some
of the hard won apache handling with your own code.

you could for instance do

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^([^/]+)/\.(css|html?|js)$ /index.php?st=$1&sc=.$2 [NC]

which still requires filtering but only acts on URLs that end with
certain file extensions.

Hope that helps.



On Wed, Jul 23, 2008 at 10:48 PM, Skip Evans <[EMAIL PROTECTED]> wrote:
> Hey all,
>
> I'm new to the list and am having some issues with a RewriteRule I've
> applied in an .htaccess file. Or perhaps not the rule, but with using
> .htaccess in general.
>
> What I wanted to do was allow users to enter a URL like the following:
>
> http://varsitybeat.com/wi/madison
>
> and then have my PHP/MySQL application receive this URL in the index.php
> file, and then get the wi and madison values from the $_GET array.
>
> To do this I have the following in the .htaccess file.
>
> Options +FollowSymlinks
> RewriteEngine on
> RewriteRule ^([^/]+)/([^/]+) /index.php?st=$1&sc=$2 [NC]
>
> And this is successful in accomplishing the goal.
>
> In the index.php file I can use
>
> $_GET['st'] to get 'wi', and $_GET['sc'] to get madison, if someone enters
> the URL
>
> http://varsitybeat.com/wi/madison
>
> into their browser. The problem I have now, though, and that really
> surprises me, is that if this .htaccess file is in place, the application no
> longer picks up its style.css (cascading style sheet), or the JavaScript
> AJAX files, which are included in a header.html file that index.php reads
> in.
>
> How exactly the style sheet and JS files are read in is not anything
> unusual, just the regular syntax in the  section of an html file.
>
> But the main point is that when the .htaccess file is in place, they are not
> accessed, and when it is not they are.
>
> Can anyone direct me where to begin researching this kind of issue? I'm at a
> bit of a loss where to begin.
>
> Thanks!
>
> --
> Skip Evans
> Big Sky Penguin, LLC
> 503 S Baldwin St, #1
> Madison, WI 53703
> 608-250-2720
> http://bigskypenguin.com
> =-=-=-=-=-=-=-=-=-=
> Check out PHPenguin, a lightweight and versatile
> PHP/MySQL, AJAX & DHTML development framework.
> http://phpenguin.bigskypenguin.com/
>
> -
> The official User-To-User support forum of the Apache HTTP Server Project.
> See http://httpd.apache.org/userslist.html> for more info.
> To unsubscribe, e-mail: [EMAIL PROTECTED]
>  "   from the digest: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]
>
>



-- 
Matthew Farey
w: +44(0)208 4200200 (ext 2181)
bb: +44(0)7500802481
m: +44(0)7773465550
(sms to my laptop): +44(0)7917368497

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
   "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[EMAIL PROTECTED] Bulk SSL certificate purchases

2008-07-23 Thread Ben Spencer
We are looking to purchase a bulk number of SSL certificates for a variety
of sites. As we were discussing this with our current SSL certificate
provider, we ran into something which sounds a little odd, and wonder if
others have run into this with their bulk SSL certificate purchases.

When the individual SSL certificate is purchased, we need to specify how
many "hosts" are involved. Simple when dealing with standard apache sites.
Add a load balancer in front of things, and this is where we find things a
little odd as we would end up paying for a cert for each of the back end
servers -- even if the SSL cert is only on the front end load balancer.

Example 1:
If there is a load balance with three hosts behind it. The load balancer is
the only one with an SSL cert, we need to specify 3 hosts when we buy the
cert and end up paying 3 times a single cert cost. Say the cert costs $100.
We would end up paying $300 for the SSL cert for the load balancer because
there are 3 servers serving the site as $100 a pop.

Example 2:
Say we have 3 sites on the same domain (prod.domain.com, test.domain.com,
dev.domain.com) and all three happen to run through the load balancer/proxy
with prod having 2 back end servers and test & dev each having 1 server
(which might simply be different apache virtual host on the same physical
host), we would either need:
  Prod: 2 hosts (load balanced)
  Test: 1 host (proxy only)
  Dev : 1 host (proxy only)

The cost of four SSL certs would be needed (4 x $100 = $400). Should we have
chosen to use a wildcard cert (*.domain.com) at $200 a cert, we still would
need the cost of 4 wildcard certs ($800 total).

Does this seem the standard pricing for the industry?

Benji Spencer
System Administrator

Moody Bible Institute
Phone: 312-329-2288
Fax: 312-329-8961




smime.p7s
Description: S/MIME cryptographic signature


Re: [EMAIL PROTECTED] .htaccess and PHP

2008-07-23 Thread Skip Evans

Hey Matt,

(I just sent you the message off list, but now 
rereading this again, I'm starting to understand.)


I see that I'm affecting all the URLs, including 
the ones the app is initiating and that's what's 
breaking stuff.


But what if I want the rule to ONLY take affect 
when the URL ends with a '/' char, as in the case of


http://varsitybeat.com/wi/madison/

That's the only time I need the rule to kick in, 
when they give me a city and school name on the 
URL, and this is also the only time a URL will end 
 with a '/'.


What would you change on this one?

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^/]+)/([^/]+) 
/index.php?st=$1&sc=$2 [NC]


...which seems closest yet, to only make it apply 
the URL to URL's ending in '/' ?


Thanks,
Skip





Matt wrote:

if the "header file is read in by php" means that it is an include,
that doesnt matter
it is the form of the URL that the user_agent requests that matters

so say the user_agent requests index.php, then that php file includes
header.html
and that the resulting HTML is something like




the user_agent will make a GET request to the server of

http://2ndlevel.example.com/styles/stuff.css
http://2ndlevel.example.com/scripts/stuff.css

which will be picked up by your rewrite rule and will become

http://2ndlevel.example.com/index.php?st=styles&sc=stuff.css

so either your index.php must know how to send the appropriate
content-type header (and other headers: caching, etag, etc...)
or you must adjust the conditions under which the rewrite rule will
fire to prevent such content from  being handled by your script.

Usually you only want to redirect non-existent-directories and
non-existent-files to your index,php handler, so you can do this using

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^/]+)/([^/]+) /index.php?st=$1&sc=$2 [NC]


or by adjusting your regular expression to be more specific, either to
only include certain URLs,  or to exclude certain URLs, the choice is
yours, but at present your ([^/]+) is insufficent, as it only looks at
structure of the URL, not whether the specific resource should be
passed via the script, so for instance it would redirect

http://2ndlevel.example.com/blah/'%20OR1=1
to
http://2ndlevel.example.com/index.php?st=blah&sc='%20OR1=1

which might not be what you are expecting.

I would certainlu concentrate on whitelisting in your URL rewriterule,
being quite specific (more specific than just checking for
nonexistence) and then be double sure your php file only handles
legitimate types of request, because now you are shortcircuiting some
of the hard won apache handling with your own code.

you could for instance do

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^([^/]+)/\.(css|html?|js)$ /index.php?st=$1&sc=.$2 [NC]

which still requires filtering but only acts on URLs that end with
certain file extensions.

Hope that helps.



--
Skip Evans
Big Sky Penguin, LLC
503 S Baldwin St, #1
Madison, WI 53703
608-250-2720
http://bigskypenguin.com
=-=-=-=-=-=-=-=-=-=
Check out PHPenguin, a lightweight and versatile
PHP/MySQL, AJAX & DHTML development framework.
http://phpenguin.bigskypenguin.com/

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: [EMAIL PROTECTED]
  "   from the digest: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]