mailing list archives
Re: Securing a webserver through reverse proxy?
From: "Adam McCarthy" <adam () blackox net>
Date: Wed, 19 Feb 2003 13:26:34 -0600
> I've read about a way to secure webservers, which must not be directly
> exposed to the Internet, using a reverse proxy, e.g. MS ISA Server or
> Squid on a UNIX box.
I have yet to use ISA Server for this, but using Squid is an *excellent*
method for keeping the actual HTTP server isolated from public access.
> Now my question would be: Has anyone any experience with that? Is it
> really more secure (compared to firewalling and port forwarding)? Does
> the MS ISA Server web publishing rule (which implies reverse caching)
> do application-layer filtering, or does it just do the mentioned
> caching? Can a Squid reverse proxy solution fulfill that too?
Again, I'm not sure about ISA, but using Squid has been a very nice option.
Basically, a layout with a number of Windows boxes running IIS as the actual
content/web servers, connected over private IP addresses to a public-facing
Squid server, has been an excellent solution. Of course, this should not be
your only means of securing the network, but since your web services are
probably public, this extra step should help you out tremendously.
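To make that layout concrete, here is a minimal squid.conf sketch using the
Squid 2.x accelerator directives. The backend address 10.0.0.10 is a
hypothetical private IP, not something from the original posts; adjust it to
your own internal IIS server.

```
# squid.conf -- minimal accelerator (reverse proxy) sketch, Squid 2.x style.
# 10.0.0.10 is an example private backend address (an assumption).

http_port 80                      # public side listens on port 80
httpd_accel_host 10.0.0.10        # private address of the IIS backend
httpd_accel_port 80               # port the backend listens on
httpd_accel_single_host on        # a single backend server
httpd_accel_uses_host_header off  # single host, so ignore the Host: header
httpd_accel_with_proxy off        # act as an accelerator only, no forward proxying
```

With httpd_accel_with_proxy off, Squid will refuse ordinary forward-proxy
requests and only pass web traffic through to the one internal server.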
> If not, what are the steps necessary to accomplish this?
I'm not sure of the exact meaning of this question...maybe it's worded wrong.
Here is a link to a good SANS article that covers the general scope and usage
of reverse proxy implementation, with a brief overview of how to implement it
with Apache. Adjusting the process described there to use IIS as the backend
web server is really not that difficult.
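For the Apache variant the article describes, the core of the setup is a
couple of mod_proxy directives. This is just a sketch; 10.0.0.10 is a
hypothetical private backend address, and mod_proxy plus mod_proxy_http must
be compiled in or loaded.

```
# httpd.conf fragment -- reverse-proxying all requests to an internal
# IIS server at 10.0.0.10 (example address, an assumption).

ProxyRequests Off                     # disable forward proxying
ProxyPass        / http://10.0.0.10/  # forward every request to the backend
ProxyPassReverse / http://10.0.0.10/  # rewrite Location: headers in redirects
```

ProxyRequests Off matters here: without it, the public Apache box could be
abused as an open forward proxy.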
If you need more detailed information or advice, I can be contacted directly
and we can go over a setup more specific to your environment. Personally I
prefer Squid since this is what I have used for this exact type of scenario
and the cost comparison to licensing an ISA server...well Squid is free so
you can't beat that.
adam () blackox net
> Your input is appreciated.