nanog mailing list archives

RE: decreased caching efficiency?
From: woods () weird com (Greg A. Woods)
Date: Fri, 20 Oct 2000 18:54:53 -0400 (EDT)

[ On Friday, October 20, 2000 at 11:59:58 (-0700), Travis Grant wrote: ]
Subject: RE: decreased caching efficiency?

> Where does caching make sense? Static assets that can be stored on the edge
> closest to customers, i.e. images, audio, video. These needs are served by
> your Akamais, IBeams, and Evokes.

Certainly content distribution makes sense for really busy sites that
have significant quantities of such content to distribute, but that's
only the sending half of the picture.  ISPs will continue to deploy
transparent cache servers whenever they can justify the savings (be it
in raw bandwidth costs alone, or in combination with softer savings in
helping manage user expectations, etc.)

When 10% efficiency at peak loads is 10% less bandwidth you buy and your
bandwidth costs are high enough that 10% means a free mid-range server
every month, you can afford to do a *lot* of caching.  10% of $2k/month
isn't always a worthwhile savings given the potential costs of achieving
it, but 10% of $25k/month makes the CFO take notice!

Turn that into more like $80,000/month (as would be the case for an ISP
on the far side of the globe that's trying to justify a fast low-latency
terrestrial connection) and you'll really be seeing the full picture!

> One other place where it makes sense to cache:
> in your corporate environment.

Yup, though there the cache is just a free side-effect of running a
proxy server that you've got to run in the first place.  If you use
Squid or some equivalent instead of just a raw NAT or other non-caching
proxy server then all you need is a bit more disk and a bit more memory
and you're instantly seeing at least some savings.
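For the curious, a minimal squid.conf fragment for such a caching proxy
might look like this (the directive names are Squid's own; the port and
the disk/memory sizes are just example values, not recommendations):

```
http_port 3128                               # standard proxy port
cache_mem 128 MB                             # "a bit more memory" for hot objects
cache_dir ufs /var/spool/squid 8192 16 256   # "a bit more disk": ~8 GB of cache
```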

Note that a transparent cache running in a corporate proxy server can
usually achieve much higher savings than any such cache in an average ISP
environment (I've personally seen 50% on enough occasions to prove that
it's actually reducing peak bandwidth needs by at least 30%!).

Not only that, but remember that the corporate gateway cache can often be
peered with the ISP's cache, further improving things noticeably....
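In Squid that peering is a one-line affair; the hostname and ports below
are placeholders, and the ISP has to permit ICP queries from your gateway:

```
# Parent the corporate gateway cache off the ISP's cache.
# (cache.isp.example is a placeholder; 3128 = HTTP port, 3130 = ICP port)
cache_peer cache.isp.example parent 3128 3130
```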

All this really means, BTW, is that cache-ignorant webmasters will be
forced to learn or lose, and the more they learn the more we all save!

So, yes, there might be a lot of newbie webmasters out there creating
uncacheable content these days, but they will be forced to learn that
they're cutting off their noses to spite their faces!
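What those webmasters have to learn is mostly a matter of sending sane
response headers.  A sketch of what a cache-friendly response might carry
(the dates and lifetime here are arbitrary examples):

```
HTTP/1.1 200 OK
Date: Fri, 20 Oct 2000 18:54:53 GMT
Last-Modified: Mon, 16 Oct 2000 09:00:00 GMT
Expires: Sat, 21 Oct 2000 18:54:53 GMT
Cache-Control: public, max-age=86400
Content-Type: image/gif
```

Leave out the validators and the expiry information (or send
"Pragma: no-cache" when you don't really mean it) and every cache between
you and the user is forced to fetch the object fresh each and every time.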

> And by the way saving bandwidth is not justified for the majority of the
> market. 80% of deployed sites are sucking less than 2Mbps in monthly fees.
> Most caching implementations will cost way more than the bandwidth costs
> they avoid.

Not if you're a hosting provider and you aggregate those savings over
many customers!  :-)

A pair of Squid machines running as accelerators in front of a farm of
virtual hosting servers will work wonders at reducing the load on those
back-end servers.  You still charge your customers based on actual
traffic out your pipe (from the Squid logs), but now you can squeeze
more small sites onto fewer, bigger, and more efficient (cheaper to run)
servers.
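Under Squid 2.x that accelerator setup is only a handful of directives;
the back-end hostname here is a placeholder, not a real machine:

```
# Squid 2.x as an HTTP accelerator in front of a virtual-hosting farm
http_port 80                              # listen where the web servers used to
httpd_accel_host backend.example.net      # placeholder for the real back-end
httpd_accel_port 80
httpd_accel_uses_host_header on           # required for name-based virtual hosts
```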

Furthermore since the savings granted by transparent caches at the
last-hop provider (or corporate gateway) are literally "free" for the
hosting provider, they're absolutely a major benefit!

                                                        Greg A. Woods

+1 416 218-0098      VE3TCP      <gwoods () acm org>      <robohack!woods>
Planix, Inc. <woods () planix com>; Secrets of the Weird <woods () weird com>
