nanog mailing list archives

Re: decreased caching efficiency?
From: Daniel Senie <dts () senie com>
Date: Fri, 20 Oct 2000 10:36:43 -0400

Several folks have made sweeping statements that website
owners/designers must make changes to live with caches, or their sites
will suffer. One problem with that statement is that there are LOTS of
caches, and each has its own idiosyncrasies. Further, they can be
locally tuned. Bill Simpson already stated publicly that he tunes his
caches to ignore anything that says it isn't to be cached, for example.
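For context, here is a minimal sketch (the helper name is hypothetical; the header values are standard HTTP/1.0 and 1.1 cache-control semantics) of the response headers a site owner would send to ask that content not be cached. A cache tuned the way described above simply ignores these directives:

```python
def no_cache_headers():
    """Hypothetical helper: response headers asking caches not to store
    or reuse the content. A locally tuned cache may ignore all of them."""
    return {
        "Cache-Control": "no-store, no-cache, must-revalidate",  # HTTP/1.1
        "Pragma": "no-cache",  # for older HTTP/1.0 caches
        "Expires": "0",        # treat the response as already stale
    }

headers = no_cache_headers()
```

The point is that these are advisory: nothing in the protocol forces an intermediary cache to honor them, which is exactly the problem for the content provider.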

The whole mess reminds me a great deal of route filtering. Someone,
somewhere might filter your route from their BGP feed. This might cause
a large number of people to have no access to content provided by a
server farm somewhere. How's the server owner (or farm owner) to know?
They don't, until some prospective user sends a note that a site is
"always down."

Technologies which mess with the end-to-end nature of the 'net, and
especially which are transparent to end users and far from content
providers, invariably make the 'net less useful and less reliable.

Daniel Senie                                        dts () senie com
Amaranth Networks Inc.                    http://www.amaranth.com
