The short version

We put an nginx caching proxy server in front of our WordPress MU install and sped it up dramatically, in some cases a thousandfold. I've packaged up a plugin, along with installation instructions, here: WordPress Nginx proxy cache integrator.

The long version

Here at blogs.law.harvard.edu, our WordPress MU install was having problems. We get a fair amount of traffic (650k+ visits/month); combine that with bots (good and bad) and we were in serious trouble. RSS feeds (we serve many, from some pretty prominent blogs) are expensive to create, files are gatewayed through PHP (on WPMU), and letting PHP dynamically create each page meant we were VERY close to maxing out our capacity, which we frequently did, bringing our blogs to a crawl.

WordPress, as lovely as it is, needs some kind of caching system in place once you start to see even moderate levels of traffic. There are many, many high-quality and well-maintained caching options; however, none of them really made me happy or fit my definition of the "holy grail" of how a web app cache should work.

In my mind, caching should:

  1. be high performance (Digg and Slashdot proof)
  2. be lightweight
  3. be structured to avoid invoking the heavy application frameworks it sits in front of. If you hit your app server (in this case, WordPress), you've failed.
  4. be as unobtrusive as possible: caching should be a completely separate layer that lives above your web apps
  5. have centralized and easily tweaked rules
  6. be flexible enough to work for any type (or amount) of traffic.

So I decided to put a proxy in front of WordPress to statically cache as much as possible. ALL non-authenticated traffic is served directly from the nginx file cache, taking some requests (such as RSS feed generation) from 6 pages/second to 7,000+ pages/second. Oof. Nginx also handles logging and gzipping, leaving the heavier backend apaches to do what they do best: serve dynamic WordPress pages only when needed.
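To give a sense of what that looks like on the nginx side, here is a minimal, hypothetical sketch of this kind of setup. The cache path, zone name, backend port, timings, and cookie check are placeholders of my own; the plugin ships its own tuned configuration, which is the real reference.

  # Hypothetical sketch; paths, zone name, port, and timings are placeholders.
  events { worker_connections 1024; }

  http {
    # Shared-memory key zone plus an on-disk cache for rendered pages
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=wpcache:10m
                     max_size=1g inactive=10m;

    # Skip the cache only when a WordPress login cookie is present
    # (real cookie names look like wordpress_logged_in_<hash>)
    map $http_cookie $wp_skip_cache {
      default               0;
      ~wordpress_logged_in  1;
    }

    gzip on;   # nginx handles compression instead of the backend apaches

    server {
      listen 80;
      access_log /var/log/nginx/access.log;   # nginx handles logging too

      location / {
        proxy_pass         http://127.0.0.1:8080;   # backend apache running WordPress MU
        proxy_set_header   Host $host;
        proxy_cache        wpcache;
        proxy_cache_valid  200 302 10m;              # cache good responses briefly
        proxy_cache_bypass $wp_skip_cache;           # logged-in users go to the backend
        proxy_no_cache     $wp_skip_cache;           # and their responses aren't stored
      }
    }
  }

The key idea is the cookie check: anonymous visitors never touch PHP at all, while logged-in users sail straight through to the backend.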

A frontend proxy also handles "lingering closes": clients that fail to close a connection, or that take a long time to do so (say, for instance, because they're on a slow connection). Taken to an extreme, lingering closes act as what is known as a "slowloris" attack, and without a frontend proxy your heavy apaches are left tied up. With a lightweight frontend proxy, you can handle more connections with less memory. Throw a cache in the mix and you can bypass the backend entirely, giving you absolutely SILLY scalability.
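As a rough illustration (a fragment that would sit inside the http block, with arbitrary values rather than anything we run in production), the relevant knobs are nginx's response buffering and its client-side timeouts: nginx soaks up the backend's response immediately so the apache worker is freed, then feeds it to the slow client itself, and the timeouts cap how long a dawdling connection can hang around.

  # Illustrative values only; the point is that nginx, not apache, babysits slow clients.
  server {
    listen 80;

    # Cap how long a slow or misbehaving client can hold a connection open
    client_header_timeout 10s;
    client_body_timeout   10s;
    send_timeout          10s;
    keepalive_timeout     15s;

    location / {
      proxy_pass      http://127.0.0.1:8080;

      # Buffer the whole backend response up front so the apache worker is
      # released immediately, then dribble it out at the client's pace.
      proxy_buffering on;
      proxy_buffers   8 16k;
    }
  }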

On nginx: it's so efficient it's scary. I've never seen it use more than 10 to 15 MB of RAM and a blip of CPU, even under our heaviest load. Our Ganglia graphs don't lie: we halved our memory requirements, doubled our outgoing network throughput, and completely leveled out our load. We have had basically no problems since we set this up.

To make a long story short (too late), I packaged this up as a plugin along with detailed installation and configuration info. Check it out! Feedback appreciated: WordPress Nginx proxy cache integrator.