So I’ve been reading about the Apache Killer attack. It works by misusing an HTTP feature, the Range request, that lets a browser download just a segment of a document or file. That’s handy if only part of a file has been downloaded, and download accelerators sometimes exploit it to grab several chunks of a file at once. In theory the user gets a page or big file faster, but it hammers the server. Apache Killer takes it to an extreme: a single request crammed with hundreds of overlapping byte ranges, which Apache burns a pile of memory trying to assemble.
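
For the curious, the whole thing hangs off one header. Here’s a normal resume-a-download request next to roughly what the attack sends; the URL and the exact ranges are made up for illustration:

    A normal partial request (say, resuming after the first megabyte):

        GET /big-file.iso HTTP/1.1
        Host: example.com
        Range: bytes=1048576-

    Roughly the attack’s version, with hundreds of overlapping ranges in one header:

        GET /big-file.iso HTTP/1.1
        Host: example.com
        Range: bytes=0-,5-0,5-1,5-2,5-3,...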

One fix was to simply block range requests entirely, but that broke the site.
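
If you want to go that route, it’s basically two lines with mod_headers loaded; a minimal sketch (the directives are real Apache ones, and nuking the header outright is exactly the part that bit me):

    # Strip the Range headers so Apache never serves partial content
    RequestHeader unset Range
    # Request-Range is a legacy alias some very old clients still send
    RequestHeader unset Request-Range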

Another option was to cap how many ranges a single request can ask for. That seems to work OK.
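
What I’m running is along the lines of the workaround Apache published for this bug (CVE-2011-3192): if a request’s Range header lists more than a handful of ranges, drop the header and just serve the whole file. The 5 below is the advisory’s suggestion, not something I tuned:

    # Flag requests whose Range header contains more than 5 ranges
    SetEnvIf Range (?:,.*?){5,} bad-range=1
    # For flagged requests, drop the header and return a normal full response
    RequestHeader unset Range env=bad-range
    # Optional: log the offenders to see who's doing it
    CustomLog logs/range-CVE-2011-3192.log common env=bad-range

Newer Apache builds (2.2.21 and up) also have a MaxRanges directive that caps this directly, e.g. MaxRanges 5.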

Better than OK, actually. Before I set the limit my system was using just over 4 GB of installed memory (out of 8). I’d had to increase the memory because it had been pushing 4 GB before the upgrade and had crashed several times. After setting the limit, memory use is under 2 GB. AND the site is a whole lot snappier. Like it was months ago.

I’m gonna blame download accelerators.