As mentioned in a previous blog post, there are some downsides to the popularity of WordPress (WP). There are millions of WP sites out there with vulnerabilities that make them targets for malicious hits from bad actors and bots. Robotic spammers fill the comments sections with links in an attempt to boost the rankings of their clients’ scam websites. Hackers try to log in and take over the site, or to slip in through back doors in flawed plugins or themes and inject their own viral code to redirect viewers to their sites or launch further attacks on other unwitting sites and networks.
I’m sad to say that after recent experience I believe that perhaps over 70% of internet traffic is garbage or downright evil. I am not talking about the junk we look at willingly (sorry LOL cat aficionados), but spam and malware distributed by bots.
The first order of business was to prevent the slowdowns which were beginning to occur. I moved all the sites to a Virtual Private Server (VPS) and upped the RAM allocation until server resets due to memory overload were no longer a problem. Since I pay by the megabyte for server memory, it is in my interest to reduce the load as much as I can so that I can set that RAM allocation as low as possible.
Next was the security issue. The measures I took in that direction are detailed in a previous article. Blocking most of the spammers and hackers significantly reduced hits on the sites, but occasional overloads still happened when they were hammering away. Some of the security tools also consume server memory when they are at work blocking the bad guys.
Streamlining the defence system required some tweaking. I used the following plugins.
Counterize is a handy traffic-monitoring plugin. It’s very useful for seeing where your web hits are coming from and what visitors look at before they leave. You can even see what keywords they were searching for when they found your site. All that tracking uses server processing power, so turn off any functions you are not using. The country-of-origin and IP trackers were useful for identifying repeat offenders; looking over the Counterize logs, I made a list of malicious IPs.
Bulletproof Security (BPS) is a well-respected plugin that identifies malicious code and viruses when a client queries the server. But this still requires WP to do some work, so analyzing and blocking dozens of hits per minute also slowed things down. BPS allows you to put code in the .htaccess file (under the Custom Code tab). Those directives are executed at the server level and don’t require the WP install to run PHP, so this is where I put my list of bad IPs. While I was at it I added some speed-boosting code, code to protect user information, and a bit more bug screening, along the lines sketched below. (Note to BPS: these features should be clickable options. The whole configuration process is needlessly complicated.)
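To give a sense of what that custom code can look like, here is a minimal sketch of the kind of .htaccess directives I mean. The IP addresses are placeholders, the speed boost assumes mod_expires is available on your server, and newer Apache versions may prefer the Require syntax over Order/Deny; treat this as an illustration, not a drop-in replacement for what BPS generates.

    # Block repeat offenders found in the traffic logs (example IPs only)
    Order Allow,Deny
    Allow from all
    Deny from 203.0.113.45
    Deny from 198.51.100.0/24

    # Speed boost: let browsers cache static assets instead of re-fetching them
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

    # Protect user information: nobody needs to fetch wp-config.php directly
    <Files wp-config.php>
        Order Allow,Deny
        Deny from all
    </Files>

Because Apache applies these rules before WordPress is ever invoked, a blocked visitor costs almost nothing in PHP or database time.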
Bad Behavior (BB): By default BB blocks spammers and logs information about them. Once I was confident the security measures I had taken were working, I turned off the logging function to speed things up.
WP sites are by their nature very demanding of a server, because WP uses PHP to dynamically build each page every time a viewer requests it. For this reason caching is recommended to take some of the load off. First, I turned on PHP caching for the whole server in Dreamhost’s control panel. But each WP install also needs its own caching plugin, which produces static files that are much faster to serve.
I first tried WP Super Cache. It did the job, but because it also writes to the .htaccess file there was some conflict with BPS. I finally settled on Zen Cache, which is recommended by the makers of my favourite theme (Weaver II Pro), and it works just fine.
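For the curious, the static-file trick boils down to rewrite rules along these lines. This is a simplified sketch of the general idea rather than the actual rules WP Super Cache or Zen Cache install, and the cache path shown is just a placeholder: if a saved HTML copy of the requested page exists, Apache serves it directly and PHP never runs.

    <IfModule mod_rewrite.c>
        RewriteEngine On
        # Only serve cached copies for plain GET requests with no query string
        RewriteCond %{REQUEST_METHOD} GET
        RewriteCond %{QUERY_STRING} ^$
        # Logged-in users should always see freshly generated pages
        RewriteCond %{HTTP_COOKIE} !wordpress_logged_in [NC]
        # If a static copy of this URL exists in the cache folder, hand it out
        RewriteCond %{DOCUMENT_ROOT}/wp-content/cache/pages%{REQUEST_URI}index.html -f
        RewriteRule .* /wp-content/cache/pages%{REQUEST_URI}index.html [L]
    </IfModule>

The plugin’s real job, then, is mostly to write those static files when a page is first generated and to clear them out when the content changes.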
With the above methods, server crashes were eliminated, but analysis of the traffic showed that occasional bursts of activity were still causing slowdowns. IP lookups revealed these to be coming from Bingbot and Googlebot, as well as other search engines, business directories, and link aggregators. We don’t want to block legitimate search engines from crawling our pages, so how do we keep their hits to a minimum and lock out the crawlers that don’t do us any good? I’ll address those questions in a future article.