Posts tagged help

Best Security Practices for WordPress

Don't look surprised when your WordPress site gets hacked. Is WordPress security really that weak?

No, not really; it's just popular, and as such there are more vectors to attack its security. Still, as a heavy WordPress user I can give you some good tips to keep your site secure. I'll divide this into setting up the site and securing the site; the first you should only need to do once, the second is best treated as ongoing. I'm also assuming that your server is set up correctly and secured, that your WordPress install is up to date, and that your own computer is secure as well. If those are good, then what I say below will keep you 99% safe!

Secure on Setup

On wp-config.php, when you install, set $table_prefix = 'whateveryouwant'; to a random string! This will hamper SQL injection attacks that assume the default wp_ table prefix (if WordPress is already installed, use something like phpMyAdmin to go into the database and rename the tables with the new prefix, then update it in the wp-config.php file).
On wp-config.php, under define('WP_DEBUG', false); put define('DISALLOW_FILE_EDIT', true); – This prevents editing of PHP files from inside WordPress. Most people don't edit them there anyway (I just log in with SFTP and edit directly), so people trying to exploit the site will have a harder time doing so (both wp-config.php changes are sketched after this list).
Use a strong password – I know it's silly to say, but a strong unique password with lots of letters, numbers, and special characters is always a good thing.
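
For illustration, here is roughly what those two wp-config.php changes look like; the prefix value below is just a made-up example, pick your own random string:

$table_prefix = 'k7x2q_'; // anything but the default wp_

define('WP_DEBUG', false);
define('DISALLOW_FILE_EDIT', true); // removes the built-in theme/plugin editor from the dashboard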

On First Login

Login with your default admin account, create a new account with admin privileges, and then delete the old admin account – This prevents login requests or brute-force attacks that target user ID 1 or the default admin username (if you prefer the command line, see the optional WP-CLI sketch after this list).
Disable user registration – Go to the options panel and disable user registration. If you don't intend for other users to post, there is no point in allowing registration.
Install only the Plugins you Need – Even if disabled, only keep plugins and themes that you actually use; unused ones can still be used as a way into your site if they have vulnerabilities.
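
For those comfortable with the command line, here is a rough sketch of the admin account swap using WP-CLI (assuming WP-CLI is available on your server; the usernames and password are placeholders):

wp user create newadmin newadmin@example.com --role=administrator --user_pass='a-long-random-password'
wp user delete admin --reassign=$(wp user get newadmin --field=ID) --yes

The --reassign flag hands the old admin's posts over to the new account so nothing is lost.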

Security Plugins to Install

BruteProtect or Login LockDown – To prevent login attempts and brute-force attacks (or alternatively, find a two-step authentication plugin).
Install a Clean Theme – Make sure you get a nice free theme from WordPress.org or a paid one from a reputable provider, and keep it up to date. The more complex the theme, the more likely it is to contain code that becomes insecure over time, so get a good one and keep it updated.
Advanced Automatic Updates – Will keep your WordPress install and plugins up to date!
Akismet – It comes with WordPress for a reason; before it, WordPress comments were horrible and plagued with tons of spam.

Extra!

Please, pleeease make backups. Don't trust your web host, make your own; that's the only true way of being 100% secure. Use a plugin for it; I like BackUpWordPress and Keep Backup Daily, but any one you like will do!
Use Cloudflare or Incapsula – These give plenty of extra features, like a CDN, but they also filter your traffic and protect it from a lot of nasty stuff on the web.
Wordfence or Better WP Security – If you want heavier security; it's totally optional, and in my opinion if you are already well locked down they don't add much!
Use .htaccess to lock down wp-admin if you are the only user; search for this on Google, plenty of sites explain it (a minimal sketch follows this list).
Use the WordPress Jetpack plugin – It protects you from some security flaws, helps with automatic plugin installs, plus a ton of other things.
Use MxToolbox or Sucuri SiteCheck to check whether your site has been exploited!
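
Here is a minimal sketch of that wp-admin lockdown, assuming Apache with 2.2-style access directives and that 203.0.113.5 is a placeholder for your own IP; it goes in an .htaccess file inside the wp-admin directory:

# wp-admin/.htaccess: allow only your own IP into the dashboard
Order deny,allow
Deny from all
Allow from 203.0.113.5

Keep in mind some plugins call wp-admin/admin-ajax.php from the public side of the site, so you may need to allow that file separately.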

The best rule of all is to be prepared for the worst: have backups and check from time to time that your site is up to date and everything is running fine. Most of this is automated, but it's best to always keep an eye on things, and if everything breaks, just clean everything out and restore a backup 🙂

How to Protect your Sites

Well, one of my sites was taken down for a couple of hours after it was completely screwed by a hack (well, by script kiddies, but still) that deleted admin accounts and posts and added redirects and other nasty stuff. Cleaning it up by hand would have meant several hours, and some things might have been lost forever anyway. So what do you do before this happens, while it is happening, and afterwards to fix it? Here is what I do to keep my sites online and protected, separated into 3 major points:

Preventive Protection (before any problem)

  • Always have the latest updates to your online software; yes, I know updates sometimes bring new bugs, but most of the time it's better to take the time to find workarounds and still update to the latest version than to leave yourself open to an attack;
  • Always have multiple backups; all my hosts keep backups, but I also make my own to other servers (weekly) as well as to my own computers (monthly). This ensures that even if there is a catastrophic failure (your host dies on you or deletes your account), you are still able to bounce back pretty quickly;
  • Make sure your hosting is separate from your domain registrar, since keeping the two together means that if you ever need to jump to another host you will always have problems (also keep a backup host that you like and trust, so you can jump to it quickly if need be);
  • Use popular software; yes, it might be a bigger target for hacks and security issues, but the chance of getting updates and fixes quickly is also much larger;
  • Resilient hosting; it doesn't need to be cloud hosting or some strange arrangement, it just needs to come from a good hosting company with a good track record. That ensures most hardware/server failures will never happen, and if they do, that a fix will come quickly and efficiently.

Immediate Protection (when you first detect the problem)

  1. Put the site offline; if you are on an Apache server this normally means a change to the .htaccess/.htpasswd files (a minimal sketch follows this list). You don't want your users getting affected by your compromised site;
  2. Check how the site was compromised; was it the server, a bad admin, a software flaw? Try to find out exactly how this happened;
  3. Once you find the flaw, see if there is a fix for it (a server/software update, banning an admin, whatever it is), because after you clean up you need to make sure it doesn't happen again.
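
As an illustration of step 1, here is a minimal "site offline" sketch for an Apache .htaccess in the site root, again using 2.2-style directives; 203.0.113.5 is a placeholder for your own IP so you can still get in and investigate:

ErrorDocument 403 "Site temporarily offline for maintenance"
Order deny,allow
Deny from all
Allow from 203.0.113.5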

Reactive Protection (how to fix the problem)

  • The best way is always to just delete the whole site and bring back the latest clean backup; sure, you will lose some content or news, but you have a guarantee that your site comes back crisp and clean. Fixing it by hand means you can miss something and your site stays compromised;
  • Make a test run and check that everything is alright; make the necessary adjustments before bringing the site back online;
  • Fix the security issue; if you found out what the problem was, go ahead and apply the updates or workarounds so this doesn't happen again;
  • Make a brand new backup immediately before bringing the site back online; this ensures that if the site is still vulnerable, you can bring it back up quickly without much loss.

So that's it. Yes, I know it's basically just using backups, and yes, there are other ways, but this is the easiest and most efficient way to protect your site from premature death ^_^

How to Deal with Web Scraping

Hummmm, since I have several galleries, one thing I encounter often is web scrapers. Personally I don't mind if anybody takes an entire gallery home or even re-posts it somewhere else; that is fine with me, this is the web. If I wanted the gallery to be private I would have made it so; if it's public and free, then go right ahead…

However, the content itself is not the problem. The problem is that the vast majority of web scrapers ship with bad default settings, or their users configure them too aggressively; it's not uncommon for the server load to go from 0.10 to 1 in a heartbeat, or for the server to go down entirely. I know it's partly my fault, because I like to restrict the server and the software as little as possible (I could use several methods to throttle connections or to ban IPs when there are too many), but because I don't, sometimes I get into trouble. So this is what I normally do.

First of all, I have this in the .htaccess (mod_rewrite), which helps block most scraping software (unless it spoofs itself as a browser hehehe):

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]

I monitor the load on the server. If the load spikes for more than a couple of minutes, I check the Apache log, and if there are lots of connections to the same site from the same IP, I ban that IP with .htaccess by adding these lines:

Order allow,deny
Deny from 100.100.100.100
Allow from all

(100.100.100.100 is the IP from the logs.) Then I check the load again after a couple of minutes. If it's back down, fine; if they jumped IPs, I'll do one of two things. If they keep to the same IP range, I'll just block that whole range, like so:

Order allow,deny
Deny from 100.100.100.
Allow from all

If they aren't, then I limit the simultaneous connections to 10; I know it will hurt all users, but it's better than nothing. One caveat: MaxClients is a server-wide MPM setting that belongs in the main Apache configuration, not in a per-directory .htaccess, so this line caps the whole server rather than just the cache or image directory:

MaxClients 10
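
If you only want to throttle the cache or image directory, one hedged alternative (assuming the third-party mod_limitipconn module is installed; the directory path below is a placeholder) is a per-IP connection cap in the main Apache configuration:

<IfModule mod_limitipconn.c>
<Directory /var/www/example.com/images>
MaxConnPerIP 10
</Directory>
</IfModule>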

If it still persists, I'll just close the image directory by adding this line to the .htaccess of the cache or image directory (depending on the type of image gallery software you are using):

deny from all

So the site stays up, and so do the thumbnails; just the full-size images won't be accessible for a while. All of these are temporary measures, but for now they do the trick for me. Banning the IP is usually enough of a cure, and those bans I always leave in the .htaccess; the other options I normally remove the next day, after the connection storm has passed. Bottom line: if you want to scrape, instead of bombing the server for an hour, set the scraper to download slowly over a couple of hours. It makes a big difference and everyone gets what they want.

ImageBoard Spam List

Since I run several successful anonymous imageboards, one of the ways I prevent spam is with a sort of spam list (a blacklist, really) of all the domains/URLs that cannot be posted on the board (since one of the main reasons to spam the boards is to post links to phishing or illegal sites); this prevents them from being posted by bots altogether.

So it's an essential tool for keeping the board clean. Even if more automated tools like Akismet or Defensio are added later on, this is still a nice, clean, and fast way to keep most spam and idiotic posts off the site. The file is a simple spam.txt in the root of the site, one domain/site per line (see the format example below). This is of course most useful to other imageboard hosts running the same system (Wakaba or Kusaba clone software), so here is our very own custom spam.txt list hehehe, for free, yayy: 1131 domains (I'm actually thinking about making our own system, so anyone can add links and grab the latest version for their board, so their system is always protected heheheh)
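
For illustration, the spam.txt format is just one blocked domain per line; these entries are made-up placeholders, not from the real list:

bad-pharma-pills.example.com
free-casino-money.example.net
totally-legit-watches.example.org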

UPDATE (from tentakle and neechan):

Uhh, since the last post we have added a bunch more, so there you go, an updated spam.txt. This new list has 1372 domains that are/were used in spam posting on imageboards; the old link is up as well hehehe. You can use something like WinMerge or TextOpus to help merge your existing spam file with my awesome one ^^

How to Change Hosts Files

The hosts file is a computer file used to store information on where to find a node on a computer network. It maps hostnames to IP addresses (for example, telling your machine that www.google.com lives at 10.0.0.1). The hosts file is used as a supplement to (or a replacement for) the Domain Name System (DNS) on networks of varying sizes. Unlike DNS, the hosts file is under the control of the local computer's administrator (as in you). The hosts file has no extension and can be edited with most text editors.

The hosts file is loaded into memory (cached) at startup, and Windows checks it before querying any DNS servers, which lets it override addresses from DNS. This can prevent access to listed sites by redirecting any connection attempt back to the local machine (your own, at IP address 127.0.0.1). Another feature of the hosts file is its ability to block other applications from connecting to the Internet, provided an entry for that domain exists.

So you can use a hosts file to block ads, banners, third-party cookies, third-party page counters, web bugs, and even most hijackers. Here are some instructions for doing so, plus some sites with ready-made hosts files (you just overwrite your own hosts file with theirs):

The hosts file location is:
Windows XP at c:\Windows\System32\Drivers\etc
Windows Vista at c:\Windows\System32\Drivers\etc
Windows 2000 at c:\Winnt\System32\Drivers\etc

There you will find a file named hosts (no extension). Like we said above, you can edit it with any text editor, and its function is simple: you map IP addresses to hostnames, so the file will look mostly like this…

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com

If you want to block any domain, just add a new line mapping it to 127.0.0.1, the localhost address (this way, when that domain comes up in the browser, the browser will look for it on your own computer instead of online, because the hosts file told it to), so for example:

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com
127.0.0.1    google.com

So now if I put google.com in the address bar of the browser, it will give me a blank page and google.com won't work anymore. If you want to remove an entry, just delete the line or put a # in front of it:

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com
#127.0.0.1    google.com (google.com will work now)

So the idea is to use the hosts file to block unwanted or bad sites ^-^ clean and easy hehehe
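
One small, hedged note: Windows also caches DNS lookups, so if a hosts file change doesn't seem to take effect right away, you can flush the resolver cache from a command prompt instead of rebooting:

ipconfig /flushdns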

Here are some sites that provide awesome hosts files ^_^ .oO (choose one of them)

Hostman : It's an automated hosts file updating tool
Host File : A pretty cool and clean hosts file
Someone Who Cares : A comprehensive hosts file
MVPS : A hosts file geared towards blocking unwanted stuff