Switch from UFW and fail2ban to CSF

Having played with CSF for a while on one server, I’ve decided I like it more than UFW and fail2ban. It seems much better at blocking mail bruteforce attacks and distributed SSH attacks. So anyway, here’s a list of steps to make the switch, as much for my own record as anything. The server is running Ubuntu 16.04, but these general steps should work anywhere. The server I did this on is also running VestaCP, so there are a couple of extra steps for that.

Read more

Alternative DNS servers in the Philippines

Sometimes the default DNS servers of ISPs in the Philippines don’t work. Sometimes they get overloaded; sometimes they block certain sites and not others due to internal agendas (PLDT blocking Globe sites, for example). So here’s a list of alternatives.

Outside the Philippines

Google DNS

8.8.8.8
8.8.4.4
2001:4860:4860::8888 (IPv6)
2001:4860:4860::8844 (IPv6)

OpenDNS (includes protection against malware)

208.67.222.222
208.67.220.220

Norton DNS

198.153.192.50
198.153.194.50

Inside the Philippines

Globe Telecom

202.52.163.6
202.52.163.7
202.52.162.39 (ns1.globeidc.com.ph)
202.52.162.40 (ns1.globeidc.com.ph)
202.52.167.70
202.52.167.71
203.127.225.10 (g-net.globe.com.ph)
203.127.225.11 (g-net1.globe.com.ph)

PLDT

210.14.4.37
112.206.232.66
119.92.150.133
122.52.136.140
119.92.233.159
58.69.254.3
58.69.254.8

Smart Bro

121.1.3.81
202.57.96.3

Sky Cable

202.78.64.83
202.78.64.84
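To try one of these out, a minimal /etc/resolv.conf sketch, using the Google DNS pair above as the example. Back up your existing file first, and note that on systems where resolv.conf is managed automatically (resolvconf, NetworkManager, etc.) manual edits may be overwritten:

```
nameserver 8.8.8.8
nameserver 8.8.4.4
```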

Viber SQLite Database Extraction / Forensics

You can use this to pull messages out of a Viber Desktop database. I had to do it when re-installing Viber deleted my entire message history (not so friendly), so I pulled the messages out of the database so I could refer to them in a spreadsheet. Maybe there is a way of pumping it back into the new database …

I may end up putting more queries here when I discover them, but below you’ll find the basic one to dump out a summary of messaging. It converts the Unix timestamp dates into readable ones, and tells you which way each message went too. If you have any other such queries, let me know and I’ll include them: there didn’t seem to be much about this around the internet.

How you run the query is up to you. I previously used an SQLite extension for Firefox, but recently found the cross-platform DBeaver database app, and it’s working a treat across SQLite, MySQL and PostgreSQL, so that’s my favourite right now. Just run the query and export to CSV, using an unusual delimiter such as ‘|’ so the message isn’t split into many fields at each comma.
You could also run the query from the command line: copy the query to a file, say viber.sql, and then use the following, suitable for a backup script perhaps.

sqlite3 -header -csv -separator '|' viber.db < viber.sql > output.csv

On a standard install on Linux you’ll find the viber.db under ~/.ViberPC//viber.db
Obviously make a copy first if you’re nervous. Here is the query:

SELECT strftime('%Y-%m-%d %H:%M:%S',Events."TimeStamp",'unixepoch') AS Time, 
 Contact.Name AS Name,
 Contact.ClientName AS AltName,
 Contact.Number As Cellphone,
 CASE Direction WHEN 0 THEN 'rcvd' ELSE 'sent' END AS Direction,
 Messages.Body AS Message
FROM Events 
 INNER JOIN Contact ON Events.ContactID = Contact.ContactID
 INNER JOIN Messages ON Events.EventID = Messages.EventID
ORDER BY Time;

If you have the same query for other versions of Viber and they’re different, let me know.
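In the same spirit, here’s one more: a per-contact sent/received tally. The table and column names are assumed to match the dump query above; the snippet builds a throwaway demo database with just those columns first, so you can see the shape of the output before pointing the second command at a real viber.db.

```shell
# Throwaway database using only the tables/columns from the dump query above
# (assumed schema -- a real viber.db has many more columns than this).
sqlite3 demo.db <<'SQL'
CREATE TABLE Contact (ContactID INTEGER PRIMARY KEY, Name TEXT, ClientName TEXT, Number TEXT);
CREATE TABLE Events (EventID INTEGER PRIMARY KEY, ContactID INTEGER, TimeStamp INTEGER, Direction INTEGER);
CREATE TABLE Messages (EventID INTEGER, Body TEXT);
INSERT INTO Contact VALUES (1, 'Alice', 'alice', '+63123');
INSERT INTO Events VALUES (1, 1, 1500000000, 0), (2, 1, 1500000600, 1);
INSERT INTO Messages VALUES (1, 'hello'), (2, 'hi back');
SQL

# Count messages per contact, split by direction (0 = received, else sent)
sqlite3 -header -csv -separator '|' demo.db <<'SQL'
SELECT Contact.Name AS Name,
 SUM(CASE Direction WHEN 0 THEN 1 ELSE 0 END) AS Rcvd,
 SUM(CASE Direction WHEN 1 THEN 1 ELSE 0 END) AS Sent
FROM Events
 INNER JOIN Contact ON Events.ContactID = Contact.ContactID
 INNER JOIN Messages ON Events.EventID = Messages.EventID
GROUP BY Contact.ContactID;
SQL
```

Swap demo.db for your copy of viber.db once you’ve confirmed the column names match your Viber version.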

Command to find all image files which are not really image files!

Quick one this … so you’ve got a compromised webserver and you want to check the files on it. Many scanning tools will ignore images, but an image might not always be what it seems! Check them all with this command:

find /path/to/dir -regex ".*\.\(jpg\|png\|gif\)" -exec file {} \; | grep -i -v "image data"

If all is good, you won’t get any output. If your server is seriously borked, then you might see things like this …

./wp-content/uploads/2011/01/22.jpg: HTML document, ASCII text

This is a flag that the image is in fact a PHP file. Investigate!

If you get this kind of thing,
./wp-content/uploads/2011/01/221.jpg: Minix filesystem, V2, 46909 zones

it’s probably a bug in an old version of file, so check your OS version, copy the file to a more recent OS and try again.
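To act on any real hits, you can feed the flagged paths back into a move. A sketch under assumptions: the directory names are examples, it plants one fake “image” so there is something to catch, and it expects GNU find/file as on a typical Linux box:

```shell
# Demo setup: a text/PHP payload masquerading as a .jpg (hypothetical paths)
mkdir -p demo-uploads quarantine
printf '<?php system($_GET["c"]); ?>' > demo-uploads/fake.jpg

# Same check as above, but move anything suspicious into ./quarantine
# (note: the cut on ':' assumes no colons in your filenames)
find demo-uploads -regex ".*\.\(jpg\|png\|gif\)" -exec file {} \; \
  | grep -i -v "image data" \
  | cut -d: -f1 \
  | while read -r f; do mv "$f" quarantine/; done
```

Quarantining rather than deleting lets you inspect the payloads later, which is usually what you want during an incident.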

Bash script to clean Bots out of Apache Logs

If you’ve ever spent some time looking at webserver logs, you know how much crap there is in there from crawlers, bots, indexers, and all the bottom feeders of the internet. If you’re looking for a specific problem with the webserver, this stuff can quickly become a nuisance, stopping you from finding the information you need. In addition, it’s often a surprise exactly how much of the traffic your website serves up to these bots.

The script below helps with both these problems. It takes stats of a logfile (Apache, but should also work on nginx), makes a backup, counts the number of lines it removes for each kind of bot, and then repeats the new stats at the end. Copy the following into a file, e.g. log-squish.sh, and run it with the name of the logfile as an argument, e.g. ./log-squish.sh logfile.log
You’ll definitely want to edit the LOCALTRAFFIC bit to fit your needs. You may also want to add bots to the BOTLIST. Run the script once on a sample logfile and then view it to see what bots are left …
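One quick way to do that review is to tally the user agents left in the squished file. This assumes the Apache “combined” log format, where the agent is the sixth double-quote-delimited field; the two sample lines below are just stand-ins for a real .squish file:

```shell
# Two stand-in lines in combined log format (replace with your real .squish file)
cat > access.log.squish <<'EOF'
1.2.3.4 - - [03/Sep/2017:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0"
5.6.7.8 - - [03/Sep/2017:10:00:01 +0000] "GET /feed HTTP/1.1" 200 99 "-" "SomeFeedReader/1.0"
EOF

# Tally the remaining user agents, busiest first
awk -F'"' '{print $6}' access.log.squish | sort | uniq -c | sort -rn | head -20
```

Anything bot-like near the top of that list is a candidate for the BOTLIST below.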

#!/bin/bash

# Pass input file as a command-line argument
if [ -z "$1" ] || [ ! -f "$1" ]; then
    echo "Usage: $0 logfile"
    exit 1
fi
INFILE=$1
OUTFILE=./$1.squish
TMPFILE=./squish.tmp

if [ -f "$TMPFILE" ] ; then 
    rm "$TMPFILE"
fi

# Check before we go ... 
read -p "Will copy $INFILE to $OUTFILE and perform all operations on the file copy. Press ENTER to proceed ..."

cp "$INFILE" "$OUTFILE"

# List of installation-specific patterns to delete from logfiles (this example is
# for WordPress, and also excludes a local IP address). Edit to suit your environment.
LOCALTRAFFIC=" wp-cron.php 10.10.0.2 wp-login.php \/wp-admin\/ "
echo
echo "-------- Removing local traffic ---------"
for TERM in $LOCALTRAFFIC; do
    TERMCOUNT=$( grep -c "$TERM" "$OUTFILE" )
    echo "$TERMCOUNT instances of $TERM removed" >> "$TMPFILE"
    sed -i "/$TERM/d" "$OUTFILE"
done
sort -nr "$TMPFILE"
rm "$TMPFILE"

# List of patterns to delete from logfiles, space separated
BOTLIST="ahrefs Baiduspider bingbot Cliqzbot cs.daum.net DomainCrawler DuckDuckGo Exabot Googlebot linkdexbot magpie-crawler MJ12bot msnbot OpenLinkProfiler.org opensiteexplorer pingdom rogerbot SemrushBot SeznamBot sogou.com\/docs tt-rss Wotbox YandexBot YandexImages ysearch\/slurp BLEXBot Flamingo_SearchEngine okhttp scalaj-http UptimeRobot YisouSpider proximic.com\/info\/spider "
echo
echo "------- Removing Bots ---------"
for TERM in $BOTLIST; do
    TERMCOUNT=$( grep -c "$TERM" "$OUTFILE" )
    echo "$TERMCOUNT instances of $TERM removed" >> "$TMPFILE"
    sed -i "/$TERM/d" "$OUTFILE"
done
sort -nr "$TMPFILE"
rm "$TMPFILE"

echo
echo "======Summary======="

# File stats before
PRELINES=$( wc -l < "$INFILE" )
PRESIZE=$( stat -c %s "$INFILE" )

# File stats after
POSTLINES=$( wc -l < "$OUTFILE" )
POSTSIZE=$( stat -c %s "$OUTFILE" )
PERCENT=$(awk "BEGIN { pc=100*${POSTLINES}/${PRELINES}; i=int(pc); print (pc-i<0.5)?i:i+1 }")

echo "Original file $INFILE is $PRESIZE bytes and contains $PRELINES lines"
echo "Processed file $OUTFILE is $POSTSIZE bytes and contains $POSTLINES lines"
echo "Log reduced to $PERCENT percent of its original line count."
echo "Original file was untouched."

And here is a sample output.

~/temp $ ./log-squish.sh access.log.2017-09-03
Will copy access.log.2017-09-03 to ./access.log.2017-09-03.squish and perform all operations on the file copy. Press ENTER to proceed

-------- Removing local traffic ---------
5536 instances of wp-login.php removed
507 instances of \/wp-admin\/ removed
84 instances of wp-cron.php removed
0 instances of 10.10.0.2 removed

------- Removing Bots ---------
2769 instances of bingbot removed
2342 instances of Googlebot removed
2177 instances of sogou.com\/docs removed
1815 instances of MJ12bot removed
1651 instances of ahrefs removed
1016 instances of opensiteexplorer removed
578 instances of Baiduspider removed
447 instances of Flamingo_SearchEngine removed
357 instances of okhttp removed
295 instances of UptimeRobot removed
122 instances of scalaj-http removed
74 instances of YandexBot removed
60 instances of ysearch\/slurp removed
24 instances of YisouSpider removed
22 instances of magpie-crawler removed
9 instances of linkdexbot removed
7 instances of YandexImages removed
7 instances of SeznamBot removed
5 instances of rogerbot removed
2 instances of tt-rss removed
1 instances of SemrushBot removed
0 instances of Wotbox removed
0 instances of proximic.com\/info\/spider removed
0 instances of pingdom removed
0 instances of OpenLinkProfiler.org removed
0 instances of msnbot removed
0 instances of Exabot removed
0 instances of DuckDuckGo removed
0 instances of DomainCrawler removed
0 instances of cs.daum.net removed
0 instances of Cliqzbot removed
0 instances of BLEXBot removed

======Summary=======
Original file access.log.2017-09-03 is 19395785 bytes and contains 74872 lines
Processed file ./access.log.2017-09-03.squish is 15432796 bytes and contains 54965 lines
Log reduced to 73 percent of its original line count.
Original file was untouched.

So you can see that around a quarter of the traffic on here is crap. And now the log file is much easier to read.