Incident Response for an SEO Spammed Website

Problem Report

An unauthorized actor is adding spam HTML to the site's pages. Each time the files are altered, their timestamps are set back to 2010.

The added code looks like this:

<div style="position:absolute;top:-13201px;">rolex explorer,rolex,u boat,rado,zenith,<a href="http://www.doshopsells.net/rolex-watches-4.html">fake rolex for sale</a>,franck muller,rolex masterpiece,emporio armani,rolex milgauss,cartier,rolex yachtmaster,<a href="http://www.doshopsells.net/">replica watches</a>,a lange sohne,roger dubuis,chopard,breitling</div>
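
The giveaway is the huge negative offset, which hides the text from visitors while leaving it indexable by search engines. As an illustration (the regex and the threshold are my own assumptions, not part of the original investigation), a few lines of Python can flag pages containing such off-screen divs:

```python
import re

# Flag divs positioned far off-screen -- a common SEO-spam hiding trick.
# The 4-digit (>= 1000px) offset threshold is an assumption; tune it.
HIDDEN_DIV = re.compile(
    r'<div[^>]*style="[^"]*(?:top|left)\s*:\s*-(\d{4,})px', re.IGNORECASE)

def find_hidden_spam(html):
    """Return the off-screen pixel offsets of suspicious divs."""
    return [int(m) for m in HIDDEN_DIV.findall(html)]

sample = ('<div style="position:absolute;top:-13201px;">'
          'rolex explorer,rolex</div>')
print(find_hidden_spam(sample))  # [13201]
```

Run across all of a site's pages, a check like this gives a quick spam sweep without manually diffing files.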

Hypothesis

Someone has guessed or stolen FTP credentials and is using FTP to upload modified files.

Root Cause Analysis

A political battle is often fought at this point: a server administrator wants to jump on the first solution that comes to mind without first understanding the problem. Here, that response would be to change the FTP password, replace the modified files, and call the job done.

That has a chance of fixing the problem, but it leaves important issues unresolved.

More careful incident response requires root cause analysis--we need to know precisely what the attacker is doing. That means the system administrator must accept letting the attacks continue for a while.

A hasty action like a simple password change is likely to alert the attacker that they have been detected without actually keeping them out. That may make things worse--the attacker may switch to a less detectable attack.

Setting a Trap

We don't have administrative access to the server. However, we can still put network monitoring in front of it without being obvious about it.

Very Simple Reverse Proxy

I set up an Ubuntu 14.04 server in the cloud, with iptables configured to forward the HTTP and FTP ports to the real server, using these commands:
echo 1 >/proc/sys/net/ipv4/ip_forward

iptables -t nat -A PREROUTING -p tcp --dport 80 -j DNAT --to-destination 1.2.3.4
iptables -t nat -A PREROUTING -p tcp --dport 20:21 -j DNAT --to-destination 1.2.3.4
iptables -t nat -A PREROUTING -p tcp --dport 1025:65535 -j DNAT --to-destination 1.2.3.4
iptables -t nat -A POSTROUTING -p tcp -d 1.2.3.4 --dport 80 -j MASQUERADE
iptables -t nat -A POSTROUTING -p tcp -d 1.2.3.4 --dport 20:21 -j MASQUERADE
iptables -t nat -A POSTROUTING -p tcp -d 1.2.3.4 --dport 1025:65535 -j MASQUERADE

I have replaced the real server's IP address with "1.2.3.4". The high-port range (1025-65535) is forwarded so that passive-mode FTP data connections also pass through the proxy.

Testing the Reverse Proxy

To test it, I set up my laptop to use the proxy with these steps. I've used "2.2.2.2" for my proxy's IP address and "www.example.com" for the hacked website.

Edit /etc/hosts and add this line:

2.2.2.2 www.example.com
Then flush the DNS cache. On Windows,
ipconfig /flushdns
On a Mac,
sudo killall -HUP mDNSResponder
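
Before testing, it's worth confirming what the hosts file actually maps the name to. The helper below is only an illustrative sketch of the lookup the resolver performs; it was not part of the incident work:

```python
def hosts_override(hosts_text, hostname):
    """Return the IP a hosts file maps hostname to, or None."""
    for line in hosts_text.splitlines():
        line = line.split('#', 1)[0]      # strip comments
        fields = line.split()
        if len(fields) >= 2 and hostname in fields[1:]:
            return fields[0]
    return None

sample = "127.0.0.1 localhost\n2.2.2.2 www.example.com  # proxy\n"
print(hosts_override(sample, "www.example.com"))  # 2.2.2.2
```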

Ping Test

I executed this command:
ping www.example.com
I got replies, confirming that www.example.com now resolved to the proxy's address.

Viewing Website

I opened the Web page and tested it--everything worked. (I have redacted details that would identify the site.)

FTP Test

I connected to the FTP service to test it and reached the login banner.

Vulnerable FTP Server

Wow, that is a really old FTP server, with serious known vulnerabilities!
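
For reference, a banner like this can be grabbed and inspected programmatically. This is a sketch only: the banner string shown is hypothetical, since the real one is redacted here, and the version parser is my own illustration:

```python
import re
import socket

def grab_ftp_banner(host, port=21, timeout=5):
    """Read the FTP greeting line (the '220 ...' banner)."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        return s.recv(1024).decode('ascii', 'replace').strip()

def banner_version(banner):
    """Pull a dotted version number out of a banner string, if any."""
    m = re.search(r'(\d+(?:\.\d+)+)', banner)
    return m.group(1) if m else None

# Hypothetical banner -- the real server's banner is redacted above.
print(banner_version("220 ExampleFTPd 1.3.1 Server ready"))  # 1.3.1
```

Comparing the extracted version against the vendor's advisories is what tells you whether known exploits apply.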

Monitoring Network Traffic

I used tcpdump, with this command:
tcpdump -pn -C 100 -W 100 -w capture &
This will save packet captures in a series of 100 files, each limited to 100 MB in size, for a total size of 10 GB. My virtual server has a 20 GB hard disk that's only 11% full, so that should be fine.
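
The disk-budget arithmetic is easy to double-check (numbers taken from above; the 11% figure is the observed disk usage):

```python
# tcpdump ring buffer: -W 100 files, -C 100 (millions of bytes each)
capture_gb = 100 * 100 / 1000              # at most 10.0 GB of captures
free_gb = round(20 * (1 - 0.11), 1)        # ~17.8 GB free on a 20 GB disk
print(capture_gb, free_gb, capture_gb < free_gb)  # 10.0 17.8 True
```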

Monitoring File Integrity

I used this Python script to fetch several files and output their hash values with timestamps:
#!/usr/bin/python

import urllib2, hashlib, time

pages = ["http://www.example.com/index.html",
         "http://www.example.com/aboutus.html",
         "http://www.example.com/registration.html",
         "http://www.example.com/schedule.html",
         "http://www.example.com/past.html",
         "http://www.example.com/tournaments.html",
         "http://www.example.com/rules.html",
         "http://www.example.com/gymdir.html"]

summary = ''
for page in pages:
    response = urllib2.urlopen(page)
    html = response.read()
    h = hashlib.new('md5', html).hexdigest()
    print time.strftime("%c") + '\t' + page + '\t' + h
    summary += h

print "Summary: ", time.strftime("%c"), hashlib.new('md5', summary).hexdigest()

I saved that script as /root/tripwire.py and made it executable with chmod +x /root/tripwire.py

I executed this command to set up the cron job:

crontab -e
And added this line at the end:
*/1 * * * * /root/tripwire.py >> /root/triplog
The result is a file like this:
Wed Dec  2 16:15:01 2015	http://www.example.com/index.html	5ad98230901d54238308e97f989400f7
Wed Dec  2 16:15:01 2015	http://www.example.com/aboutus.html	366c4556a8225d7a86a08941af21fcf8
Wed Dec  2 16:15:01 2015	http://www.example.com/registration.html	8c996f780191a18382ee073e0591085c
Wed Dec  2 16:15:02 2015	http://www.example.com/schedule.html	a15136f01ed132e7edc502d6d48c947a
Wed Dec  2 16:15:02 2015	http://www.example.com/past.html	648e465998e07d738d93c43e9f83460f
Wed Dec  2 16:15:02 2015	http://www.example.com/tournaments.html	189e706a2d31430d003ce8ef8537a403
Wed Dec  2 16:15:02 2015	http://www.example.com/rules.html	26e7a3d8a549804ca80e5956290cd044
Wed Dec  2 16:15:02 2015	http://www.example.com/gymdir.html	588e0e7f5275086ff3e72329e68ef097
Summary:  Wed Dec  2 16:15:02 2015 46f002f88670ddf053608076e66dac70
Wed Dec  2 16:16:01 2015	http://www.example.com/index.html	5ad98230901d54238308e97f989400f7
Wed Dec  2 16:16:01 2015	http://www.example.com/aboutus.html	366c4556a8225d7a86a08941af21fcf8
Wed Dec  2 16:16:02 2015	http://www.example.com/registration.html	8c996f780191a18382ee073e0591085c
Wed Dec  2 16:16:02 2015	http://www.example.com/schedule.html	a15136f01ed132e7edc502d6d48c947a
Wed Dec  2 16:16:02 2015	http://www.example.com/past.html	648e465998e07d738d93c43e9f83460f
Wed Dec  2 16:16:02 2015	http://www.example.com/tournaments.html	189e706a2d31430d003ce8ef8537a403
Wed Dec  2 16:16:02 2015	http://www.example.com/rules.html	26e7a3d8a549804ca80e5956290cd044
Wed Dec  2 16:16:02 2015	http://www.example.com/gymdir.html	588e0e7f5275086ff3e72329e68ef097
Summary:  Wed Dec  2 16:16:02 2015 46f002f88670ddf053608076e66dac70
Grepping that file for "Summary" will make it easy to spot a change if the attacker returns.
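
To go one step further than grep, a short script can walk the log and report only the moments when the summary hash changes. This is an illustrative sketch, not part of the original monitoring (the changed hash in the sample is made up):

```python
def summary_changes(log_text):
    """Return (timestamp, old_hash, new_hash) tuples for each change."""
    prev = None
    changes = []
    for line in log_text.splitlines():
        if not line.startswith("Summary:"):
            continue
        parts = line.split()
        stamp, digest = " ".join(parts[1:-1]), parts[-1]
        if prev is not None and digest != prev:
            changes.append((stamp, prev, digest))
        prev = digest
    return changes

log = ("Summary:  Wed Dec  2 16:15:02 2015 46f002f88670ddf053608076e66dac70\n"
       "Summary:  Wed Dec  2 16:16:02 2015 46f002f88670ddf053608076e66dac70\n"
       "Summary:  Wed Dec  2 16:17:02 2015 deadbeefdeadbeefdeadbeefdeadbeef\n")
print(summary_changes(log))
```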

Recommendation

At this point, I contacted the administrator of the website with my recommendations. Hopefully these will leave the attacker kicked out and locked out, unable to regain control of the server at their current skill level.

References

tcpdump(8) - Linux man page

How to redirect traffic to another machine in Linux


Last modified: 12-2-15 2 pm