Two weeks ago we transferred our business site to a new host with a downtime of less than 15 minutes. The transfer itself, of course, took longer than that. Many people asked how we pulled this off. Read on to find out!

Preparing the "ductwork" – the things you don't see as a visitor

The most important aspect of moving your site to a new host is DNS, the system that resolves domain names to IP addresses. All hosts offer their own DNS servers, but changes to them are usually very slow to propagate: once you change your DNS server it can take several minutes, or even days, until your domain name resolves to the new IP everywhere. That means massive downtime and lost business. For this reason we chose to transfer our DNS to Amazon Route 53; other good candidates are DynDNS, DNS Made Easy and so on. We switched our DNS three days before the big move to make sure that every ISP and caching DNS server would be aware of the change.
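What makes a managed DNS service like Route 53 fast is that you control the TTL on your records: a short TTL means resolvers re-check often, so a later IP switch takes effect in seconds rather than days. A minimal sketch with the AWS CLI (the hosted zone ID, domain and IP below are placeholders, not our real ones):

```shell
# Upsert an A record with a 60-second TTL so the cut-over propagates fast
# (hosted zone ID, domain and IP are hypothetical)
cat > change-batch.json <<'EOF'
{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "example.com.",
      "Type": "A",
      "TTL": 60,
      "ResourceRecords": [{ "Value": "203.0.113.10" }]
    }
  }]
}
EOF
# aws route53 change-resource-record-sets \
#   --hosted-zone-id Z0000000EXAMPLE --change-batch file://change-batch.json
```

The same change batch format works on moving day, when you point the record at the new server's IP.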

Another thing we had to do before the move was transfer the SSL certificate. That was the easiest part of the process: we simply asked the old host to send us the private key and forwarded it, along with the certificate itself and the intermediate certificates (the latter two provided by our SSL certificate vendor), to the new host.
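Before handing the files over, it's worth checking that the private key actually matches the certificate; a mismatched pair is the classic cause of SSL setup failures on a new host. One way is to compare the modulus digests with openssl (the demo below generates a throwaway pair; substitute your real files):

```shell
# Generate a throwaway key/certificate pair for the demo
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
  -keyout private.key -out certificate.crt 2>/dev/null
# The two digests must be identical for the key and certificate to match
openssl x509 -noout -modulus -in certificate.crt | openssl md5
openssl rsa  -noout -modulus -in private.key     | openssl md5
```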

A big site like ours doesn't run on goodwill and pixie dust. There are a ton of cron jobs which take care of all the regular maintenance we have to carry out. Before moving the site we made sure to transfer the cron jobs to the new server. This is a manual process; even if there were an automatic way to transfer cron jobs between hosts, the paths differ between servers, so manual intervention would be necessary anyway.
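The manual transfer boils down to dumping the crontab on the old server, rewriting the paths, and installing the result on the new one. A sketch, assuming typical Linux accounts (the account paths are made up):

```shell
# On the OLD server: dump the current crontab to a file
crontab -l > cronjobs.txt 2>/dev/null || true
# Rewrite the old account's paths to the new account's paths (hypothetical)
sed -i 's#/home/olduser/#/home/newuser/#g' cronjobs.txt
# On the NEW server, after copying cronjobs.txt over:
# crontab cronjobs.txt
```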

The other oft-forgotten part of a site transfer is email accounts. Since I know better than to trust a generic web/database/mail/DNS server box to store our business email, all of our email accounts were actually set up as forwarders. When you send an email to my address, the server forwards it to my Google Apps account. Therefore transferring the email accounts was as easy as... manually creating the forwarder addresses on the new server. If you are using regular email inboxes on your site, it's best to use POP3 to download the messages locally before the move. Better yet, wise up and use a third-party email service, with forwarders or custom MX records, to store your business email outside of your web server. It's not prudent to put all your eggs in one basket.

Transferring the site

The next step was performing a full site backup on the old host. This was a massive 7 GB backup taken with Akeeba Backup Professional, split into 500 MB parts and uploaded to Amazon S3. Once done, I used Akeeba Kickstart Professional on the new server to import the backup archive parts and restore the site. At this point four hours had passed since the start of the full site backup, so the live site (on the old host) and the restored site (on the new host) had diverged. We needed to fix that before going live with the new host.
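Akeeba Backup handles the splitting and the S3 upload natively; for the curious, the generic shell equivalent of splitting an archive and verifying the parts reassemble cleanly looks like this (the archive here is a random stand-in, and the bucket name is made up):

```shell
# Create a stand-in archive for the demo (replace with your real backup)
dd if=/dev/urandom of=site-backup.tar.gz bs=1M count=4 2>/dev/null
# Split into parts (we used 500 MB parts; 1 MB here for the demo)
split -b 1M site-backup.tar.gz site-backup.part.
# Verify the parts reassemble into a byte-identical archive
cat site-backup.part.* > reassembled.tar.gz
cmp -s site-backup.tar.gz reassembled.tar.gz && echo "parts reassemble cleanly"
# aws s3 cp site-backup.part.aa s3://my-backup-bucket/  # then upload each part
```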

First, we set the old site off-line with a message saying we would be back soon. This is the start of the site's downtime. Now we have to move fast!

I took a full site backup excluding the large stats tables with not-so-valuable data (I can live with four hours' worth of lost download logs and statistics about tickets not opened because the auto-reply solved the issue for the user) and uploaded it to S3 using Akeeba Backup Professional. On the new server I used Akeeba Kickstart Professional to restore this partial backup. The restoration process keeps existing content, so restoring the partial backup on top of the already restored full site simply brings the site up to date with the latest changes. Now we're nearly 14 minutes into the process.
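If your backup tool can't exclude tables, the same trick works with a plain database dump; with MySQL, mysqldump's --ignore-table flag skips the heavy tables (the database and table names below are hypothetical):

```shell
# Dump everything except the big, expendable stats tables
# (database and table names are hypothetical)
mysqldump sitedb \
  --ignore-table=sitedb.download_log \
  --ignore-table=sitedb.ticket_stats \
  2>/dev/null | gzip > partial-dump.sql.gz
```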

Going live

The next step is quite easy. Log in to the back-end of the restored site and set it on-line, then log in to Amazon Route 53 and switch the IP address to point to the new server. In less than 10 seconds the IP change had propagated across the Internet (we checked with four different Internet connections in three countries). The site is fully transferred and the downtime clock has stopped: 15 minutes 35 seconds. Not bad, huh?
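You can run the same propagation check yourself by querying a few public resolvers directly; once they all return the new IP, the cut-over is effectively complete (the domain below is a placeholder):

```shell
# Ask several public resolvers for the A record; after the switch they
# should all return the new server's IP
for ns in 8.8.8.8 1.1.1.1 9.9.9.9; do
  echo "resolver $ns:"
  dig +short A example.com @"$ns"
done
```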