
Offsite Backup

For the ultimate in protection for your data files, consider the benefits of online or remote backup. With an online backup, your data is automatically transferred to a geographically separate location, one that puts it safely out of reach of any disaster at home.

Remote / Online / Offsite Backup

You're proud of yourself for having backups of all your data... The backups might be on DVD, other hard drives or even a redundant RAID server. Now consider a scenario where your home suffers a disaster (fire, flood, etc.) or theft -- are your backups truly safe? While some may dismiss these events as unlikely, the potential effect of losing your data could be devastating.

Enter the ultimate backup strategy: remote backup.

Benefits of Offsite Storage

There have been countless examples of people's homes being damaged in a fire, with the flames easily consuming any careful backups that were kept there. For many, the loss of their entire photo collection and documents is far more troubling than the process of replacing possessions through an insurance company. An even more likely scenario: theft. In a break-in, computer equipment and external drives are easy targets. If your backups are located anywhere near your computer, your risk of complete loss is high.

An offsite backup keeps these backups in a completely different location, one from which your data can easily be recovered should you have an incident at home.

Do-it-yourself Automatic Offsite Backup

Offsite backup can be accomplished in one of two ways: the manual method and the automatic method. With the manual method, you burn your backup jobs to DVD or copy them to an external hard drive and walk these to some other location, such as your office, for storage. There's one problem: laziness! I am a firm believer that you shouldn't rely on yourself for a backup strategy -- over time you will get lazy and fail to do it as often as you should. It's also a hassle.

Instead, this page will examine what it takes to implement an automatic offsite backup methodology.

For online backup, you have two choices:

  • Pay a company lots of money each month to provide remote storage to you, or
  • Provide your own!

Most people will be shocked to see just how expensive the online backup services are. If you expect to back up more than a few gigabytes, be prepared to spend a lot of money each month. For a 30 GB backup, you can easily spend $50 or more per month. Obviously, there are inherent advantages to paying someone else to host your backups (they'll take the blame if something goes wrong!)...

But if you take a few precautions, you too can create your own offsite backup for free. Obviously, you won't have a team of support staff paid to watch the blinking LEDs 24/7, or a SWAT team protecting the premises, so you'll have to weigh the benefits carefully against your particular needs.
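To put those monthly fees in perspective, here's a quick back-of-the-envelope comparison in Python, using the example figures above (illustrative numbers only; real prices vary):

```python
# Rough cost comparison: hosted online backup vs. a DIY NAS,
# using the article's example figures (~$50/month for 30 GB hosted,
# and a NAS at roughly $1 per gigabyte, paid once).
backup_gb = 30
hosted_monthly = 50.0                 # $/month for ~30 GB of hosted backup
nas_cost_per_gb = 1.0                 # one-time $/GB for your own NAS

hosted_yearly = hosted_monthly * 12   # recurring, every year
diy_one_time = backup_gb * nas_cost_per_gb

print(f"Hosted: ${hosted_yearly:.0f}/year, DIY NAS: ${diy_one_time:.0f} one-time")
```

Of course, the DIY figure ignores your time and your friend's bandwidth, so treat it as a lower bound.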

Preferring to do things myself, I set out to create my own offsite backup setup.


What You Need

Thankfully, creating an automatic offsite backup strategy is extremely easy and affordable. Most people already have all they need, with the exception of a Network Attached Storage (NAS) drive.

  • Backup Software
    You will need to use a suitable backup program for remote operation. There are many available, and very soon an article will be posted here describing a search for the best backup software.
  • Network Attached Storage (NAS) drive
    This must be a real NAS box, with FTP support. Don't get fooled by the cheaper Network Direct Attached Storage (NDAS) devices such as the Ximeta drives. You can easily find a NAS for less than $1 / gigabyte. In my case, 250GB was sufficient. I selected a Buffalo Linkstation Pro 250GB NAS, which will be the subject of another article soon.
  • Router
    Most routers will be suitable for this purpose. However, some models feature options that will make your setup even more secure: VPN tunnels and the ability to remap IP address ports.
  • Free Internet Access!
    You'll be reliant on a friend with an always-on internet connection; unless they're nocturnal, they won't be using it in the dead of night.

Where should you put your NAS?

In order to house the equipment listed above, you'll need to find a willing partner... someone who has broadband internet that is always connected (e.g. cable modem or DSL), and whom you can trust. They shouldn't be the stingy type that has a problem with you using their internet connection when they're not on the computer. All this person has to do is allow you to connect your NAS box to their router and make some very slight modifications to the router settings.

Ideally, they aren't using their PC for heavy data transfers at night (if they are, you'll have to configure your backup software for bandwidth throttling to be fair). It will also be much easier if they aren't already running their own FTP server behind their firewall / router, exposed to the internet. If they are, you'll need to reprogram the router to map your NAS box's FTP ports to something other than port 21 (using UPnP Forwarding or Port Forwarding).

Looking for a suitable place? Start with your friends and family. It is highly likely that you will find someone who is willing to let you place your NAS box at their site. You can also consider a trade: they provide you with a location and some of their night-time internet bandwidth, while you provide them the same in return!

All that said, you should avoid selecting your own home as the "suitable" location! Doing so will eliminate one of the benefits of offsite storage: protection against location disasters such as fire, flooding, theft, etc.

Practical Expectations for Offsite Backup

For standard residential cable and DSL internet access, upload bandwidth is often capped to data rates much lower than your download bandwidth. Remember that data rates are generally measured in kilobits per second, not kilobytes per second. Divide kbps (kilobits per second) by 8 to get KB/s (kilobytes per second).

Typical cable upload bandwidth = 128 kbps, 256 kbps or as high as 1000 kbps (if uncapped)
Typical DSL upload bandwidth = 64 kbps to 1500 kbps

In my environment, I am seeing roughly 360-600 kbps upload bandwidth from cable access. My download bandwidth is 600-1600 kbps.

Given the limited upload bandwidth, be realistic about what you hope to achieve with offsite / remote backup. Assume that the backup process is automated for night-time operation (outside of business hours) and that the backup should be completed by the next morning (adjust for your own data rate and time requirements):

10 PM - 7AM = 9 hours = 32400 seconds @ 360 kbps = 1423 MB = 1.4 GB / night

This might be practical for a daily incremental backup, but it isn't suitable for full backups, which may be on the order of tens or even a hundred gigabytes. Also take into account that verification passes will probably require a download from your NAS, doubling the aggregate bandwidth required.
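The overnight calculation above is easy to generalize; here's a small Python sketch you can adjust for your own window and upload rate:

```python
# How much data fits in one overnight backup window?
def nightly_capacity_mb(hours: float, upload_kbps: float) -> float:
    """Approximate megabytes transferable in `hours` at `upload_kbps`."""
    seconds = hours * 3600
    kilobytes_per_sec = upload_kbps / 8    # 8 bits per byte
    return seconds * kilobytes_per_sec / 1024

# The example above: 10 PM - 7 AM (9 hours) at 360 kbps upload
print(f"{nightly_capacity_mb(9, 360):.0f} MB per night")   # → about 1424 MB
```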

Full Local and Incremental Remote

In light of the upload bandwidth limitations, I run my first backup (the full backup) to the storage device (NAS) while it is directly connected to my local LAN (or directly to my PC). Then I move the NAS offsite and do all incremental backups remotely over the internet using FTP. This way I can transfer the 90 GB of initial backup data onto the device in a matter of hours, yet still have the benefit of offsite storage after this initial transfer.

With a direct connection (Linkstation Pro via a Gigabit Ethernet NIC), mounted as a local network drive, I get approximately 10 GB / hour with compression and AES encryption enabled. Note that to connect the Linkstation directly to my PC, I had to set my local ethernet card's static IP address to the same subnet as the Linkstation (e.g. 192.168.1.xx).
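To see why the full backup is worth doing locally, compare the two transfer rates (a sketch using my measured numbers; yours will differ):

```python
# Local LAN backup throughput vs. remote upload throughput.
local_gb_per_hour = 10                                     # measured locally
local_kb_per_sec = local_gb_per_hour * 1024 * 1024 / 3600  # ~2913 KB/s

remote_kbps = 360                                          # capped upload rate
remote_kb_per_sec = remote_kbps / 8                        # = 45 KB/s

ratio = local_kb_per_sec / remote_kb_per_sec
print(f"Local transfer is roughly {ratio:.0f}x faster than remote upload")
```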

Digital Photos Import

Even with an incremental-only remote setup, I have to be aware of realistic daily storage limits when importing a day's worth of digital photos. It's not uncommon for me to shoot more than a gigabyte of digital photos in a single day. Because of this, I strongly recommend performing some of the initial filtering / discarding in a non-backup folder hierarchy prior to the automated backup process.


Security Concerns

Unfortunately, the very nature of offsite backup lends itself to security concerns. There are three areas to be concerned about:

  • Offsite server location
    Since your files are stored on a server in some other location, be aware that it's possible for the remote location to suffer the same problems your home might: theft, fire, damage, etc. If you rely on a commercial online backup facility to host your backup server, security is likely far in excess of what you could provide in a home environment, with extra monitoring and redundant copies of your data. These additional services are the main value-add of a commercial online backup site.
  • Offsite server hacks
    You have to expect that hackers are continually probing for open ports on the internet, attempting to find a poorly configured / protected server. If you are using a commercial online backup site, then there is an expectation that the system has been audited carefully for potential vulnerabilities.
  • Data transfer
    In most cases the data transfer from your home to the offsite location will use an insecure IP protocol across a variety of ISPs (Internet Service Providers). It may be helpful to perform a traceroute to identify the typical path your transfers take. The more hops in your route, the greater the chance of traversing an unmanaged switch (which could potentially allow someone to sniff your backup transfer packets). As I encrypt all of my backup jobs, I have no concerns about anyone capturing them -- they can't decrypt them.

You must Encrypt your Data!

In light of the above, a home-brew online backup mechanism will be inherently less secure than one you pay someone else to do for you. Therefore, you should expect that hacks will happen, and that it is possible (although highly unlikely) that someone could download your backup jobs.

Since I have encrypted all of my backup data with industry-standard AES encryption, these data files will be virtually useless to anyone. Note that you must rely on encryption, not just password protection.

As an example, I examined the backup data from one of my password-protected (not encrypted) backup utilities, and it turned out that the data was stored in the backup file as plain text! A simple flag and a 256-bit hash of my password were all that differentiated my password-protected job from a non-protected job! It would take less than a minute to remove the password protection from any backup job created by this program!

When encrypting your data, select a strong passphrase (a mix of letters and numbers) long enough to match the key strength. For example, to get the most out of 128-bit encryption, provide a passphrase of 16 characters or more (assuming roughly 8 bits of entropy per character).
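The passphrase-length rule of thumb works out as follows (a sketch; it assumes, as above, roughly 8 bits of entropy per character, which is generous for human-chosen passwords):

```python
# Minimum passphrase length to cover a target key strength,
# assuming a given number of entropy bits per character.
def min_passphrase_length(key_bits: int, bits_per_char: int = 8) -> int:
    return -(-key_bits // bits_per_char)   # ceiling division

print(min_passphrase_length(128))   # → 16 characters for a 128-bit key
print(min_passphrase_length(256))   # → 32 characters for a 256-bit key
```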

Other Concerns

Because a hacker may not be able to do anything useful with your backup data, they may look for other nefarious things to do. It is possible that a remote hacker could use your FTP server for their own purposes. Therefore, it may be worth monitoring your server logs occasionally just to ensure that other users are not taking advantage of your storage device.
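A log check like this is easy to script. The sketch below is hypothetical: the log format and allow-list are made up, so adapt the parsing to whatever your NAS's FTP daemon actually writes:

```python
# Sketch: flag FTP log entries from client IPs outside an allow-list.
# The log format below is hypothetical; adjust the parsing for your NAS.
ALLOWED = {"203.0.113.7"}   # example: your own WAN address(es)

def suspicious_clients(log_lines, allowed=ALLOWED):
    """Return client IPs seen in the log that are not on the allow-list."""
    seen = set()
    for line in log_lines:
        parts = line.split()
        # assume the client IP is the first whitespace-separated field
        if parts and parts[0].count(".") == 3 and parts[0] not in allowed:
            seen.add(parts[0])
    return seen

sample = [
    "203.0.113.7 LOGIN OK user=backup",
    "198.51.100.9 LOGIN FAIL user=admin",
]
print(suspicious_clients(sample))   # → {'198.51.100.9'}
```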

Many backup utilities and NAS devices can automatically email you in the case of problems. If you are going to use such a mechanism, then it is worth enabling SSL or other protection for your SMTP (mail server) access, if possible.

Secure FTP

FTP data transfer is not encrypted. Some NAS devices support SFTP (SSH File Transfer Protocol), which encrypts the transfer between your computer's backup software and the remote storage device. This is an excellent feature if your setup provides it. While most backup software supports SFTP, many affordable NAS devices don't: the processing power needed to support SFTP in the NAS box makes it a less common feature.

Setting Up Your Remote System

Once you have performed your full backup locally (an upcoming article will show a comparison of backup software), you can transport your drive to the new location and configure it for online use:

  • Connect the NAS to the Router
    Simply plug your NAS box into one of the router's ethernet LAN ports
  • Assign a static IP address to the NAS
    In your NAS device's configuration options, give it a static local IP address on the router's subnet (e.g. 192.168.1.xx)
  • Add FTP port 21

    Connect to your router's administration page. Open the Port Forwarding options and map IP port 21 to your NAS device's IP address. With my Linksys router, this option is found under Applications & Gaming -> Port Range Forward. Set the start and end port to 21, the protocol to TCP, and the IP address to the static address you assigned above. Make sure the Enable checkbox is set and click Apply Changes. If the router already has a computer with an active FTP port, you may have to use the UPnP Forwarding options to remap the port number.

Dynamic IP Addresses

One potential problem you may encounter when trying to put your FTP server in a friend's house is that their internet service provider may only grant them a dynamic IP address. Ideally, the router you are connecting your NAS to would have a static IP address (one that doesn't change). Static IP addresses cost extra, so most people have no guarantee that their router's IP address will always stay the same.

What does this mean to you?

In order to connect to your NAS box behind your friend's router, you will need to type their router's WAN IP address into your backup software's FTP settings. Any time their ISP (Internet Service Provider) decides to change the IP address, you will have to determine the new address and update your backup software settings.

Fortunately, it has been my experience that dynamic IP addresses rarely change (perhaps once or twice per year), so this is not much of a concern with my local ISP.

If your ISP changes the dynamic IP address more regularly, then you may want to consider using the free DynDNS (dynamic DNS) service. Otherwise, you may need to ask your friend to read out the IP address shown when they visit a what-is-my-IP web page.

How to Workaround Dynamic IP Addresses Changing

Thankfully the setup I have been using rarely has its dynamic IP address changed. However, if it did end up changing frequently, I was intending to implement one of the following automated workarounds:

  • Use a free dynamic DNS service that gives you a "permanent" named address for your site, in addition to an Update Client (responsible for tracking changes to your IP address). This is the method I am currently using, with great success; it is by far the best approach.
  • Modify the NAS box to ping a page on your webserver periodically. Accessing a .php page via an HTTP GET request allows the PHP script to extract the caller's IP address from the $_SERVER['REMOTE_ADDR'] variable. The script then saves the address somewhere I can use to update the backup software/script. As I decided against hacking the Linux install on my NAS box, I ruled out this method.
  • Install a "call-home" script on another computer at the remote location, which again accesses a web page on my web server. This can be done very easily with a typical batch job: simply execute get <Website URL>, where <Website URL> is the location of your web server and script. Again, have the PHP script record the IP address (or email it to you) and you're done!

Of course, if you don't have your own webserver, you can instead take advantage of a scraping-friendly what-is-my-IP page: simply issue a get from the remote computer and the public IP address of the remote network is returned as plain text.
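If you'd rather see the moving parts, here's a minimal call-home client sketched in Python. The URL is a placeholder for your own IP-echo page (not a real service), and the state file simply remembers the last address seen:

```python
import urllib.request
from pathlib import Path

IP_URL = "http://example.com/ip.php"   # placeholder: your own IP-echo script
STATE_FILE = Path("last_wan_ip.txt")

def fetch_wan_ip(url=IP_URL):
    """Ask a what-is-my-IP page for this network's public address."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode().strip()

def ip_changed(current_ip, state_file=STATE_FILE):
    """Record the current IP and report whether it differs from last time."""
    previous = state_file.read_text().strip() if state_file.exists() else None
    state_file.write_text(current_ip)
    return previous is not None and previous != current_ip

# Run nightly from cron / Task Scheduler at the remote site:
# if ip_changed(fetch_wan_ip()):
#     print("WAN IP changed -- update your backup software's destination")
```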

Online Backup Services

If something I have said above scared you off from hosting your own do-it-yourself offsite backup facility, then you may want to consider handing over your cash to an online backup service provider. These services give you peace of mind in exchange for a monthly bill.


Advantages:

  • Guarantees in uptime
  • Redundant protection
  • Some providers have varying levels of rollback capability
  • Security


Disadvantages:

  • Extremely expensive
  • No benefit of onsite full backup (if doing a large backup)



Reader's Comments:

Please leave your comments or suggestions below!
 I am considering the addition of cloud backup to my digital images backup strategy now that prices have been significantly reduced (see recent Google offer).
Has anyone implemented this already?
I would love to see comments and suggestions.
 This info is really helpful. I wanted to know: where can I find a step-by-step doc to set all this up?

 Great article.

Just a question - putting your NAS at a friend's house as a solution to theft... doesn't he face the same risk of theft?

Wouldn't you be interested in writing about a little twist to your solution: buy a (2-bay) NAS, and have a friend buy a similar one. Then keep your data on ONE disk in your own NAS, and the mirror disk in your friend's NAS - and vice versa.
Would that be completely resistant to flood, theft, disk crash?
 Interesting question... while it is true that the NAS at a friend's house could get stolen, this is just a backup. Therefore, my friend will let me know it's gone and I'll simply buy another drive again and copy the content from my original drive directly -- no data loss. As a result, I don't think there is as much value in splitting the NAS between the two sites.
 Make sure your FTP password isn't used for anything else, at all. Sending a password over FTP almost defeats the purpose of having a password. On the other hand, you don't want the password to be something that a rogue script is going to guess and turn your NAS box into a large drop-box used by some Eastern European criminal gang to store, trade, and sell their cache of ill-gotten meatloaf recipes.

Several people have mentioned compressed data transfer. This is presumably using zlib deflate (LZ77 with Huffman coding) or LZW compression. Neither one will do much at all (and will actually likely increase the size of the data transfer by a fraction of a percent) if you're entirely sending jpegs, mp3s, docx documents, xlsx spreadsheets, ODF documents, or pre-compressed zip archives. If you're sending uncompressed computer programs, you'll see a lot of benefit, but hopefully your backup software compresses its archives. Using compressed transfer will likely just use up a lot of CPU time and electricity on the sending side, but very little on the receiving side.

It's possible that rsync has a mode of operation where it doesn't run any software on the destination server. However, the big advantage of rsync is that in normal operation, it runs rsync on the destination machine to checksum blocks of files and send across only those blocks that have changed, rather than sending the entire file if the file has changed. Without a shell account, running rsync daemon, or other way to checksum the file without copying everything back, you aren't going to see the bandwidth savings.

Even over a VPN, I'd have rsync use an ssh tunnel. Even with the overhead of double encryption (VPN and ssh), both computers should be able to pump data faster than your connection can handle.

Password protection on zip files is probably pretty poor. The old scheme from PKZip is weak. There's a newer protection scheme based on AES, but unless the software vendor mentions it's the newer AES scheme, their protection is probably the weak PKZip scheme in order to have the widest compatibility.

 As far as encrypting the backups themselves, using SFTP and not encrypting the data actually on the drives would be ideal. I'm a crypto geek. I took Rivest's security class. My old roommate founded Penango, and I may have introduced him to encrypted email. I like encryption, and yet the risk is just too high of forgetting the password or of your decryption program not running on any available version of Windows (in the future, when you're recovering). That being said, if FTP is your only option, I'd use GnuPG, a free open-source encryption program. GnuPG will allow you to password protect your files, and also allow you to use public key encryption. You create a key pair, your wife creates a key pair, and you come up with a good password to password protect the archives. Any one of the three methods can then be used to decrypt the archives. Write down your password on a piece of paper and store it in a safe place. If you're worried about industrial spies or rogue employees, writing down your password is a terrible idea, but if your main concern is random strangers on the Internet who have absolutely no interest in invading your home, writing down your password for your backups is a great idea. As an added measure, write down all but the last 3 characters of your password and tape that piece of paper to the bottom of the NAS box. It'll take a computer very little time to grind through the half-million or so combinations for the last three characters if you forget, but leaving off the last 3 characters will help your friend keep his/her curiosity at bay.
 Excellent article. Would like to add a couple of tips.
About the dynamic IP: I have a Linkstation Live and have registered with a free dynamic DNS service; the Linkstation takes care of updating the dynamic DNS by itself.
I also use Cobian Backup, which is free and does volume shadow copies (open files).
It can compress the backed-up files into zips, with password protection if needed, and it supports uploading via FTP, plus scheduling.
It can also run as a service, so there's no need to be logged in.
Wish it had rsync too...
I've been using different versions for years and it has never let me down.
 Great input... thanks for sharing, Angelo!
 Thank you to the Author (Calvin?) for this article, and the posters too. Actually I just wanted to remark, this is one of the better diy articles I have run across yet (here in Jan 2010). We need more like this.
To the author... good idea to always post the Author and Date of the Article at the top for stumblers (so we have a clue coming out of the gate ;^).

Thanks again.
 Hey, thanks! Yes, I wrote this article and have been using this approach for a couple years with great success. - Calvin
 I have a webhost with about 10 GB of unused space, to which I have FTP access (I use FileZilla). Seems like I should be able to rsync to that, right? Any idea how I'd do that?

ps - I'm quite the sophomore (wise-fool) ... I know enough to be dangerous, and do a lot of things that seem high-tech, but really I'm a dunce when it comes to code or scripting languages of any kind so please speak slowly.
 You should be able to do this, provided you are able to run rsync on both your webserver and your home computer. I have not tried using rsync, but may consider giving it a try.

Hopefully others who have used rsync can provide their input. In the meantime, have a look at the rsync tutorial page.
 I ran across this site and was wondering what you think of it.
 Seems like a very reasonable alternative if one isn't afraid to put some time into the setup. I anticipate that the casual computer user might be scared off by the underlying Linux OS, but in most cases this is only really apparent during the install phase. The FreeNAS feature set looks decent and is frequently updated -- so that's a definite advantage. Thanks for sharing this!
 Thanks - I really enjoyed your article - I would definitely like to do this - what software are you using for backup - which one would you recommend?
 While every backup software I've tested has its own drawbacks, I have been reasonably pleased with Backup4All (have used it for over a year). Have a look at my backup software comparison page.
 In regards to rsync. I use that here to backup two offices to my home server. I also use deltacopy, but I have VPN routers between all three sites. Encryption is not needed (or is it?). My routers are also completely locked down, only mail (using non popular ports) and web browsing is allowed. Just my two cents...

Actually what I do is rsync to a local machine (lan) at each site, then rsync to my home server everything, so I have a local copy (in case of hardware issue) of the data and a remote(in case of a site issue). Then backup at home to tape
 Thanks for sharing your setup, Brian. It sounds like you have it well covered! If you have VPN tunnels set up between each site (and you are using the VPN), then you shouldn't need any additional encryption. The only limitation I see is that your backups (at home) are not encrypted, so it's worth considering the nature of the data and whether it warrants being encrypted in case of theft.
2008-07-20 Jeff Lansman
 There seems to be a lot of concern over dynamic IPs. You mention DynDNS but then talk about writing scripts. No scripts are needed if you use DynDNS. Instead of an IP address, DynDNS will assign a name to it, for example a hostname like "yourname.dyndns.org". You just need a client running on the friend's computer (or better yet, in their router) that updates this for you. The client monitors the IP and, if it sees it change, makes the hostname point to the new IP. If you use the hostname instead of an IP address you don't need any scripts at all, and you have a name that you can easily remember.
 Thanks Jeff -- Yes, certainly that method will work nicely. A script that I developed accomplished a similar result but without the need for DynDNS: the server monitors the IP address for changes and then informs the client software to update the IP address. However, DynDNS is a more transparent solution as it only really affects the server operation requirements.
2008-05-12 Daniel Britton
 The solutions above all sound quite complicated. I have just started using an online backup service. The reason for this is that I get a 150GB backup account for 4.95 per month (I actually back up over 200GB after compression), where I know all my data will be safe regardless of disaster, and their software will also run a local backup to my USB drive at the same time. This way I automatically back up all my data offsite and onsite. The other good thing is I can send them my data on a disk and they transfer it to their data centres, so I only ever run incremental backups and there is no need for the initial huge upload.
 Thanks Daniel -- Seems like a very reasonable option. The fact that they'll accept an initial full transfer by mail (although I didn't see it mentioned on the website) is great, as this is a big problem I see with many online backup providers. Doing a local backup at the same time is a great idea to help increase the immediate availability of the files too.

 Surprised it's not mentioned here, but using a [free] dynamic DNS service would take care of the DHCP modem address.
 For sure -- I did mention DynDNS in the paragraph under "Dynamic IP Addresses". However, I have now developed a small new tool that I will release soon -- it will notify you anytime your remote IP address changes so that you can update your backup destinations.
 To get around the dynamic IP problem, it may be possible to develop little utilities that run in the background on both the NAS server and the server being backed up, to send/receive the dynamic IP via email just prior to each backup.
 Very true. I have experimented with this type of automation in the past and came up with some approaches that could work reasonably well. I added a section to the above, entitled "How to Workaround Dynamic IP Addresses Changing". As I realized, having a script run on the NAS server itself wasn't an option for my installation, but it may be for others. Instead, using scripts running on another computer on the same LAN as the NAS provided this functionality quite easily.
 Good article and the comments are worth reading through as well. For a free NAS, one may want to consider NASLite... The base version is free, supports FTP for data transfer and telnet for status. Other versions have web-based interfaces, but of course, they cost $$$s.

I've only used this setup locally, but, I think the standard port forwarding and redirects should make the NAS available across a WAN.
As always, YMMV.
 Thanks Bill -- I've had a quick look and it does look promising. If you have a spare (obsolete) computer lying around, this would appear to be a great way to put it to use. Thx for sharing.
2007-04-03 Jos Schaars
 When using a LinkStation Pro (LS) as a remote backup server, you have a much better alternative: use the rsync daemon that comes enabled as standard on the LS:
  • 1. Set the Backup Disk option of the backup destination in Shared Folders of the UI of the LS.
  • 2. Forward port 873 in the router to the LS.
  • 3. Download DeltaCopy (Windows) and set it up to back up to the LS. You can have it send you an email each time the backup runs.
You will be amazed how little bandwidth/time you need to make a daily backup (after a first, preferably local, full backup is made).

If you want more control over the backup server or even want to backup two LS's to each other:
  • 1. Patch the LS's firmware with firmware version 1.03-0.51-jtymod5
  • 2. Clear the root password
  • 3. Use, for instance, PuTTY to log in as root to the Linux operating system of the LS.
  • 4. Change the file rsyncd.conf in /etc to:
    uid = root
    gid = root
    use chroot = yes
    [public name/alias backupdirectory]
    path = /mnt/disk1/backupdirectory/.
    read only = no
    hosts allow = <ip-address of client>
With a small script on the LS client I use it to do an automated daily and weekly backup between two LS's, sending a weekly report per email.

Thank you Jos for sharing this. Using rsync may be a far better solution in many ways. rsync only requires the client (your PC) to send segments of the files that have changed, not the entire files. So, the incremental backup process is far more efficient. If you have large files that need backing up, this could potentially be a huge timesaver.

The only tradeoff is that the client-side software (DeltaCopy) doesn't offer the same functionality as my current PC software solution. I'm interested in preserving rollback of backed up files, for example -- so it may be necessary to perform this type of task as a separate local script first, before launching DeltaCopy for rsync.

UPDATE 04/05/07: Thinking about this further, I don't think an rsync approach would actually save me any transfer time, as I always encrypt (and compress) my data before transmission. Performing either compression or encryption on the data means the byte-level file differential / incremental transfers will not work (the entire file changes). But for those who are able to do backups in a more secure environment / LAN (where encryption is not necessary), this may still work very well.
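To see why encryption defeats rsync's delta transfer, consider this toy demonstration. The "cipher" below is just a hashlib-derived keystream with a random IV, purely for illustration (it is not real AES and must not be used to protect anything); the point is that a fresh IV makes the entire output change even for identical input:

```python
import hashlib
import os

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Illustrative stream 'cipher': random IV + SHA-256 keystream.
    NOT real cryptography; for demonstration only."""
    iv = os.urandom(16)
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    body = bytes(a ^ b for a, b in zip(data, stream))
    return iv + body

data = b"exactly the same backup data" * 100
c1 = toy_encrypt(data, b"secret")
c2 = toy_encrypt(data, b"secret")

# Identical plaintext, yet the two ciphertexts agree almost nowhere,
# so a block-level delta tool like rsync finds nothing to reuse.
matching = sum(a == b for a, b in zip(c1, c2))
print(f"{matching} of {len(c1)} bytes match")
```

Real AES in a mode with a random IV behaves the same way for delta-transfer purposes, which is why encrypt-then-rsync sends essentially the whole file every time.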

I have considered installing my own firmware so that I could add some further functionality to the Linkstation, but wasn't aware of a suitable firmware build until now.

Thanks Jos!
2007-02-12 Susan Carley Oliver
 Excellent article. Thank you for taking the time to put this all together. Regarding security against hackers - isn't it possible to tell the NAS to accept incoming traffic only from a specific IP, and wouldn't that guard against interlopers?
 Glad you found the article useful! As for setting up the NAS to accept only accesses from a specific IP -- I wasn't able to find a way to configure this with my particular Linksys router. Some routers may allow you to specify an allow list from the WAN (like it does for the local LAN computers). If you have such a feature, this could work very well.

Unfortunately, if you are doing this on a budget, you are probably using a dynamic IP address assigned by the ISP. In that case there is a chance that periodically they will change the IP address, effectively barring you from communicating with your router / NAS. For the record, my dynamic IP address has not changed in several months.

