How would you like to have unlimited, free online backup for the rest of your life? What we’re about to show you is a “Do It Yourself” system that’s commonly used by people who want to cut corners and save money on their data protection.
In this step-by-step tutorial, we’ll show you exactly how to create your own “offsite backup” software from scratch – in about 3 minutes. All you’ll need is a remote FTP account – including username, password, and network address. That’s about as technically complicated as this tutorial is going to get.
If you’re technically savvy, you can create your own FTP server for free using old computers and surplus components. Otherwise, there are many other low-cost alternatives to get your own FTP account or server.
But first, we need to issue this warning: Storagepipe does not endorse this methodology. We’re only showing you this for educational purposes. After the tutorial, we’ll go back and discuss why you should never protect your data in this way.
In this session, I will be showing you how you can use Windows to back up a local folder to a remote server using FTP. I am using Windows 7 here, but the process is the same in other versions of Windows.
Please go ahead and open Notepad and start typing your FTP information. On the first line, type “user”, then a space, followed by your username. On the second line, type your password. On the third line, type “prompt”, and on the fourth line specify where on the remote server you would like the files to be uploaded: type “cd /your-directory”. Then type “bin” on its own line, followed by “mput”, a space, and the path to your backup folder on your local computer. Let’s say you want to upload a folder on drive C; you would type something like “mput C:\your-folder\*”, making sure to include the star (*) so it selects all the files within that folder. Lastly, type “quit.” That is our script so far.

Now go ahead and save this file and give it a name, any name; I am going to call it myScript. Make sure you give it an extension of .dat, and before you hit “Save,” change “Save as type” to “All Files.”

Now that we have our FTP credentials and local backup folder in this script, we need a process that runs every day and uploads the folder to the remote server using those credentials. For this, we will use Task Scheduler. Go to “Start” and open “Task Scheduler.” Expand it, right-click on “Task Scheduler Library,” and choose “Create a Basic Task.” Give your task a name; if you want to add more detail, you can do so in the Description box below. Now hit “Next.” In this window, you specify how often your script is going to run. It can be weekly, monthly, or daily; in our case we will select “Daily.” You can also specify the time your script will run every day, so I am going to select 8:00 AM.
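Assembled, the script file described above looks like this. The username, password, directory, and folder names are placeholders; replace them with your own values:

```
user your-username
your-password
prompt
cd /your-directory
bin
mput C:\your-folder\*
quit
```

The “prompt” line turns off interactive confirmation for each file, and “bin” switches the transfer to binary mode so non-text files are not mangled.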
Hit “Next,” and then make sure to select “Start a program.” Hit “Next” again and you will be prompted with a screen where you specify the program that should launch when this event occurs every day. In our case, we need to run our script from the command line, so we will specify PowerShell as the program and pass the FTP command in as arguments. In the Program text box, type “powershell.” Then, next to “Add arguments,” type “ftp”, a space, “-n”, a space, then “-s” and a colon (ftp -n -s:), followed immediately by the path to the script file you just saved. It could be anywhere; I have another copy of this same file on drive D, saved as FolderBackup.dat, so I will type D:\FolderBackup.dat. Then hit space and specify the IP address or domain the files are going to be uploaded to. Let’s go ahead and say “storagepipe.com.” Hit “Next” and “Finish.”

After the task has been created, you can locate it in the task list; just click anywhere in the list and start typing the task name. Once you have found it, right-click it and go to “Properties.” There is an option called “Run with highest privileges”; make sure it is selected, then hit “OK.”

I have an FTP client open here, pointed at the same remote directory the files are supposed to be uploaded to. As you can see, there are no files there at this moment. I have already created an identical task, configured with my own local backup folder and my own FTP server, so I am going to test that it works. Since the task normally runs every day at 8:00 AM, I will trigger it manually rather than waiting.
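Put together, the “Start a program” fields described above look like this. The script path and server name are the examples used in the video; substitute your own:

```
Program/script:  powershell
Add arguments:   ftp -n -s:D:\FolderBackup.dat storagepipe.com
```

If you prefer the command line, an equivalent daily task can also be created in one step with the built-in schtasks tool (the task name here is a placeholder):

```
schtasks /Create /TN "DailyFTPBackup" /TR "ftp -n -s:D:\FolderBackup.dat storagepipe.com" /SC DAILY /ST 08:00 /RL HIGHEST
```

The /RL HIGHEST flag corresponds to the “Run with highest privileges” checkbox in the Properties dialog.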
To prove this is working, I am just going to select it, right-click, and click “Run.” When you do this, as you can see, a command prompt opens with the file upload process taking place. When it is done, the script quits. If you go to your FTP client and refresh the folder, you should see that the files have been uploaded. That is all for now. If you have any questions, please let me know, and have a good day.
If this all sounds too good to be true, then it probably is. As mentioned at the beginning of this video, this is a horrible way to back up your data. There are many important reasons why, but it would take hours to list them all. So instead, I’d like to just highlight a few of the most important problems.

1) You have no way of knowing if the files were successfully transferred. We commonly see situations where companies have gone years without a successful backup, because they simply assumed that their backups were working properly, without any monitoring or verification.

2) You’re only creating a single backup version. Every time you back up, you’re overwriting the latest remote copy with an identical copy of what’s on your hard drive. If a virus corrupts all of the files on your local machine, your backup data will be overwritten with this corrupted data. Now you have 2 sets of corrupted files, and no valid backup copies. A proper backup solution should provide you with the ability to roll files back to a previous point in time.

3) This is an insecure solution. Not only is the data transmitted over an insecure protocol, but the remote backup copy is also kept unencrypted. Theft of poorly protected, unencrypted backup media is one of the leading causes of data leaks and security breaches.

4) With this solution, you’re only keeping a single copy of your backups. If your remote FTP server crashes, you’ll be unable to back up your data. And there’s no system for alerting you that your backups are no longer taking place. This is especially true for people who use certain varieties of low-end, consumer-grade Network Attached Storage devices, which are notorious for breaking down.

5) Depending on your upload speeds and the size of your data, it could take days, or even weeks, to transfer all of your files over.
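The first problem, silent failure, is the easiest to underestimate. A minimal sketch of what verification even looks like is below. This is illustrative Python, not part of the tutorial’s script, and it assumes you can also obtain a matching manifest of what is actually on the remote server:

```python
import hashlib
from pathlib import Path

def manifest(folder: str) -> dict:
    """Map each file's relative path to the MD5 digest of its contents."""
    root = Path(folder)
    return {
        str(p.relative_to(root)): hashlib.md5(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def missing_or_corrupt(local: dict, remote: dict) -> list:
    """Files absent from the remote copy, or whose digest differs."""
    return sorted(name for name, digest in local.items()
                  if remote.get(name) != digest)
```

Without a check like this after every run, a backup job that has quietly stopped working looks exactly like one that is succeeding.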
And if your FTP server is kept at a remote location (as it should be), then you might end up incurring overage fees and penalties for exceeding your ISP’s transfer limits. There are hundreds of other reasons why this DIY approach is a terrible way of backing up your data. But despite these warnings, this method, and other similar approaches, continue to be commonly used for automated off-site data protection. Yes, it’s cheap. Yes, it’s convenient. No, it won’t protect you. And for some people, 2 out of 3 is just fine for their needs. But if you’re serious about protecting your data through off-site backup, you should look for a serious, business-grade solution that ensures your data remains properly protected at all times. At a very high level, here are a few things you should look for:
- Data is stored in secure facilities.
- Backups are monitored for consistency, and technicians are there 24/7 to fix any problems.
- Data is transferred securely over SSL, and remains encrypted while in remote storage.
- Backups maintain many historical file versions, so that corrupted files can be rolled back to previous uncorrupted recovery points.
- Only files that have changed since the previous backup cycle are transferred. This incremental approach ensures faster backups and more efficient bandwidth usage.
- At least 2 backup copies are kept, so that you have backups of your backups.
- Backups should be stored far away from the primary computers, ensuring that they are not destroyed in the event of theft or disaster.
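The incremental point in the list above can be sketched in a few lines of Python. This is an illustration of the idea, not code from any particular backup product: record each file’s size and modification time after a run, and next time select only the files whose signature has changed:

```python
from pathlib import Path

def snapshot(folder: str) -> dict:
    """Record (size, mtime) for every file; a cheap change signature."""
    root = Path(folder)
    return {
        str(p.relative_to(root)): (p.stat().st_size, p.stat().st_mtime)
        for p in root.rglob("*")
        if p.is_file()
    }

def changed_since(folder: str, last_snapshot: dict) -> list:
    """Files that are new, or whose size/mtime differs from the last run."""
    current = snapshot(folder)
    return sorted(name for name, sig in current.items()
                  if last_snapshot.get(name) != sig)
```

Only the files returned by changed_since would need to be uploaded, instead of re-sending the entire folder every day as the mput script does.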
But most importantly, you need a backup plan that’s custom-tailored to your needs. Talk to a backup professional, and implement a plan that’s efficient, automated, and leaves you truly and completely protected. A single critical data loss incident is all it would take to shut down a business forever. Never cut corners when it comes to the security of your priceless and irreplaceable digital assets.