Computer Backup

Computer backup has been discussed a few times in the past, and since it is quite a deep topic with a number of solutions (many of which are DIYable), I thought I would start a wiki page:

formatting link
This is just an outline at the moment; hopefully more detailed and practical stuff can be hung onto it as it develops.

So suggestions for what it should include and where it should lead would be welcome.

It could link on to articles on building out DIY NAS solutions, or integrating cloud storage, or software automation suggestions and solutions.

It could have machine / OS specific sections on things like using Time Machine or Windows Backup.

Sections on drive cloning with Macrium Reflect, Acronis, etc.

It could include links and recommendations to products (or combinations of) that have worked well for you etc.

What are your thoughts?

Reply to
John Rumm

I don't like storing my stuff in the cloud. No privacy there. It can be hacked.

Cloning the drive every time you back up is too tedious.

Personally I try to use the "portable" versions of software if I can find them. Portable versions of software are not registered in the Windows registry; the whole thing is self-contained in a folder and can run right away when copied to another computer.

For example, I use Firefox Portable, Google Chrome Portable, and Microsoft Edge Portable as browsers:

formatting link
I use Thunderbird Portable for email:

formatting link
There are all sorts of portable programs on my computer, too many for me to list them all here.

For backups, I use an external 2.5 inch USB 3 hard drive.

The backup software I use is called "SyncFolders" Portable:

formatting link

Reply to
invalid unparseable

I just use Drive Snapshot, which saves the partition or drive to a compressed, encrypted file. One can also open the encrypted file to copy individual files over. I have used it for years and it is quite robust. I have copied an encrypted file over to other computers and run the operating system quite successfully from there.

Reply to
jon

A very interesting project. It could cover:

- Should I keep my data on a separate drive/partition to the OS? Pros and cons of the above.
- What software can I use? Mine takes parameters so it runs from a batch file.
- How to check that it worked.
- A grab-it-and-run drive.
- Offsite storage, pros and cons.

Experience of what happens if you don't have a backup. I lost a file recently that was part of a program I was writing and had a lot of work put into it. Never did discover why, but I found a copy on my grab-it-and-run drive that saved the day.

Reply to
Jeff Gaines

Time Machine is easy to use. It can be configured to back up to more than one device on a rotating basis. You can tell it to exclude files or folders from being backed up. All this is trivially configurable from System Preferences. Backups are incremental unless it's the first backup.

It backs up every hour and keeps the last hourlies for 24 hours, the last dailies for a month, and then weeklies for as far back as it has space.

For restores, if you go into TM then it presents you with a simple interface to select a particular backup. All backups are presented as if they are full backups - it does this by using hard links in each incremental, to point to the last backup of files that were not backed up in that particular incremental. The interface looks just like the Finder and so you navigate through it in a familiar way to find the files/folders you want to restore. In restoring, you can either restore to a location of your choice (e.g. the Desktop) or have the restored items replace the originals (I never do this). I believe there is a CLI interface too, which also allows adjustment of parameters such as the backup frequency, but I've never used it.

tl;dr: easy to set up, use, and restore from. When I want to restore something it's usually urgent and I don't have time to fart about with a CLI in order to do so. Very occasionally TM gets its knickers in a twist and backups disappear.
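For anyone curious, the CLI mentioned above is tmutil. A minimal sketch of the basics, from memory, so verify against man tmutil (the restore path is hypothetical):

  # List all completed backups on the current destination
  tmutil listbackups

  # Exclude a folder from future backups
  tmutil addexclusion ~/Downloads

  # Kick off a backup now rather than waiting for the hourly run
  tmutil startbackup

  # Copy a file out of a backup, preserving its metadata
  tmutil restore "/Volumes/TM/Backups.backupdb/mymac/2024-01-15-103000/Macintosh HD/Users/me/Documents/report.txt" ~/Desktop/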

AIUI, newer versions (on APFS volumes) work (IMO) less well: they have this snapshot concept which seems to mean that all backups are full backups. How this can be better beats me. However I've had no exposure to that.

Reply to
Tim Streater

Well, lots of companies with your data are already doing it. Certainly we are. If you don't trust the encryption schemes, there's not much anyone can do for you.

Personally I use an outfit called iDrive. At work I've split it between Backblaze and AWS S3 Glacier Deep Archive. The latter is more an archive, as it takes up to 3 days to recover. But it is *very* cheap.

Anything in the same building isn't really a backup. There can be plenty of situations where you can't physically access the media and your network is down. A fire for example.

There are 2 elements to backup. The machine. And the data. Again, at work we have whole machine backups. However we can also recreate machines from Puppet and then just load the data from the last backup. Having timed that at about 2 hours from bare metal, it makes running a hot standby at £ £/month a non-starter.

Reply to
Jethro_uk

But what do you do if you want to get a several weeks old version of a file? Personally I've found over the years that I have much more frequently needed to restore a file that I either deleted or messed up in some other way rather than restoring a whole system after something has broken. Computers are much more reliable than me! :-)

I don't save clones of great chunks of my system. I have a warm[ish] backup system that can take over the major functions if my main desktop/server dies completely. What I *do* have is two sets of incremental backups of all important files: one set is on the main server and does hourly/daily/weekly backups going back a month; then there is a remote backup machine on which I keep long-term backups, daily/monthly/yearly, which go back several years. It's surprising how often I find I need to go back for a file that's several months old.

If a major system does blow up then I'll create a new system and restore my personal files and configuration from the incremental backups.
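A minimal sketch of one way to build hourly/daily incremental sets like those, using rsync hard links in the same spirit as the Time Machine mechanism Tim described (all paths hypothetical):

  #!/bin/sh
  # Each run is browsable as a complete backup, but unchanged files
  # are hard links into the previous run, so only changes take space.
  SRC="$HOME/important"
  DEST="/backup/incrementals"   # hypothetical backup disk mount
  STAMP=$(date +%Y-%m-%d_%H%M)

  # On the very first run 'latest' won't exist yet; rsync just warns
  # and copies everything.
  rsync -a --delete --link-dest="$DEST/latest" "$SRC/" "$DEST/$STAMP/"

  # Repoint 'latest' at the run we just made
  rm -f "$DEST/latest"
  ln -s "$DEST/$STAMP" "$DEST/latest"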

Reply to
Chris Green

For the benefit of other readers... I wondered why I hadn't heard of Time Machine - it's built-in software on a Mac.

Reply to
wasbit

Probably the best article I've read on backing up. A lot of work went into that.

FYI, the first paragraph of Terminology/What is it reads: "that if something goes wrong, like you hard drives just fails without warning". I would suggest: "like your hard drive fails without warning".

Thanks.

Reply to
wasbit

Doesn't matter if you are backing up by directory rather than by partition.

Depends on the OS and what you are trying to achieve. Mine runs Linux rsync from a script.
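A minimal sketch of that kind of script (mount point hypothetical):

  #!/bin/sh
  # Nightly mirror of the home directory onto a separate backup drive;
  # --delete keeps the copy exact by removing files gone from source
  rsync -a --delete /home/user/ /mnt/backupdisk/home/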

IME the major cause of data *loss* has been drive corruption or drive failure. Ergo, in a domestic (as opposed to commercial) situation, a separate *drive* for backup is advisable. Total machine loss due to theft or fire is very unlikely, and in that case data loss is probably the least of your worries, but you could use a USB portable drive to occasionally copy your backup disk onto.

I am afraid that, having run early 'cloud' type setups, I wouldn't trust any of them not to have a sysadmin that reads the data, or that can decrypt it, or whose decryption gets borked so you lose all the data anyway. And your local bandwidth for dumping data to your own disk is likely to be way faster than anything you can get over the net to the cloud.

My final comment is that the cloud is largely for people who are too technically inept to do it for themselves.

The fact that that includes major banks etc. is food for thought.

That is another thing a daily backup is handy for. If you accidentally delete something, last night's version is still on your backup.

Reply to
The Natural Philosopher

ok ta, fixed...

(the number of times my brain said "your" and my fingers typed "you" was quite surprising - I found half a dozen before even posting the link!)

Reply to
John Rumm

Some minor comments:

- cloud backups and privacy. Use encryption (a sketch follows these lists).

- if the hard drive fails, you need backups. But I mirror *all* my hard drives (Windows and FreeBSD), so if a drive fails I take it out of the mirror, replace it with a new one, and re-establish the mirror. That takes about two minutes max on any of the 11 machines I run, faster on some. This is NOT backup, but it saves a lot of hassle by keeping downtime very small (a sketch of the replacement step follows this list).

- think about other failure modes; power surges for example (I use smart UPS units to minimise the risk).

- what if the house burns down? Are all your backups in there? Copying to an external drive (well, I hope there is more than one in the cycle) is all very well, but if that is burned too, all bets are off. I keep cloud copies, and physical copies in a storage unit ten miles away on high ground.
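On the FreeBSD side, that take-out-and-replace step might look like the following, assuming the mirrors are ZFS (the post doesn't say which mirroring is in use; pool and device names are hypothetical):

  # See which half of the mirror has failed
  zpool status tank

  # Swap the failed disk for the new one; ZFS resilvers the mirror
  zpool replace tank ada1 ada2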

I use:

- simple copies of a FreeBSD backup (Windows home directories are on a FreeBSD server)

- vital files on tarsnap (only applicable to UNIX-like systems), which is cheap but costs enough not to do everything. It is incremental in nature, but with deduplication. I keep every day for a month, every month for a year, every year indefinitely.

- every 16 weeks, a complete copy of it all on DVD. Two copies, one at home and one in storage. I once discovered I had screwed up a file 8 years previously, and this saved me.

- AWS Deep Archive, as previously noted.
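A minimal sketch of the encrypt-before-upload idea from the first point, combined with Deep Archive (assumes the AWS CLI is configured; the bucket name and paths are hypothetical):

  # Pack and encrypt locally, so the provider only ever sees ciphertext
  tar -czf - "$HOME/vital" | gpg --symmetric --cipher-algo AES256 -o vital.tar.gz.gpg

  # Push to the cheap, slow-to-restore tier
  aws s3 cp vital.tar.gz.gpg s3://my-backup-bucket/ --storage-class DEEP_ARCHIVE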

Reply to
Bob Eager

May be worth discussing. I have no data on the "C:" drive - it's OS only, so I have nothing to back up on it* - and if I have issues I can do a fresh install of C without losing data. There are some other points though, particularly with permissions.

  • The exception is things like Brave, which keeps bookmarks there.
Reply to
Jeff Gaines

I've been using ZFS at home as my main fileserver since it was first available internally to Sun's engineers. As I moved more to laptops, that system has moved more to being my backup server, but I still have live data on it too. It contains daily snapshots of all my filesystems going back to March 2007, just because that's so easy to do with ZFS, and it has occasionally been useful. Every so often, I mirror it all off-site.

I now mainly work on MacBook Pros, and I run ZFS on them. They get snapshotted regularly, and the snapshots are sent to the backup server. I have done some full restores from backups, when the SSD died in one laptop and when I bought a new laptop. Individual files can be restored from the local snapshots, unless I need to go back more than a year, in which case those snapshots are only retained on the backup server (but I can still access all the individual files in them). There are some similarities to Time Machine, but ZFS backups and restores are so much faster, and there's none of the time wasted working out what needs to be sent - ZFS knows that instantly.

My x86 backup server is ancient (a 2007 PC, although disks, PSU, motherboard caps, memory, fans, etc. have been replaced over the years) and too power hungry given it's on all the time and the current price of electricity. I have just started building a replacement using a Raspberry Pi 4 and SSDs, which is working well: not quite as fast as the PC, but still plenty fast enough. The new Pi 5 would be better (particularly given it has hardware crypto, and some of my filesystems are encrypted, which is quite CPU intensive on the Pi 4) and I'll probably switch over, but that's a low priority currently.

Both old and new systems use ZFS, and verify the full data integrity weekly (zpool scrub). That's spotted two instances of silent data corruption (disks starting to return incorrect data) since 2007, and automatically fixed it (disks are mirrored). This prevents bitrot going unnoticed in rarely/never used files.
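For anyone who hasn't met ZFS, the day-to-day of what's described above boils down to a handful of commands; a minimal sketch (pool, filesystem, and host names are hypothetical):

  # Take today's snapshot of a filesystem
  zfs snapshot tank/home@2024-01-15

  # Send only the blocks changed since yesterday's snapshot
  # to the backup server
  zfs send -i tank/home@2024-01-14 tank/home@2024-01-15 | \
      ssh backupserver zfs receive pool/backup/home

  # Weekly integrity check of every block against its checksum;
  # mirrored copies repair any corruption found
  zpool scrub tank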

I've worked for a bank, managing around 20,000 disks around the world, all running ZFS. We had some really bad hardware failures (such as too many disks failing in RAID6 (RAIDZ2) arrays, and terrible operator errors replacing the wrong disks), but we never lost a single ZFS filesystem. The worst incident, in a filesystem with around 4 billion files and too many failed disks for a classic RAID array to survive, was that we lost about 1000 files - and ZFS told us exactly which files, so we could just copy them back rather than restoring the full 4 billion. I couldn't imagine myself ever using anything else, but obviously it's a learning curve for anyone not familiar with it.

Reply to
Andrew Gabriel

Surely the biggest cause is human error - it certainly is in my case! :-)

Reply to
Chris Green

No mention of grandfather-father-son backups, so there are always three copies available in case a single backup becomes non-restorable? And of using the oldest (grandfather) to be overwritten with the latest?
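A minimal sketch of rotating through three sets like that (mount points hypothetical; the week-number trick is just one way to pick a set):

  #!/bin/sh
  # Week number since the epoch, modulo 3, picks which of the three
  # sets (grandfather/father/son) gets overwritten this week.
  SET=$(( $(date +%s) / 604800 % 3 ))
  rsync -a --delete "$HOME/" "/mnt/backupset$SET/home/"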

What about the possibility of "backing up" the OS separately from the data? Linux Mint can do something like that by using "Timeshift". It makes a snapshot copy of the OS (and not the data unless you tell it to include that). Then if you install a program or do something which borks the system, you can get Timeshift to reset the OS to how it was before you made the changes.
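Timeshift also has a command-line side; roughly, from memory, so check timeshift --help (the comment text is just an example):

  # Snapshot the OS before a risky change
  sudo timeshift --create --comments "before driver update"

  # List snapshots, then roll one back
  sudo timeshift --list
  sudo timeshift --restore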

Might also be worth a mention that, for example, good quality USB sticks should be used for backups.

Reply to
Jeff Layman

Indeed... However, it is possible to access Time Machine backup data under Windows if you give it the capability to mount an HFS+ file system (tools are available, some free, some paid for), which allows for data recovery even if you don't have another Mac to hand.

(MS have evolved their own backup programs over the years, such that the most recent versions are actually quite Time Machine-like. My main grumble with MS backup offerings is that there have been plenty of times when backups generated on earlier versions are not restorable using later ones.)

Reply to
John Rumm

Nice :-)

You fancy adding a section on ZFS*, its use, config etc, and what benefits it brings?

(*not something I have used)

Reply to
John Rumm

And if you enter %appdata% into the address bar of a window, where does it take you?

By default it will be in c:\users\<user>\AppData\Roaming\. However, if you move your home directory, that will usually go with it.

It's one to watch for, since things like Thunderbird will store all their files in the %appdata%\Thunderbird folder. Office might keep local customised templates and stationery there, Sticky Notes will save its data there, and so on.

Reply to
John Rumm

The Windows 7 Backup in the Control Panel on modern Windows already uses containers for backup storage. The latest version uses .vhdx containers (>2TB capability for the container) while around the Windows 7 era the container was .vhd (<2TB size limit).

Even disk2vhd by Russinovich of Sysinternals, converts a disk to containers with only the active files taking up space.

In fact, lots of the 20-30 backup tools for Windows, use smart backups and standard or proprietary containers, with runtime mount capability.

What you're referring to, then, is the norm. It is standard operating practice. And the use of encryption in your case prevents further compression of the container by "better compressors" such as 7-Zip. I compress old backups to make room for new backups, so the ability to fiddle with the compression is important. When I make a backup, I tell the backup program NOT to compress the image; then I compress it with the better compressor later. You only apply encryption to things when your best effort at compression is complete, since encrypted material doesn't compress.
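A minimal sketch of that compress-then-encrypt ordering (filenames hypothetical; assumes 7-Zip and GnuPG are on the PATH):

  # The backup tool wrote backup.img uncompressed; squeeze it with
  # the stronger general-purpose compressor first
  7z a -mx=9 backup.7z backup.img

  # Encrypt only after compression - encrypted data won't compress
  gpg --symmetric --cipher-algo AES256 -o backup.7z.gpg backup.7z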

Paul

Reply to
Paul
