Backup / imaging software

Doesn't do the incremental he needs.

Reply to
Rod Speed

You can see the files in it, because the backup consists of a .vhd or a .vhdx file per partition (from the aptly named "Windows 7 Backup").

I think I caught my Windows 11 Home being willing to mount a .vhdx file the other day. Several of the OSes will mount a .vhd, and the 7-ZIP file compressor will open a .vhd and let you extract files from it. Working with .vhdx files is harder.

I would agree with you that the .vhdx file format is mostly a dead end (it used to depend on the Hyper-V tools for mounting), but they may have relented at some point and added that format to the "Mount" option in the Explorer right-click menu. You can also use the Disk Management "Attach VHD" option, which is also a mount. To use "Attach", first left-click the leftmost box in any disk row in Disk Management, to make the "Attach" items move out of their greyed-out state in the top-bar menu.
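If the GUI routes are greyed out, the same attach can usually be done from an elevated Command Prompt with diskpart. A minimal sketch, using one of the paths from the listing below purely as an example:

  diskpart
  DISKPART> select vdisk file="F:\WindowsImageBackup\Petunia\Backup 2022-10-18 124040\Esp.vhdx"
  DISKPART> attach vdisk readonly
  (copy whatever you need off the mounted volume, then)
  DISKPART> detach vdisk

Attaching read-only is a good habit when poking around inside a backup, so nothing in the image gets dirtied.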

This is an example of a GPT partitioned disk and its Windows 7 Backup output. I've removed a number of small XML files from the listing.

Directory of F:\WindowsImageBackup\Petunia\Backup 2022-10-18 124040

10/18/2022  08:53 AM              14,114 22c90387-6ed1-405a-90da-4ec604845ff6_Components.xml
...
10/18/2022  08:53 AM          67,108,864 Esp.vhdx                                    <=== EFI GPT Boot Info FAT32
10/18/2022  08:53 AM      43,634,393,088 7b1b05b6-42c7-428d-a0ca-4204a1df973e.vhdx   <=== Full C: partition NTFS
10/18/2022  08:53 AM         524,288,000 cd183cf7-3c3f-4556-abe5-e911dfb0d8f7.vhdx   <=== System Reserved (WinRE.wim) NTFS
10/18/2022  08:53 AM               1,528 BackupSpecs.xml

This is an MSDOS partitioned disk, so no ESP partition.

Directory of F:\WindowsImageBackup\WAFFLES\Backup 2017-12-10 075728

12/10/2017         413,138,944 00012035-0000-0000-0000-d0bf17000000.vhdx   <=== System Reserved
12/10/2017      23,037,214,720 00012035-0000-0000-007e-000000000000.vhdx   <=== C:
12/10/2017               1,276 BackupSpecs.xml

This one was generated in actual Windows 7, so it uses the .vhd format, which has a 2.2TB maximum file size. 7-ZIP will open these files. Windows XP will even mount a .vhd, but you need VHDMount, the tool Ben Armstrong wrote at Microsoft, to do the mount, as the command is not built into WinXP as such.

Backup 2011-10-17 011010

10/16/2011  10:13 PM          41,956,352 203917be-715a-11df-93fa-806e6f6e6963.vhd
10/16/2011  10:55 PM      26,680,273,408 203917bf-715a-11df-93fa-806e6f6e6963.vhd
10/16/2011  10:21 PM               1,186 BackupSpecs.xml

*******

As for the part of the backup system that generates a series of ZIP files on your external drive, you cannot take a victory lap just because you see the ZIP file extension, because there is a tiny "trick" to recovery. Several sequential files have to be catted together for "whole" files to come out of the process. If you pass just one file to WinZIP or similar, the tool will be unhappy; you have to "assemble" an archive first and feed that to your tool for the thing to work.
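A rough sketch of that assembly step, assuming the pieces are numbered in sequence (the filenames here are placeholders, not necessarily the exact names the tool writes):

  rem Windows Command Prompt: join the pieces, in order, into one archive
  copy /b "Backup files 1.zip" + "Backup files 2.zip" + "Backup files 3.zip" assembled.zip

  # or the equivalent on Linux
  cat "Backup files 1.zip" "Backup files 2.zip" "Backup files 3.zip" > assembled.zip

Then point WinZIP or 7-ZIP at assembled.zip.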

*******

While Microsoft's various bits and bobs make a fascinating archaeology study, they're hardly what you paid for. The integration could be better.

Paul

Reply to
Paul

This article is from six years ago, so the info could be dated (for instance, it predates the decline of Acronis into its current cruftware form). There was one version of Acronis that was very nice. Think of that old version as "peak Acronis".

Free Imaging Software
---------------------

AOMEI Backupper 1.6          Free    incremental ?
Easeus Todo Backup 6.5       Free    incremental ?
Paragon Backup & Recovery    Free    incremental ?

There are three products seemingly offering Free Incremental.

You'd have to test those, and see if they're just trialware in disguise.

Paul

Reply to
Paul

rsync copies only the changes needed to keep the backup the same as the original. How is that not incremental?
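For reference, the sort of invocation being described, with placeholder paths:

  rsync -a --delete /home/user/ /mnt/backup/user/

Each run transfers only what changed since the previous run, and --delete keeps the copy identical to the source.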

John

Reply to
John Walliker

Announced 25th Nov

This is to notify that Macrium Reflect Free Edition is being retired. Security patches will be provided until 1st January 2024, but there are no planned feature changes or non security related updates following this update.

This only applies to the free version.

Reply to
alan_m

When doing a restore, you want to return the source machine to "as it was". This is the output of a Windows backup tool, in a scenario where, every day, the same 2GB of files are being edited and changed.

Monday     Full            20GB
Tuesday    Incremental 1    2GB
Wednesday  Incremental 2    2GB
Thursday   Incremental 3    2GB
Friday     Incremental 4    2GB

Now, the source machine is wiped out (bad disk drive). And you decide that Wednesday was the last day that files were in good shape. You want to return to Wednesday.

Using a new disk drive, you

Install Full, then Incremental 1, then Incremental 2

And now the machine is as it was at the end of Wednesday.

If you use Rsync, you are maintaining a single backup at the destination end. When you go to restore, can you restore to Wednesday easily ? When your remote storage is now at "Friday" state ?

With real backup software, the remote machine has 28GB of files and can restore to any end date.

With the Rsync machine, the thing might contain around 22GB of files, and not all days are represented in the store. Just "Friday" is the store state at the moment, and the frequently changing files are the ones from "Friday".

You could make the point "well, I store a copy of the Rsync destination each day". OK, the remote machine has

20GB + 22GB + 22GB + 22GB + 22GB or so of files, which is roughly 108GB and a lot more than the traditional backup scheme with its 28GB of files. The OP wants economical storage, plus a daily choice of restore date ("Wednesday if I want").
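A sketch of what that per-day arrangement looks like (paths and directory names are only examples); each run goes to its own dated directory, so the unchanged files get stored all over again every day:

  rsync -a --delete /home/user/ /mnt/backup/monday/
  rsync -a --delete /home/user/ /mnt/backup/tuesday/
  rsync -a --delete /home/user/ /mnt/backup/wednesday/
  ...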

This may not matter when the backups are small, but my 3TB backup here is going to blow out that drive on storage.

Using deduplication software or other means, you can likely manufacture a piece of software with the right semantics, but that's a lot of work when one of the free Windows ones will do it for nothing. Plus, the Windows ones will preserve the arcane Windows permissions. Windows is "full of things" intended to break software, such as the Reparse Points that Linux does not know how to parse, which cause Linux to report "I/O Error" when you try. Try rsyncing the System32 contents. Or the WinSXS contents. WinSXS has items that are hardlinked, and it is considered a "software fail" if your method unlinks the files by accident. (The hardlinks are part of the Microsoft maintenance strategy.)

Your method must be prepared to detect and handle hardlinked files. I've even had Microsoft software make that mistake! (Robocopy doesn't do the right thing all the time, and you need to double check what is going on. I tried to use my usual command and it screwed up the hardlinks.)
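On Windows you can at least spot-check whether a particular file is hardlinked, for example with fsutil (the path here is just an example):

  fsutil hardlink list C:\Windows\System32\notepad.exe

If it prints more than one path, the file has multiple hardlinked names, and a naive copy will silently turn them into independent duplicates.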

A Linux method for a Linux file system is a lot safer.

Microsoft has converted NTFS into a minefield. It's full of dirty tricks. On one release, there was a section of the filesystem turned into a namespace, loaded with unprintable Unicode characters. I tried to paste that into Thunderbird to ask about it, and it could not even be successfully pasted (stuff would go missing). They removed that idea on the next release, as I expect it broke some of their tools as well.

Your "dd" idea is a lot safer :-) I can find little fault with the basic integrity of a dd backup. You know everything you want, is in there. And a backup.dd can be opened with

7-ZIP tool in Windows, if you need to extract a file from your Downloads. 7-ZIP GUI can examine any Windows partition in a .dd , without the hassle of traditional mounting. The experience with large containers, looks similar to this. Burrowing into a C: in an archive. I would have used a .dd for this but I don't have one handy. [Picture]
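For what it's worth, a sketch of the sort of whole-disk dd capture being talked about; device and file names are examples only, so check yours with lsblk first:

  sudo dd if=/dev/sda of=/mnt/backup/backup.dd bs=1M status=progress

The resulting backup.dd is what 7-ZIP can then browse, partition by partition.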

Paul

Reply to
Paul

A link would have been nice but thanks for the info.


- tinyurl.com/3vtj2aaj

Reply to
wasbit

I used to do this for a children's PC, before they switched to tablets for homework and social use. The backup target was a zvol (a pseudo raw disk slice under ZFS), which I snapshotted after every dd, giving incrementals.

Recovering individual files tends to be more common than a bare metal restore (which I only ever did once). zvols (including any of the snapshots) can be exported back to the original PC as an iSCSI volume, allowing retrieval of individual files from any of the snapshot dates.

This required that the backup system ran an OS supporting ZFS and iSCSI. At the time I used Solaris, but there are now many OSes which support OpenZFS (including Windows, although I have not tried that).
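Roughly the shape of that workflow, with pool, volume and device names that are only examples (the zvol device path shown is the Linux one; Solaris puts zvols under /dev/zvol/dsk/):

  # on the backup server: a zvol at least as big as the PC's disk
  zfs create -V 500G tank/kidspc
  # image the PC's disk into the zvol, here piped over ssh from the PC
  ssh kidspc "dd if=/dev/sda bs=1M" | dd of=/dev/zvol/tank/kidspc bs=1M
  # snapshot after each run; the snapshots are the incrementals
  zfs snapshot tank/kidspc@2014-05-01

A snapshot can later be cloned with zfs clone and exported as an iSCSI LUN, so the PC can mount it and pull individual files back out.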

That was 10 years ago. If I was doing this today, I would probably use a different solution, although it would still use ZFS in some form for the back-end, and possibly the primary storage.

Reply to
Andrew Gabriel

It is incremental in the sense that it only backs up stuff that has changed since last time. This makes it an efficient way of synchronising a mirrored copy of a set of files.

However, synchronisation is not a proper incremental backup, because it won't allow you to restore a previous version of files that have already been changed locally and then resynced to the mirror.

Backup will let you recover not only a missing file, but also one that you still have access to but realise was corrupted or altered in some way, such that you want to go back to an earlier version.

Reply to
John Rumm

When I took over as Treasurer of a branch of a charity, I was concerned about backup of the accounts. My decision was to use a cloud-based backup. That way, if the house burned down, the accounts would still be safe.

Reply to
charles

Have you done a test-restore onto a brand new blank drive ? (You know, that drive you keep as a spare, on-site. You use that drive to test your bare-metal restore is working.)

I've learned the hard way, from more than one experience at work, that the restore does not always work. On a cloud system, you have to do at least one restore, to prove there actually is a backup, and that it has integrity.

Making the backup is only half the story.

One guy at work was backing up chip designs to tape. And all of the tapes ? When we tested them later, they were all blank, because the tape drive heads were filthy.

On my Macrium .mrimg files here, out of boredom, I would occasionally load one and "do a Verify". And bloody hell, I was doing something else when a dialog popped up, saying the backup I was testing was corrupted. I loaded up a second one, did a Verify, and it was corrupted too. Reason ? Bad RAM on the machine being backed up. Memtest was not able to fault-isolate to the nearest stick, so I had to replace all four sticks just to be sure. I wasn't being serious-minded there, it wasn't a "100% verify" thing, just screwing around when I noticed that.

Paul

Reply to
Paul

If I have time when doing a full drive-image backup, I run sha256sum on the source and destination drives just to make sure I have a truly bit-exact copy.
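Something along these lines, with device names as examples only; both reads have to cover exactly the same number of bytes for the hashes to be comparable:

  sudo sha256sum /dev/sda /dev/sdb
  # if the destination drive is larger than the source, hash the same span of it instead:
  sudo dd if=/dev/sdb bs=512 count=$(sudo blockdev --getsz /dev/sda) | sha256sum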

John

Reply to
John Walliker
