I would add:
- Not *too* regularly - because you don't want to be in the position of
having backed up a compromised system so many times that the last
"Good" version has fallen off the end of the backup list.
- Learn the discipline of separating "System" from "Data" and put
data on a separate physical device or, at least, a separate partition.
- Use an Imaging utility that allows you to browse the image
and copy files from it as if it were just another drive.
This is because, inevitably, you will not be perfect in your
practice of not keeping data on the System - and it will allow
you to recover once you realize the error of your ways.
- Keep a change log where you note when and which programs
have been installed/uninstalled... and any other system changes.
Then you can image the System only a few times - once when you know it
is "Good"... and then whenever changes in the log accumulate past a
certain point - and take incremental backups of the "Data"
drive/partition as often as desired (a rough sketch of the incremental
part is at the end of this post).
Probably way too complicated for the user who thinks they have a toaster
or a blender instead of a computer... but it seems like minimal
basic hygiene to me.
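For the "incremental backups of the Data drive" part, a few lines of Python
are enough. This is just a rough sketch, untested; the D:\Data and E:\Backups
paths and the last_run.txt stamp file are made-up placeholders - point them
at wherever your data and backup drives actually live. Each run copies only
the files changed since the previous run into a new time-stamped folder:

# Minimal incremental-backup sketch.  Assumptions: data lives under
# DATA_ROOT, backups go to BACKUP_ROOT on another physical drive; both
# paths (and the stamp file name) are placeholders for illustration.
import shutil
import time
from datetime import datetime
from pathlib import Path

DATA_ROOT = Path(r"D:\Data")        # hypothetical "Data" partition
BACKUP_ROOT = Path(r"E:\Backups")   # hypothetical external/backup drive
STAMP_FILE = BACKUP_ROOT / "last_run.txt"

def last_run_time():
    # Timestamp of the previous backup run, or 0 if there has never been one.
    try:
        return float(STAMP_FILE.read_text().strip())
    except (FileNotFoundError, ValueError):
        return 0.0

def incremental_backup():
    since = last_run_time()
    dest_root = BACKUP_ROOT / datetime.now().strftime("%Y-%m-%d_%H%M%S")
    copied = 0
    for src in DATA_ROOT.rglob("*"):
        if src.is_file() and src.stat().st_mtime > since:
            dest = dest_root / src.relative_to(DATA_ROOT)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)     # copy2 keeps modification times
            copied += 1
    STAMP_FILE.parent.mkdir(parents=True, exist_ok=True)
    STAMP_FILE.write_text(str(time.time()))
    print(f"Backed up {copied} changed file(s) to {dest_root}")

if __name__ == "__main__":
    incremental_backup()

Schedule it (Task Scheduler, cron, whatever) as often as you like; the full
System image only needs refreshing when the change log says it's due.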
Good advice. At work we only have 6 computers, but each one has all
its data files copied to another device "just in case". You can get a
USB drive for a few bucks for a quick backup.
You can always beg, buy, borrow, or steal a new word processing program,
but you can never buy back those individual files of your own.
The only advantage to keeping data on a separate partition is that IF
you need to do a "bare-metal" install of your OS, all your data is
still safe. The only issue there is you STILL need to reinstall ALL of
your applications.
My preference is to keep an image of the C: drive on the D: drive so
you can simply restore the drive to what was there - 100%. Keep a copy
on a separate drive as well, in case the hard drive itself fails.
Better than an image is a live clone to an external drive, done on a
regular basis. (A bit of a PITA with the new-style GPT drives, which
don't like to be cloned...)
My experience - with a teenager pounding on my PC a couple of hours a day,
five days a week - has been that the main virtue of having the data in
its own partition is that I can (and used to, frequently) do a re-image
without even thinking twice.
Box starts acting funny? Don't even *think* about what the problem might be:
just fire up the restore CD, kick it off, and get a cup of coffee....
Come back and all is well.
My SSD will probably cough up its guts and die as soon as I post this,
but I have never, ever had a System drive failure. I have retired a
few System drives when my monitoring utility started complaining... but
have never had one fail in use.
Separating data from System is huge... and my experience has been that
physical drive failures are the least of the reasons why.
The updating takes a long time, so I just start it as the last thing in the
day and let it run overnight.
Right now, perhaps thanks to Oren's prompting, I am finally creating a
Win7 slipstreamed DVD. The one I have now has SP1 but none of the
subsequent updates. The one I'm creating now will have about 200 updates.
If I am moving tools to a new machine (not adding any "new" software
or "updating" anything) *and* I have all of the drivers for that machine
already downloaded, it takes me about 3 full *days* to build one of
my workstations. (I have three, and the applications on each are different.)
But, at the end of that process (ordeal), the machine is "set for life".
(I don't add stuff to a workstation once it has been built and configured.)
At the end of each calendar year, I reevaluate my tools/equipment
and upgrade (out with the old, in with the new) as I deem appropriate.
I am *really* reluctant to replace/upgrade a workstation in that
process (typically, the machines are faster than my meatware, so why
incur the cost and inconvenience of an upgrade unless there is some
HUGE advantage to doing so?). And I'm only mildly less reluctant to
upgrade a piece of software ("for a NEW set of bugs!").
I chuckle at folks who are always buying the latest and greatest.
Unless they're playing video games, they're just throwing money
(and time!) away...
You sound like you know your stuff...
Question: has the performance of PCs available on a common peasant's budget
progressed to the point where I would notice a difference if I upgraded from
my current GigaByte Z87X-UDH4 / i7-4770K @ 3.5 GHz with
16 GB of RAM, running from an SSD?
Not really. I just spend a lot of time in front of machines actually
"doing real work" (not watching movies or browsing the web, etc. but,
rather, authoring multimedia presentations, designing circuit boards,
writing software, etc.). In each of these cases, *I* am the limiting
factor. A faster machine just means the machine spends more time WAITING
for me to figure out which key I want to press next!
I would seriously doubt it. Unless you're doing gaming, etc.
And, in that case, the video card is more important than the CPU
(video cards have GPUs on them that do most of the graphics work).
I use a 1.8GHz Core2 Duo for our HTPC -- and that probably has the
most demanding "real time" work to do (you don't want a movie to start
skipping, pixelating, etc. while watching).
There are many things that my workstations may spend *hours*
crunching (rendering 3-dimensional models, generating videos
of those models moving in a predefined "scene", compiling
thousands of files for a project I'm working on, etc.). But,
having a machine that is *10* times faster would cut that down
to *an* hour (or two) -- still too long for me to be sitting
there twiddling my thumbs!
So, I start a machine on one of those lengthy tasks and then
find something else to do while it is "busy thinking" (either
using another machine *or* working on that machine WHILE it is crunching).
My current big bottleneck is speed of network fabric (1Gb)
and slow USB2 interfaces (I've not bothered to move to USB3 as it means
replacing a lot of hardware JUST to get faster disk interfaces).
When I started out in this business, I could make exactly *two* changes
to a project in an 8 hour "shift" -- 4 hours to see what the results
of my change would be! The machines were *so* slow...
As a result, I learned how to plan the use of my time around the
capabilities of the machines. As the tasks that I now do take considerably
more machine power than back then, it's just not possible to buy hardware
that is fast enough to ensure "no waiting" (at least not for the things
that *I* do).
So, I rely on the same sorts of skills I learned decades ago and
PLAN how my time will be spent so I'm not idle, waiting for a machine.
When I first started doing 3D CAD drawings on 25MHz 386's -- when THAT
was the "spare no expense" hardware available -- it would take a
machine 24-26 hours to render *one* drawing. If the lights blinked,
your heart sank! (Oh, crap! I wonder if the machine "saw" that??)
Now, I can render that same drawing in a minute or two. But now I
want that drawing to "move"... I want to see the mechanism operate,
see the shadows that are cast as it moves around relative to the
artificial light source, etc.
So, I'm back at the day-long renders :>
But, unlike back then, I don't have to move to another machine in order to
keep working.
One of my granddaughters got a BS in computer graphics from Drexel and
now she makes a decent living doing same.
One of the companies she worked for as a student intern said she
"revolutionized the way they do business" for them - by installing
services on all of the employees' PCs that allowed offloading of
rendering tasks to those PCs - creating a multi-PC rendering farm so
productions could be rendered in minutes/hours instead of days.
Dunno how many PCs it was... but I would guess plenty...
Called a "network of workstations"/loosely-coupled cluster.
The amount of "surplus" computing power "being wasted" is staggering.
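The plumbing for that kind of loosely-coupled cluster can be surprisingly
small. Here's a rough Python sketch of the idea (untested; the port, authkey,
coordinator host name, and the sleep() standing in for the real renderer are
all made-up placeholders): one box serves a shared job queue, and every
otherwise-idle PC runs the worker loop and pulls frames off it.

# Sketch of a loosely-coupled "render farm": one PC acts as the coordinator
# and hands out frame numbers over the LAN; any idle PC runs the worker role
# and pulls jobs from the shared queue.  Host, port, authkey, and the sleep()
# standing in for the real rendering call are placeholders.
import queue
import sys
import time
from multiprocessing.managers import BaseManager

JOB_Q = queue.Queue()      # frames waiting to be rendered
RESULT_Q = queue.Queue()   # finished frames coming back

def _jobs():
    return JOB_Q

def _results():
    return RESULT_Q

class FarmManager(BaseManager):
    """Serves the two queues to any machine that connects."""

FarmManager.register("jobs", callable=_jobs)
FarmManager.register("results", callable=_results)

def run_coordinator(frames):
    mgr = FarmManager(address=("", 50000), authkey=b"renderfarm")
    mgr.start()                              # queue server runs in background
    jobs, results = mgr.jobs(), mgr.results()
    for frame in frames:
        jobs.put(frame)
    finished = sorted(results.get() for _ in frames)   # wait for every frame
    print(f"rendered {len(finished)} frames")
    mgr.shutdown()

def run_worker(coordinator_host):
    mgr = FarmManager(address=(coordinator_host, 50000), authkey=b"renderfarm")
    mgr.connect()
    jobs, results = mgr.jobs(), mgr.results()
    while True:
        frame = jobs.get()                   # blocks until work is available
        time.sleep(0.1)                      # stand-in for the real renderer
        results.put(frame)

if __name__ == "__main__":
    if sys.argv[1:] == ["coordinator"]:
        run_coordinator(range(100))
    else:
        run_worker(sys.argv[1] if sys.argv[1:] else "localhost")

In her case the stand-in would be whatever batch-render call her package
exposes; the rest is just a shared queue that idle PCs drain.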
Ages ago, I recall walking by the stock room with an employer and
he glanced up, sighed and said, "Think of all that power sitting
there *wasted* (bare chips)..."
When I designed the home automation system, I was very aware of this
as a means of cutting cost (at the expense of complexity!) by
leveraging "underutilized" computers that were *needed* in particular
places (in order to interface to particular devices) -- but that spent
99.9999% of their time twiddling their thumbs:
"Why not use that surplus capacity INSTEAD OF adding a big
computer someplace that throws off lots of heat, has noisy fans, etc.?"
So, the design is far more complex. And, in many ways far less *efficient*.
But, if stuff was being "underutilized", then who cares about efficiency?!
Not very likely. Even a 25% improvement in processing speed at this
level is not something that will "jump out at you".
The difference between 1/2 the blink of an eye and 1/4 of the blink of
an eye is not noticeable. (and that's a 100% improvement!)
When you get into really intensive applications, yes, there is a
(barely) noticeable improvement.