Thanks for all the advice about the SSD. I am starting a new thread as the old one is getting quite lengthy.
I now have my SSD (Samsung 870 Evo) and mounting bracket. I plan to leave the OS and programs on the existing SSD (also Samsung) and use the new one for data.
Would the following method be sound?
Copy entire contents of D drive to external drive (there is plenty of space)
Remove existing mechanical drive and install SSD using same cables
Format SSD using Windows 10 (as NTFS)
Copy entire contents back from external drive
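The copy steps above can be sketched in Python; this is a minimal sketch, and the drive letters are placeholders, not the poster's actual paths:

```python
import shutil
from pathlib import Path

def copy_tree(src: str, dst: str) -> int:
    """Copy every file under src into dst, preserving the directory
    layout and file timestamps. Returns the number of files copied."""
    src_path, dst_path = Path(src), Path(dst)
    count = 0
    for item in src_path.rglob("*"):
        target = dst_path / item.relative_to(src_path)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)  # copy2 keeps modification times
            count += 1
    return count

# Placeholder paths -- adjust to your own drive letters:
# copy_tree("D:/", "E:/backup")   # step 1: data drive -> external drive
# copy_tree("E:/backup", "D:/")   # step 4: external drive -> new SSD
```

In practice Windows' own copy, or robocopy, does the same job; the point is just that the round trip is two straightforward tree copies.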
I have no interest in retaining the existing hard drive as I prefer external back-up.
If you have a SATA CD/DVD drive installed, you can temporarily disconnect it and use its cables as a temporary connection for the new SSD, bypassing the need for the intermediate copy to an external drive. Once the SSD has been formatted and populated, you can use it to replace the hard drive and reconnect the CD/DVD drive.
You might want to review the relative performance of new and old drives.
You will get optimum performance with the OS and its swap file on the fastest drive, connected to the fastest SATA connection your PC has (that may need a new SATA-3-capable cable).
Yes, but only if you can verify that the copy is a faithful reproduction. Trust nothing unless you can verify it.
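One way to verify the copy is a faithful reproduction is to hash every file on both sides and compare. A minimal sketch (the function names here are mine, not from any particular tool):

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large files don't fill RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def tree_digest(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    root_path = Path(root)
    return {
        str(f.relative_to(root_path)): file_sha256(f)
        for f in sorted(root_path.rglob("*"))
        if f.is_file()
    }

def verify_copy(original: str, copy: str) -> bool:
    """True only if both trees contain the same files with identical contents."""
    return tree_digest(original) == tree_digest(copy)
```

Run it against the external drive before wiping the original, and again after copying back to the new SSD.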
That may not be good enough if the old drive wasn't SATA-3. It will work OK, but it will be operating at half or a quarter of its rated speed. They are backwards compatible, but there's no point in buying a very fast drive and running it on a B road when motorways are available.
Personally I would always put the OS onto the fastest drive and only the most frequently accessed or awkward bulk data onto an SSD. My endgame tablebases for chess are on a fast SSD since they require insane numbers of short records to be accessed quickly and spinning rust can't hack it. The near-zero seek time makes SSDs incredibly good for this task.
Bulk data that you hardly ever access is a waste on a super fast SSD. YMMV
FWIW my spinning rust disk is still online inside my PC but only ever gets accessed now for seldom used legacy data that is still on it and no longer on the much faster (and smaller) SSD.
When you have finished you should run a speed benchmark to prove that the thing is properly installed and operating at full speed. Just don't do it too often - speed tests contribute to wear on the drive.
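A crude sequential-read check can be sketched in a few lines. This is only a sanity check, not a real benchmark: the OS file cache can inflate the result badly, which is why dedicated benchmark tools bypass the cache.

```python
import os
import time

def sequential_read_mb_s(path: str, block: int = 1 << 20) -> float:
    """Time a sequential read of `path` in 1 MiB blocks; return MB/s.
    Beware: a file already in the OS cache will read unrealistically fast."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block):
            pass
    elapsed = time.perf_counter() - start
    return size / elapsed / 1e6
```

If a SATA-3 SSD reports well under ~500 MB/s on a large uncached file, suspect the cable, the port, or the link having negotiated down a generation.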
Note that there's no such thing as specific SATA1/SATA2/SATA3 cables, they're all just SATA cables, there are of course decent SATA cables and shitty SATA cables ...
Your procedure causes minimal disturbance to the dust inside the PC :-)
Approved.
*******
Seriously though, if a machine has only USB2 on the back, I try to encourage people to buy an add-on USB3 card. The rates are a bit better that way, and people are more likely to make backups to their USB3 enclosure if the backups don't take too long.
But you're not expecting miracles from this stuff.
My 500MB/sec SSD drive does about 380MB/sec.
My 500MB/sec USB3 port on the back does about 200MB/sec with the enclosure. That's the median performance level. There are enclosure chips that go a bit faster, but it takes additional research when buying an enclosure to get a good chip.
When I see 200MB/sec rates, well, at least it's pretty close to the HDD rate. USB2 is limited to 30-35MB/sec by comparison.
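Those USB2 figures follow from the signalling rate: 480 Mbit/s is 60 MB/s of raw bandwidth, and protocol overhead accounts for the gap down to the 30-35 MB/s seen in practice. A quick back-of-the-envelope check:

```python
# USB 2.0 signals at 480 Mbit/s. Dividing by 8 gives the raw byte
# rate; bulk-transfer protocol overhead then eats roughly 40% of it,
# which is why real-world copies top out around 30-35 MB/s.
usb2_signal_mbit = 480
usb2_raw_mb_s = usb2_signal_mbit / 8   # 60 MB/s raw
print(f"USB2 raw bandwidth: {usb2_raw_mb_s:.0f} MB/s")
```

USB3 (5 Gbit/s signalling) raises the raw figure roughly tenfold, which is where the ~500 MB/s port rating comes from.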
The USB3 card can't go faster than the PCI Express slot revision it is plugged into. That's one reason the card may not live up to its full potential. You might see 180MB/sec if the motherboard sucks.
I've also seen PCI Express Rev2 x1 lane slots switch down to Rev1.1 at boot time, which is very annoying, and a second reboot may result in the usage of Rev2. You only notice stuff like that if you do a lot of benchmarking.
If you're running WinXP, then you want a USB3 card with a NEC chip on it. The NEC product is one of the few with a WinXP driver. NEC was in the process of changing its name to Renesas the day the USB3 chip came out.
When I fitted an SSD I backed up the HDD then cloned the HDD to the SSD using Macrium Reflect. Shut down, off at the mains, removed HDD and put in SSD (SATA-type, so in the bay easily), booted, 35s later there was everything as it had been. Made sure that there was no defrag on and checked it was OK with Samsung Magician and away.
I used SATA cables to connect FPGAs, where I got bit error stats. It was completely pot luck as to what was good quality - there was no correlation between price/brand and bit error rate. More of an issue was fit in the sockets - even a slight bend can make a big difference. You don't notice this in a PC because the error detection/retransmission on SATA is good enough to compensate.
Although an honourable mention should go to the Maplin cable that had sufficient crosstalk between the pairs that the receiver could detect a signal even though the other end was dangling in the breeze. That type did not get used in the final build.
I came across an incompatibility where there were occasional (a few per TByte of data transferred) SATA errors with a 1TByte Samsung 860 Pro in a Microserver Gen 7. Using 850 Pros there were never any errors. The errors caused the SATA link speed to halve, after which the errors would stop. The same drive was fine in a variety of other machines. Updating the drive firmware didn't help. John
Thanks. Given it is data that is to be transferred, I think I would rather use Windows to sort out any fragmentation and make me aware of any corrupted files.
If there are errors, then the GNU version of ddrescue (formatting link) can be very effective at getting as much data off the disc as possible in a way that minimises the risk of completely trashing the drive with repeated re-reading of bad blocks.
Letting Windows anywhere near a damaged disc is the last thing you want to do. It will try and fix problems without asking, often making them worse in the process. I made that mistake enough times that I did finally learn. John
Totally correct. What your PC sees is no longer anything but a *model* of a hard drive. The actual blocks can be mapped in any way the SSD pleases, and the mapping will change with time as the wear levelling kicks in.
You don't want Windows (or anything else) to defragment the data on an SSD.
The purpose of defragmenting is to bring different parts of the same file together, so that the disk heads don't have to spend time moving between cylinders and waiting for the next sector to come around. It also means that gaps are not split all over the place and files don't therefore end up fragmented when they are saved.
An SSD has effectively 0 seek time, therefore there is no point in defragmenting, as jumping from one sector to another elsewhere does not involve any head movement.
Defragmenting an SSD simply creates extra chances to introduce errors and causes extra wear to the memory cells, decreasing lifetime.
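The seek-time argument can be made concrete with rough, assumed figures (about 9 ms average seek for an HDD, about 0.1 ms random access for an SSD; your drives will differ):

```python
# Rough model with assumed figures: reading a file split into 1,000
# fragments costs one seek per fragment. On spinning rust that seek
# time dominates; on an SSD it is negligible.
fragments = 1_000
hdd_seek_s = 9e-3     # ~9 ms average seek (assumed HDD figure)
ssd_seek_s = 0.1e-3   # ~0.1 ms random access (assumed SSD figure)

print(f"HDD seek overhead: {fragments * hdd_seek_s:.1f} s")
print(f"SSD seek overhead: {fragments * ssd_seek_s:.2f} s")
```

Roughly nine seconds of pure head movement on the HDD versus a tenth of a second on the SSD, which is why defragmenting helps the former and only wears out the latter.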
HomeOwnersHub website is not affiliated with any of the manufacturers or service providers discussed here.
All logos and trade names are the property of their respective owners.