Choosing a new PC

I use the Antec PSU calculator at

formatting link

It will give you a good guide. You don't have to buy an Antec.

Reply to
Mark

Why?

I can't tell you why -- I'm not an electronic engineer -- but I can tell you that the efficiency graphs you see for modern PSUs drop off noticeably as the current drawn drops below about 20%. Typical PSUs are most efficient at around 50% load, and the efficiency drops off slightly as load increases to 100%, and more rapidly as it drops towards zero.

  1. ... OK, so "awful" was perhaps a little strong ...

An 80+ "Bronze" certified PSU (the lowest level of 80+ certification) has to be 86% efficient at 50% but need not be more than 81% efficient at 20% load.

Using Antec's own online PSU calculator thingie to rate a PSU for the spec that Phil posted here, I get a power requirement of 171W (I'd guessed about 150W ... but Antec's figures are generous). Antec say:

| The total PSU Wattage this tool recommends will give a general
| idea of the range of continuously available power (not peak power)
| at which you should be looking.

The sensible thing to do is to pick a PSU for which 171W (or 150W, if you prefer my guess) is around 50% of the rated power -- so for Phil's system that'd be a PSU in the 300-340W range. A more powerful PSU will cost more and waste more power, and so will run hotter, which will likely shorten its lifetime.
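
As a minimal sketch of that rule of thumb (taking Antec's 171W estimate and my 150W guess as the continuous draw, and aiming to land it at roughly half the rated output):

    # Sizing rule of thumb sketched above: choose a rated output that puts
    # the estimated continuous draw at roughly 50% load.
    TARGET_LOAD_FRACTION = 0.5

    def recommended_rating(continuous_draw_w):
        """Rated wattage that places the given draw at ~50% load."""
        return continuous_draw_w / TARGET_LOAD_FRACTION

    for estimate_w in (150, 171):   # my guess, then Antec's figure
        print(f"{estimate_w} W draw -> look for a PSU rated around "
              f"{recommended_rating(estimate_w):.0f} W")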

I'd suggest that (say) a Seasonic 330W PSU would be a much better choice; it's one of the few really well-made PSUs with a sensible power output.

Cheers, Daniel.

Reply to
Daniel James

built a couple of systems for me a few years ago, before I realized that modern PCs are as easy to build as Lego. They have a similar "pick-and-mix" website and deliver the machines fully-built and tested.

They're a bit further up-market -- so pricier -- than lambda-tek seem to be, but they certainly know how to put a system together!

Cheers, Daniel.

Reply to
Daniel James

Must ... resist ... lecture ... mode

Gaaaaaaah!

As someone who has had the 'pleasure' of working with SMPSUs for some time, I can add a bit to that.

The basic principle is that an inductor draws energy from one capacitor then releases it to another. The more time taken to draw the energy compared with the release time, the greater the output power (not necessarily the voltage) will be. Optimum efficiency occurs if the two times are equal - that's the perfect-world situation!
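
One way to see why a longer 'draw' time means more power gets transferred is a toy Python sketch of the energy moved per switching cycle. All the component values here are invented for illustration, and it assumes the inductor is fully discharged each cycle, which real designs don't always do:

    # Toy numbers only: energy stored in the inductor during the 'draw'
    # phase, then released, once per switching cycle. All values assumed.
    V_IN = 325.0     # V, roughly the peak of rectified 230 V mains
    L = 500e-6       # H, assumed inductance
    F_SW = 100e3     # Hz, assumed switching frequency

    def transferred_power(t_on):
        """Average power when the switch conducts for t_on each cycle."""
        i_peak = V_IN * t_on / L                  # current ramps up linearly
        energy_per_cycle = 0.5 * L * i_peak ** 2  # energy parked in the inductor
        return energy_per_cycle * F_SW            # released F_SW times a second

    for t_on_us in (0.5, 1.0, 2.0):
        watts = transferred_power(t_on_us * 1e-6)
        print(f"draw time {t_on_us:.1f} us -> about {watts:.0f} W transferred")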

In reality, as the power increases, resistance losses in the circuit and in all the control devices start to degrade performance, giving you a slow downward slope starting somewhere between 50% and 70% load, depending on the design. The performance of the switching circuits also degrades a lot as the 'draw' pulse gets shorter.

There is also a 'standing' power requirement simply for the thing to work at all. The higher the capability of the PSU, the higher this standing power will be, regardless of how much power is actually being used.
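
Those two effects together give the curve Daniel described. Here's a toy loss model - every number invented purely for illustration, nothing measured - showing the same shape: a fixed standing overhead that dominates at light load, plus resistive-style losses that grow with the square of the output and pull the top end down:

    # Toy loss model: fixed standing power plus I^2*R-style losses.
    # Every number here is invented purely to show the shape of the curve.
    RATED_W = 500.0       # assumed PSU rating
    STANDING_W = 12.0     # assumed fixed overhead just to run the thing
    K_RESISTIVE = 0.10    # assumed coefficient for load-dependent losses

    def efficiency(load_fraction):
        p_out = RATED_W * load_fraction
        losses = STANDING_W + K_RESISTIVE * p_out ** 2 / RATED_W
        return p_out / (p_out + losses)

    for frac in (0.1, 0.2, 0.5, 0.8, 1.0):
        print(f"{frac:>4.0%} load: {efficiency(frac):.1%} efficient")

It peaks near 50% load, droops gently towards 100%, and falls away quickly below 20% - which is also why parking an oversized PSU at a small fraction of its rating costs you a few points of efficiency.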

On top of this there is a problem controlling very narrow 'draw' times for the inductor: you can easily transfer more power than is demanded. Uncorrected, that would make the output go over-voltage. Early supplies actually specified a minimum current so that they wouldn't literally pulse the output - not good for the equipment!

The traditional way to deal with this is to add a shunting circuit that deliberately wastes power when the demand is below a certain figure.

Ahhhh - I feel better now :)

Reply to
Folderol

Well, the MacBook Air's edge is pretty sharp - not tried shaving with it, but...

Darren

Reply to
D.M.Chapman

On 12 Mar 2011, Daniel James stated:

Not so. My current server has a motherboard which, fully populated, supports 384GB -- and it's two years old. (It has 'only' 24GB in it. I feel so short of memory sometimes.)

There are server-class boxes out there with terabytes of RAM now -- although they are hardly consumer-grade.

I've never entirely understood what USB hubs are supposed to bring to a monitor. The ability to build in tinny speakers and SD card readers? (The latter in particular seems deeply arbitrary.)

Reply to
Nix

Extra USB ports, perhaps in a more convenient location.

Reply to
Richard Kettlewell

On 15 Mar 2011, Ron Lowe outgrape:

Agreed. It's particularly useful if you want to run VMs.

Linux has done the 64-bit thing since 1996 or thereabouts (I think the first 64-bit port was the Alpha, followed by SPARC64). It ran on x86-64 before any chips existed ;} and these days the x86-64 port is rather more actively maintained than the -32 (though both are 'top-tier', as it were; the developers pretty much all do their work on 64-bit boxes).

... and has reasonably good support in the open Linux drivers. (The only thing that isn't supported and never will be is hardware video decoding.)

Reply to
Nix

The GNOME monitors panel capitalises the L and lowercases the i, so i can see they're definitely saying Liyama. The manufacturer's domain name is all-lowercase in google results, so i can see they're definitely saying iiyama. I haven't gone as far as copying and pasting the strings into hexdump, i must admit!

Another reason to stick to Samsung.

tom

Reply to
Tom Anderson
[ATI video cards]

But hopefully there will be OpenCL video decoders and processors before long?

Reply to
Tony Houghton

You plug your keyboard and mouse into the monitor, then the monitor into the main case. You put the monitor on your desk, and your case on the floor. Reduces cable clutter. One of the great Apple innovations was ADB, which let you do that back when PC users were dragging vast great serial connectors around their offices.

Not so helpful if you use a wireless keyboard and mouse, of course.

I find the whole idea of SD card readers pretty baffling, to be honest. If a device stores data on an SD card, it almost unfailingly also has a USB port, and it's far easier to plug the device in via that than it is to fiddle about with tiny little bits of plastic.

The one exception i have to that is replacing the card in a mobile phone, when it feels safer to copy the files from the old card to the new card on a computer, with the device sitting inertly switched off.

tom

Reply to
Tom Anderson

I use it for the mobile phone charger. And things that have silly short USB leads, such as cameras.

Reply to
Bob Eager

What's that electric welding sewing machine / punch and die hybrid thing he's got?

tom

Reply to
Tom Anderson

Great Vid. Well worth the time to watch :)

Reply to
Folderol

My camera only does USB1.1 - well worth taking the card out and shoving it in the reader.

At school I use half a dozen different cameras and three or four different proprietary USB leads, which we don't usually leave with the cameras 'cos they get muddled up and abused. Again - much easier to pop the card out and shove it in the reader.

The school's digital photo-frame on the reception counter uses an SD card but it'd be a pain to take the whole thing to a computer to change the photos - so a reader is good there, too.

Reply to
Skipweasel

I think you got it in one - a hybrid. The bloke's clearly very fond of making special tools.

Reply to
Skipweasel

Well I did get the Novatech box and *it* didn't give me a decent display either. At which point some clue must have wormed its way into my brain and I tried plugging the monitor directly into the graphics card instead of going through the KVM box and - waddaya know! - 1600 * 1200! Words cannot describe how stupid I felt.

How does the PC know what sort of monitor it's driving? I hadn't realised, or thought about it, that there must be some way for the graphics card to query the monitor for its vertical and horizontal scan capabilities. I've certainly, at one time, had those hard-wired into my xorg.conf file (and tried it, without success, with the current setup - I guess maybe xorg ignores that now if it thinks it knows better?).
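
(The mechanism, for anyone curious, is EDID: the monitor hands over a block of capability data on the DDC lines of the VGA/DVI cable, and Xorg these days trusts that over anything hand-written in xorg.conf. It also explains the KVM mystery - plenty of cheap KVM switches don't pass the DDC lines through. Here's a rough sketch of peeking at the same data on a Linux box; the paths and byte offsets assume the kernel's standard /sys/class/drm layout and a 128-byte EDID base block.)

    # A rough sketch, assuming a Linux kernel exposing connector EDID blobs
    # under /sys/class/drm and a standard 128-byte EDID base block.
    import glob

    def preferred_mode(edid):
        """Native resolution from the first detailed timing descriptor."""
        d = edid[54:72]                    # descriptor starts at byte 54
        h = d[2] | ((d[4] & 0xF0) << 4)    # horizontal active pixels
        v = d[5] | ((d[7] & 0xF0) << 4)    # vertical active lines
        return h, v

    for path in glob.glob("/sys/class/drm/card*-*/edid"):
        with open(path, "rb") as f:
            edid = f.read()
        if len(edid) >= 128:               # empty file = nothing connected
            h, v = preferred_mode(edid)
            print(f"{path}: monitor reports a preferred mode of {h}x{v}")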

So now I have a couple of half-decent machines: the Novatech one (quad core, 4G, but rather tinny box) and the old Targa (AMD64 single core, 1G, nice box). But you can never have too many PCs ... maybe I will get around to making that media server so SWMBO & others can watch stuff on the TV from it.

Reply to
John Stumbles

*uk*.comp.os.linux, surely?
Reply to
John Stumbles

Oops, sorry! I thought I had remembered to qualify that remark to exclude server boards.

With what size RAM DIMMs? Can you actually achieve that capacity, today, at any price? If you have 24 slots (which is not impossible), 24x16GB DIMMs would do it ... and only cost £10k-£15k.

If I change "thousands" to "hundreds" my point still stands -- 256TB is enough addressing space for today's PCs.

Indeed they are not.

I suspect boxes like that will be running several CPUs in separate address spaces, and so aren't strictly relevant to this discussion ... but I'd be interested to see the spec of such a machine.

I think the idea of putting USB hubs into monitors comes from a time when typical PCs had limited USB ports but were supplied with USB keyboard and mouse ... something that supplied extra USB capacity and put it on the desktop where the KB+M live was a selling point.

Today it's not a big deal ... but if you have a need for a USB hub at all it saves a bit of space to have it built into the monitor rather than having yet another little box with a nest of wires poking out of it to clutter up the desk. Same deal for cardreaders (I rather wish my Dell U2410 had a CF card slot, like the old Ultrasharp 2408 had ... maybe it's time to upgrade my DSLR?)

Cheers, Daniel.

Reply to
Daniel James

No, no, we're all ears!

.. and I feel enlightened. Thanks.

Cheers, Daniel.

Reply to
Daniel James
