Wattage of Today's Computers vs. Older Ones

Today's computers, with their dual- or quad-core processors, lots of RAM, dedicated video cards, high-wattage power supplies and so on, appear to use a lot more wattage and amps from an outlet compared to the common Pentium-class computers sold around the year 2000.

I am using such a computer, which was new in 2000. It has 512 MB of RAM (the maximum it can use), came with a 100W power supply (which died and I replaced with a 300W one), and it has 2 fans: one in the power supply, and a tiny one on the CPU.

These modern dual/quad core machines MUST require considerably more wattage to run. Most have 3 or 4 fans and kick out a lot of heat compared to the slight warmth I feel from my fan.

However, I'm looking for some actual facts about how much power is used in this comparison (a 2015/16 computer vs. one from around 2000). I'd like to find some actual data.

While computers are low energy users compared to many other appliances, they are often left running around the clock, and that adds up.
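To put a number on how it "adds up," here is a quick back-of-the-envelope calculation. The wattage and electricity rate below are illustrative assumptions, not figures from this thread:

```python
# Rough annual cost of leaving a PC running around the clock.
# The 13 cents/kWh rate and the wattages are assumed for illustration.
def annual_cost(watts, cents_per_kwh=13.0, hours_per_day=24):
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * cents_per_kwh / 100  # dollars per year

print(f"100 W PC, 24/7: ${annual_cost(100):.2f}/yr")  # $113.88/yr
print(f" 60 W PC, 24/7: ${annual_cost(60):.2f}/yr")   # $68.33/yr
```

Even a modest idle draw, left on all year, costs real money, which is why the idle figures quoted later in the thread matter more than the power supply's rating.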

One saving grace is that the old CRT monitors used more power than the modern flat screens, which use either LED or fluorescent backlighting.

Yet, even when the monitor is factored into the overall picture, I suspect that modern computers still use considerably more power, just based on the heat they kick out. (Of course in winter that heat adds some heat to the house, but in summer it will require more cooling of the home.)

Reply to
Paintedcow

It's exactly the opposite of what one would think, but today's machines are often considerably more power efficient.

A few years ago I built a quad-core machine for my wife and the mobo power draw was something like 25 watts.

Reply to
philo

philo wrote: ...

the higher end processors get up to around double or triple that, but i think the most power used these days is in the graphics boards that some people want for doing gaming and number crunching or bitcoin mining.

so if you really want to have a very efficient computer go with on-board graphics and don't expect it to scream, but i think it would still get the job done for most people doing internet, e-mail, watching a dvd/blu-ray, listening to some music, etc.

as i'm looking to eventually replace this beast i'll be doing much the same evaluation. i'd like it to be as energy efficient as possible and also quiet would be nice.

songbird

Reply to
songbird

That does come as a surprise. I thought my 300W power supply would handle most anything. I probably killed the original PS by plugging in too many hard drives and other add-ons.

But I guess I was wrong on that note, because I recently read that most computers now have 500 or 600 watt or larger power supplies. That alone would tell me that the new computers suck lots of power. So if your mobo is only using 25W and hard drives and similar items are using about the same as they always did, why are power supplies so much larger now?

I have not actually opened or used any computer newer than maybe 2010 or so, but I had to help someone fix a Dell from around 2010 (dual core), and that thing could have been used as an electric heater. The CPU alone was hot enough to fry an egg on it, because one of the 4 fans died.

Maybe they have become more efficient since 2010, because that beast must have sucked about 500W just to play the simple Solitaire that came with Win XP (it was running XP Home), because that's about the only thing that person did with their computer. Yet it was pouring heat out of the case like a 1500-watt space heater (and, amazingly, was six or more times slower than my year-2000 Pentium made for Windows 2K).

One of the main reasons I don't want to upgrade (besides not liking bloated operating systems) is that my electric bill is already very high, and I go on and off the computer daily between other work I do, so it doesn't make much sense to turn it on and off 10 or more times each day. So I just leave it on (except the monitor, which I shut off when I'm not using the computer).

Reply to
Paintedcow

My i7-4770 puts out only about 70 watts. The entire system wattage is about 450. You can use this to get watts:
formatting link
There are others.
Reply to
Vic Smith

"songbird" mentioned that a good video card can take a lot of power and that's true.

Most gamers would want the best they could afford.

I save money there... as the only game I play is a 25-year-old version of Tetris.

There is no harm in going bigger than what you need, though; if I have one on hand I'll use a 500W supply, but for general use 300W should be fine.

Power supplies typically do not burn out from being overloaded, it's usually a surge or that a component just plain fails.

Though I do a lot of computer repair work and a dead PSU is pretty common...and sometimes I'll even see a PSU and mobo go at once (eMachines) ...last year something kind of scary happened to one of my own machines which was fortunately just a spare and had been backed up in several places.

It had a good name brand 500w PSU and was on an industrial grade UPS but it just plain failed upon being turned on...it took out two hard drives! There is supposed to be built-in protection for that!

My electric bill here is slightly on the high side in the winter because my wife has a 2000-watt baseboard heater in her studio. Because I keep the house cooler than she likes, I don't mess with her own private space. Even though there are often two or three computers on at a time, they do not add a fortune to the bill... but I did give away quite a few HUGE servers because I knew they would be major power hogs.

Anyway I have been using those inexpensive mobo/cpu combos and they work well and take very little power. My wife's machine has 32 gigs of RAM and she can easily run several "heavy" apps at once with no bogging down.

Reply to
philo

Also, it may be surprising, but a PC will draw more power when it is thinking hard.

Hook up a current meter or Kill A Watt etc. to the PC, then watch it. Grab the top of a window and slide it around the screen and watch the current.

Mark

Reply to
makolber

The wattage rating of the power supply does not indicate the actual power usage of the computer. Just like a 200A electrical panel doesn't mean you're using all 200 amps, or a 200 watt stereo doesn't mean you're cranking it to full power. That's just how much power it is capable of producing.

Newer computers are actually a lot more energy efficient than the old ones, despite the increase in speed.

My old computer pumped out a lot of heat, enough that it kept my feet warm under my desk and the heater in my office rarely needed to run. My new computer puts off so little heat I can't even feel it.

I have my computer on a UPS so I can monitor my exact power usage. I have an i7-4790K, 16GB RAM, 2 SSDs, 2 hard drives, an external hard drive, cable modem, router, and an older LCD monitor connected to it. Under normal use the whole system averages around 90 watts. It will jump up to 150 watts or so when processing video, but it drops to about 60 watts when I turn off the monitor and the hard drives power down.

For what it's worth, I have a 650 watt Antec power supply, so you can see my actual 90 watt usage is nowhere close to the power supply rating.

My previous computer (based on an i5-2500K) averaged around 120 watts under normal use. I know that was an improvement over my old Pentium 4 computer, but I don't remember how much it used.

Yep, I leave mine on 24/7 since it controls lighting in our house, and records TV shows. At 60 watts overnight that's about the same as leaving a light bulb on.

These days graphics cards are the one of the biggest power users, especially high end cards used by gamers. I don't play games so I use a fanless Asus GTX750-DCSL-2GD5 video card. It uses very little power and is absolutely silent.

I also use Gelid FN-PX12-15 120mm fans for my CPU and case fans, and they're virtually silent under normal use.

Anthony Watson

formatting link
formatting link

Reply to
HerHusband

There are multiple factors at work here. While the CPUs have grown far more complex and transistor count has grown exponentially, the feature size has continued to shrink, which is what makes that possible. So, more complexity would mean more power, but smaller transistors take less power. That same driving force has lowered the voltage they operate at from 5V to 1V, which obviously is a big factor. Overall, I think you'd find that today's similar-purpose CPUs probably take the same or less power to run than their predecessors from 25 years ago. At the same time, CPUs for special purposes, e.g. notebooks, have used technological progress to trade off some compute power for much lower power consumption.
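The voltage point can be made concrete with the standard CMOS dynamic-power relation, P ≈ C·V²·f (switching power scales with capacitance, voltage squared, and frequency). The capacitance and frequency values below are purely illustrative, not figures for any specific CPU:

```python
# Dynamic (switching) power in CMOS scales roughly as P = C * V^2 * f.
# The capacitance and frequency here are illustrative placeholders.
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    return capacitance_f * voltage_v ** 2 * freq_hz  # watts

old_part = dynamic_power(1e-9, 5.0, 100e6)  # 5 V logic
new_part = dynamic_power(1e-9, 1.0, 100e6)  # 1 V logic, same C and f
print(old_part / new_part)  # 25.0
```

Because voltage enters squared, the 5V-to-1V drop alone cuts switching power by a factor of 25, all else being equal, which is why the voltage reduction is "obviously a big factor."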

I just checked the power supply on my 6 year old HP I7 based PC and it's 460W. That's about the size that power supplies have been all along. But I would think it's also possible that while they need that as a theoretical peak, the I7 isn't using 4 cores and running anywhere near max power as I sit here and browse the internet or do similar. I'd bet that today's PC typically runs at lower power actually being used than an old 386 system. Both probably had about the same max size power supply though. PC and chip manufacturers have been working to put in energy saving features, make them energy star compatible, for decades now.

To save power, do you have the power settings set correctly in control panel? I have mine set to put the display to sleep and the CPU to sleep after 15 mins of no activity. It wakes up in about 2 secs, by touching any key. That should save you a reasonable amount of power, with no real downside.

Reply to
trader_4

The tower I use came with about a 180-watt supply. When it went, I put in a 200. And last week that went, again. The computer store near me, Staples, had 350 watts as the smallest size.

Power supplies seem to last 5 to 7 years, for me. I'm okay with that.

Reply to
Stormin Mormon

Most PC power supplies are rated a lot higher than the typical load they handle. I don't remember the exact numbers, but when I was checking the components in my entertainment center, a Dish 722 DVR/receiver used more power than a Compaq 600 P4 3.0 GHz machine playing a video. (The TV is the "monitor" so that was not an issue.) They were both around 100 watts though. You can figure this all out with one of those Kill A Watt devices, and they are getting pretty cheap these days.

formatting link

Reply to
gfretwell

I don't have a watt meter, so if I want to check power, I run the device off my UPS and simply measure battery current and multiply by its voltage, then figure the device is using 90% of that.
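That estimate is easy to sketch in code. The battery readings below are made-up example numbers, and the 90% factor is philo's rule of thumb (roughly allowing for inverter losses), not a measured efficiency:

```python
# philo's UPS trick: battery current x battery voltage, then take 90%
# of that as the device's estimated draw. Example readings are assumed.
def estimated_load_watts(battery_amps, battery_volts, efficiency=0.9):
    return battery_amps * battery_volts * efficiency

# e.g. 8 A measured on a 12 V battery while running on UPS power:
print(estimated_load_watts(8.0, 12.0))  # 86.4
```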

Reply to
philo

+42

There has been a steady march towards smaller "device geometries", i.e., the "fineness of detail" INSIDE the various "chips"; smaller tends to mean faster and also REQUIRES lower operating VOLTAGES. As a result, some of the instantaneous CURRENTS drawn by the CPU, etc. are outrageous. That's one of the reasons you see "bad capacitors" causing the premature "forced retirement" of many otherwise "good" computers.

Put 8 or 10 of them in a room and you'd be surprised how warm it can get! :>

In my case, I tend to have lots of spindles. Often 15K (RPM) drives so the drives themselves draw ~20W. Video cards tend to be the new power hogs, though. Esp folks who are into gaming.

Each of my machines supports at least two monitors -- with my primary workstations supporting *4* (though I only use 3). So, "data presentation" tends to consume more than "data processing".

I have one (headless) box that runs 24/7/365. It provides key/core services to the rest of the machines in the house (DNS, TFTP, NTP, RDBMS, font server, etc.). But I've been constantly pushing it to lower power implementations. I currently run it on a Dell FX160. As it's not even TRYING to update a video display (and there's none attached!), it now runs with a 5200 RPM laptop drive (instead of a hotter 3.5" drive), no expansion cards, etc. It's comfortable at about 9W (1.6GHz). As a result, it doesn't even have a case fan (the power supply is the size of a fat cigar).

I've debated replacing it with a Duo2 but really can't imagine what the other core would *do*! (serving up RDBMS queries is the only truly taxing service that it provides -- and those are infrequent!)

Yeah, I have some graphics cards that are HUGE power hogs. In my case, I want multiple heads, not a fast BLT'er. Often, that pushes me to a more power-hungry card than I would like.

Reply to
Don Y

Power in any digital electronic circuit is usually driven by how often things "change state" ("change their mind"). Each transition (think 1's and 0's) means energy has to be expended to convert the 1 to a 0 (or vice versa).

For example, counting from 1 to 100 uses ten times as much as counting from 1 to 10 (using a Gray code -- a representation in which only one bit changes with each change of value; "binary" allows many bits to change at the same time. I.e., counting from 7 to 8 -- four bit changes -- uses more than counting from 8 to 9 -- one bit change).
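The bit-flip claim is easy to verify in code. The helper names `to_gray` and `bit_flips` below are mine, not from the post:

```python
# Compare how many bits flip per increment: plain binary vs. Gray code.
def to_gray(n):
    """Convert a binary integer to its reflected Gray code."""
    return n ^ (n >> 1)

def bit_flips(a, b):
    """Number of bit positions that differ between a and b."""
    return bin(a ^ b).count("1")

# Binary: 7 (0111) -> 8 (1000) flips all four bits.
print(bit_flips(7, 8))                    # 4
# Gray code: the same step flips exactly one bit.
print(bit_flips(to_gray(7), to_gray(8)))  # 1
```

In a Gray code every increment flips exactly one bit, which is why it serves as the clean baseline in the energy comparison above.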

Or run CD \ followed by DIR *.* /S to exercise the disk as well.

Reply to
Don Y

Don't I remember you're an electrician, or a big battery guy? No clamp-on ammeter?

Reply to
Stormin Mormon

Indeed. I have a 4yo Zotec box with a 4-core AMD cpu running four virtual machines (two web servers, a DNS server and a mail exchanger).

Draws 11 watts at idle. 15 watts under load. Measured at the wall.

A considerable amount of engineering goes into designing the processors to be more energy efficient. Most large datacenters are constrained in three dimensions - space, power and cooling. To address space, they want denser configurations (up to 100 servers per rack), but then they run into cooling and power issues.

One of the more difficult areas to address in processor design is the leakage current (i.e. power that is dissipated as heat even when the processor is idle). As processor speeds increase and feature geometry decreases, leakage current rises.

Processors designed for battery sources (in phones, etc) use slower clockspeeds and aggressive voltage scaling, frequency scaling and power gating to reduce the leakage current during idle periods.

Reply to
Scott Lurndal

The key is the TDP for the CPU itself. TDPs on Intel CPUs range from circa 20 watts (for low-end desktop/embedded processors) to circa 200 watts (for the high-core-count E-series server processors).

TDP is "Thermal Design Power".

formatting link

TDP's for various Intel processors are here:

formatting link

Reply to
Scott Lurndal

It's not just the processor *chips* (ages ago, "CPU" meant a chip; now it's used equally to refer to "the box that has the computer in it"). There have been lots of savings throughout the design of most machines.

E.g., my first PC dissipated ~1W in the keyboard! Each "boxer fan" was another watt (don't forget the one in the power supply). Disks tend to have bigger caches, narrower interfaces and higher transfer rates -- so they can spin slower (less frictional losses). Everything tries to "idle" -- instead of "doing nothing VERY FAST". etc.

Reply to
Don Y

Over the years owning, fixing, and selling, I've had 3 with bad power supplies: an HP with a 100W supply that just quit, one Dell hit by lightning (still putting out power but quirky), and I don't remember the other one at the moment. My own PCs have never had one fail in 18 years... only ones I repaired.

Reply to
bob_villain

New ones have better, higher-efficiency power supplies. Electronic components are getting smaller and smaller, increasing circuit density. So of course new ones use considerably less energy. It also depends on product quality.

Reply to
Tony Hwang
