Today's computers, with their dual- or quad-core processors, lots of RAM, powerful video cards, high-wattage power supplies, and so on, appear to draw far more wattage and amperage from an outlet than the common Pentium-class computers sold around the year 2000.
I am using such a computer, which was new in 2000. It has 512 MB of RAM (the maximum it can use), came with a 100 W power supply (which died and I replaced with a 300 W unit), and has two fans: one in the power supply and a tiny one on the CPU.
These modern dual/quad-core machines MUST require considerably more wattage to run. Most have 3 or 4 fans and kick out a lot of heat, compared to the slight warmth I feel from my fan.
However, I'm looking for some actual data on how much power is used in this comparison (a 2015/16 computer vs. one from around 2000).
While computers are low-energy users compared to many other appliances, they are often left running around the clock, and that adds up.
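To see why round-the-clock operation adds up, here is a quick back-of-the-envelope sketch in Python. The idle wattages and the $0.12/kWh electricity price are assumptions for illustration only, not measured data; the real comparison depends on actual figures like the ones I'm asking for:

```python
# Back-of-the-envelope annual energy use for a PC left on 24/7.
# Wattages and the electricity price below are assumptions, not measurements.

def annual_energy(watts, hours_per_day=24.0, price_per_kwh=0.12):
    """Return (kWh per year, cost per year) for a constant power draw."""
    kwh_per_year = watts * hours_per_day * 365 / 1000.0
    return kwh_per_year, kwh_per_year * price_per_kwh

examples = [
    ("circa-2000 PC (assumed 60 W idle)", 60),
    ("modern quad-core PC (assumed 100 W idle)", 100),
]
for label, watts in examples:
    kwh, cost = annual_energy(watts)
    print(f"{label}: {kwh:.0f} kWh/year, ${cost:.2f}/year at $0.12/kWh")
```

Even a modest 40 W difference in idle draw works out to roughly 350 kWh per year when the machine never sleeps, which is why measured data matters more than nameplate power-supply ratings.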
One saving grace is that the old CRT monitors used more power than the modern flat screens, which use either LED or fluorescent backlighting.
Yet, even when the monitor is factored into the overall picture, I suspect that modern computers still use considerably more power, judging by the heat they kick out. (Of course, in winter that heat helps warm the house, but in summer it will require more cooling of the home.)