Here's my setup: a 12-year-old forced hot water oil boiler with a gas conversion burner, and connected to that is a 40-gallon indirect-fired water heater. We've been heating our house with a wood stove for the last two years, so all the boiler does is heat our domestic hot water and serve as a backup for the wood stove.
When the plumber installed the water heater, he turned the low-limit setting on the boiler all the way down, so the boiler only kicks on when the water in it drops below about 65 F. The high limit is 160 F. The water heater is set at 127 F, which gives us 120 F water at the faucet. The tank's standby heat loss is about 1/2 degree per hour.
The water heater calls for heat when its temperature drops 10 degrees below the setpoint, so the boiler can sit for hours without a call and cool back down to room temperature. That means that for our hot water, the boiler first has to heat its own 6 to 8 gallons of "boiler" water up to 160 F in order to heat the water in the tank. That seems like a lot of energy going into warming up the boiler every time the tank calls for heat.
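Here's my back-of-the-envelope math on that warm-up cost. This is just the water; the boiler's cast-iron mass soaks up heat too, so the real number would be higher. The 7 gallons and the 65 F starting temperature are my assumptions:

# Rough cost of one boiler warm-up (water only, ignoring the iron).
LB_PER_GALLON = 8.34           # weight of water, lb/gal
BTU_PER_LB_PER_DEG_F = 1.0     # specific heat of water

boiler_gallons = 7             # assumed midpoint of my 6-8 gallon range
start_f, high_limit_f = 65, 160

warmup_btu = (boiler_gallons * LB_PER_GALLON * BTU_PER_LB_PER_DEG_F
              * (high_limit_f - start_f))
print(f"~{warmup_btu:,.0f} BTU per cold start")   # about 5,500 BTU

So every cold start costs something like 5,500 BTU before any heat even reaches the tank.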
To remedy this, I installed a timer that turns on the water heater once a day, for an hour. The 40 gallons of hot water gets us through the day (so far, anyway; I only installed the timer last weekend). My friend says this won't save any energy, because reheating the tank from a lower temperature will take more energy than topping it off throughout the day, when the temperature differential is smaller.
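I tried to sanity-check his claim. As far as I know, a pound of water takes the same 1 BTU per degree whether the tank is hot or cold, so the total heat delivered shouldn't depend on whether it goes in all at once or in dribbles; what should differ is the standby loss, which I'd expect to scale with how far the tank sits above room temperature. Here's my rough comparison; the 65 F room temperature and the ~121 F coasting average are guesses:

# Standby loss: tank held at setpoint all day vs. coasting between
# once-a-day reheats. Loss rate assumed proportional to (tank - room),
# calibrated from my observed 1/2 degree per hour at the 127 F setpoint.
TANK_GALLONS = 40
LB_PER_GALLON = 8.34
SETPOINT_F, ROOM_F = 127, 65                     # room temp is a guess

k = 0.5 / (SETPOINT_F - ROOM_F)                  # deg lost/hr per deg above room

def standby_btu_per_day(avg_tank_f):
    return TANK_GALLONS * LB_PER_GALLON * k * (avg_tank_f - ROOM_F) * 24

print(f"held at 127 F all day: {standby_btu_per_day(127):,.0f} BTU")   # ~4,000
print(f"coasting, ~121 F avg:  {standby_btu_per_day(121):,.0f} BTU")   # ~3,600

If that's right, the timer scheme loses a little less to standby and also trades several boiler cold starts a day for one. But I could easily be missing something.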
A drawback is that the water heater's controls will be power-cycled daily instead of being left on constantly, which may shorten their life somewhat. It's got a solid-state control unit with an LED control panel.
I could put an hour meter on the burner to measure how long the boiler actually fires under each scheme and find out which way is more efficient, but before I shell out for that, I thought I'd see if anyone had any ideas. Anyone have a guess as to which way is better? Thanks!
Jay - jayroperman a t h o t mail dit c om