Set-up: combi/condensing boiler, with a solar-heated water supply to the boiler inlet averaging 35 degrees C (it can be 20 degrees on an overcast winter's day and can quickly reach 65 degrees on a sunny morning).
Is there any way to estimate how much more gas is consumed raising the water from 35 degrees to 75 than raising it to just 55 degrees?
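For context, here's the back-of-envelope sketch I've been using (Python); it assumes water's specific heat is about 4186 J/(kg·K) and that boiler efficiency is the same at both set points, which if anything flatters the 75-degree case, since a condensing boiler condenses less at higher flow temperatures:

```python
# Back-of-envelope: energy to heat water, Q = m * c * dT.
# Assumes 1 litre of water = 1 kg and equal boiler efficiency
# at both set points (a simplification on my part).

C_WATER = 4186  # specific heat of water, J/(kg*K)

def heat_energy_kj(litres, t_in, t_out):
    """Energy in kJ to raise `litres` of water from t_in to t_out (deg C)."""
    return litres * C_WATER * (t_out - t_in) / 1000

print(heat_energy_kj(100, 35, 75))  # ~16,744 kJ per 100 litres to 75 degrees
print(heat_energy_kj(100, 35, 55))  # ~8,372 kJ per 100 litres to 55 degrees
print(heat_energy_kj(100, 35, 75) / heat_energy_kj(100, 35, 55))  # 2.0
```

In other words, since the temperature rise is 40 K rather than 20 K, heating the same pre-warmed water to 75 degrees takes roughly twice the energy of heating it to 55. Is that the right way to think about it?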
The reason I'm asking is that our condensing boiler always heats the water to 75 degrees, despite its temperature control being set to 55 degrees or lower. It's not supposed to kick in at all if the solar-heated feeder tank has already reached 55 degrees, but it does.
The explanation we've been given is that the flow rate (20 litres per minute at the cold water tap) drops to 6.5 litres per minute at the hot water tap, i.e. after the water has passed through the solar-heated hot water tank. The engineer says this means the water passes through the boiler too slowly and so gets too hot. I can't quite believe this: surely a hot tap that isn't fully open would have much the same effect, i.e. reduced flow?
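To test whether the engineer's explanation is even plausible, I tried the steady-state heat balance, ΔT = P / (ṁ·c). The 7 kW minimum burner output below is a hypothetical figure of mine, not our boiler's actual spec:

```python
C_WATER = 4186  # specific heat of water, J/(kg*K)

def temp_rise_k(power_kw, litres_per_min):
    """Steady-state temperature rise (K) across the heat exchanger."""
    kg_per_s = litres_per_min / 60          # 1 litre of water ~ 1 kg
    return power_kw * 1000 / (kg_per_s * C_WATER)

# At a hypothetical 7 kW minimum modulation:
print(temp_rise_k(7, 6.5))   # ~15.4 K rise at the reduced hot-tap flow
print(temp_rise_k(7, 20.0))  # ~5.0 K rise at the full cold-tap flow
```

So low flow does give a bigger temperature rise for the same minimum burner output, but by the same maths a partially-open tap should overshoot too, which is exactly my point.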
As far as I can tell, whatever environmental and cost-saving advantages there may be in this system are being offset by what I assume is a serious amount of gas wasted overheating the water.