Can anyone please try and explain, in percentage terms or however works best, the difference in power consumed when a T.V. is left on standby overnight compared to when it is on and being viewed?
There's no single answer, because different machines have different power consumption on standby. Some older ones used a lot of power, up to 20 watts. Modern ones will use, say, 5-10 watts, about the same as a low-energy light bulb. Some get down to 1-4 watts.
The actual power used when the TV is on varies hugely too. A 40" plasma will use, say, 300 watts.
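To put those figures in the percentage terms the question asked for, here's a rough back-of-envelope sketch in Python. The wattages are just the illustrative numbers from above (5 W standby, 300 W on) and 8 hours overnight is an assumption; plug in your own set's figures.

```python
# Rough comparison of overnight standby vs. the set being on,
# using illustrative figures from the replies above.

standby_watts = 5.0      # modern set on standby (could be anywhere from 1 to 20 W)
on_watts = 300.0         # e.g. a 40" plasma while being viewed
hours_overnight = 8.0    # assumed length of "overnight"

standby_kwh = standby_watts * hours_overnight / 1000.0
on_kwh = on_watts * hours_overnight / 1000.0

print(f"Standby overnight: {standby_kwh:.2f} kWh")
print(f"On overnight:      {on_kwh:.2f} kWh")
print(f"Standby draws {100 * standby_watts / on_watts:.1f}% of the on-power")
```

With those numbers, standby overnight is about 0.04 kWh against 2.4 kWh if the set were on, i.e. standby is under 2% of the on-power.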
Thanks jake, I had an idea it was something like that; convincing the wife is another thing. She thinks we are going to end up in the poorhouse because, whereas we used to turn everything off at the wall every night, I now think it's better to leave the modem and set-top box "lit" and turn off the T.V. and computer. It can't be good for a set-top box or modem to tell them every night, "Nah, we don't want it," and then a few hours later say, "Oh yes, we do really."
Some TVs have a standby power of only 1 watt, which is around 1% of the typical on-power of a TV, although actual figures vary widely. A continuous standby consumption of 1 watt will cost about 7p per 1000 hours, since 1 watt running for 1000 hours is exactly 1 kilowatt-hour (assuming 1 kilowatt-hour of electrical energy costs 7p).
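The same arithmetic as a quick sketch, assuming the 1 W standby load and 7p per kWh quoted above (swap in your own tariff):

```python
# Cost of leaving a 1 W standby load on continuously,
# at an assumed tariff of 7p per kilowatt-hour.

standby_watts = 1.0
pence_per_kwh = 7.0

kwh_per_1000_hours = standby_watts * 1000 / 1000.0        # 1 W for 1000 h = 1 kWh
cost_per_1000_hours = kwh_per_1000_hours * pence_per_kwh  # = 7p

hours_per_year = 24 * 365
cost_per_year = standby_watts * hours_per_year / 1000.0 * pence_per_kwh

print(f"Cost per 1000 hours: {cost_per_1000_hours:.0f}p")
print(f"Cost per year:       about {cost_per_year:.0f}p (roughly £{cost_per_year / 100:.2f})")
```

So even left on standby all year round, a 1 W load comes to well under a pound at that tariff.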
Things like the set-top box and the VCR have clocks and tuning setups which can be upset by frequent disconnection from the mains; they are designed to be left on. The TV is generally more able to cope, as it has no clock or timer like a video does.
In many consumer AV electronics, the cheapest brands tend to have poorly rated capacitors in the power supplies, which can suffer from the frequent heating-cooling cycles caused by disconnection.
I would disconnect the telly by switching it off at the set, and leave the rest plugged in.