Reply to post:

Standby consumes MORE POWER THAN CANADA: IEA

Bob 5

In the 70s I had a 25" colour TV that consumed 250W. Now I have a 55" colour TV that consumes 43W. Similarly standby power consumption on devices like TVs has reduced from tens of watts a decade or two ago to normally less than a watt now, (Last time I checked, European directives specified 0.5W/1W max. standby power depending on application), so energy waste is being tackled and there is no doubt standby powers can be further reduced to microwatt levels. TV/Radio etc settings can be stored in NVRAM leaving only a micropower infra-red receiver functioning powered by a micropower capacitive coupled low loss psu awaiting the wake-up command.

In a switch mode psu, burst mode can only reduce consumption at low output levels by reducing switching losses - the efficiency of any psu at zero output is still 0% - the aim, as always, is to reduce the standby consumption.
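A toy model makes the point. The loss figures below are assumed, illustrative numbers (not from any datasheet): input power is modelled as output power divided by the power stage efficiency, plus switching losses (which burst mode cuts by skipping cycles) and a small fixed controller loss.

# Toy light-load model of a switch-mode psu; all loss figures are assumptions.
def input_power(p_out: float, burst_mode: bool) -> float:
    conversion_eff = 0.90                           # assumed power-stage efficiency
    switching_loss = 0.05 if burst_mode else 0.50   # W; burst mode skips switching cycles
    fixed_loss = 0.05                               # W; controller bias, bleeders, etc.
    return p_out / conversion_eff + switching_loss + fixed_loss

for p_out in (0.0, 0.1, 1.0, 10.0):
    p_in = input_power(p_out, burst_mode=True)
    eff = p_out / p_in if p_in else 0.0
    print(f"P_out={p_out:5.1f} W  P_in={p_in:6.3f} W  efficiency={eff:6.1%}")

Even with burst mode the efficiency at zero output is 0% (all input power is loss); what burst mode actually buys you is a smaller absolute standby draw.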

Unreliability caused by frequent heat cycling of intermittently powered devices is something that has to be eliminated by design - the aim being to make it unnecessary to keep a device permanently powered just to prolong its reliability, which is so wasteful. In my experience such unreliability isn't a great problem anyway - my TVs, computers, radios etc. are all turned off when not in use and I have not experienced any untoward failures. Remember that electronic devices contain parts with a limited life, in particular standard electrolytic capacitors (e.g. motherboard capacitors), so any unreliability caused by heat cycling could be more than offset by the life gained by switching off when not in use. Check the manufacturers' rated life of electrolytic capacitors - you may be surprised.
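For anyone who hasn't looked at capacitor datasheets, the common rule of thumb is that an aluminium electrolytic's rated life roughly doubles for every 10 degC below its rated temperature (an Arrhenius-style approximation; real parts vary, so check the datasheet). The sketch below uses assumed, typical-looking numbers (a 2000 h @ 105 degC part running at 65 degC, 5 hours of use per day) purely to illustrate the life gain from switching off.

# Rule-of-thumb electrolytic capacitor life estimate; all figures are assumed examples.
def estimated_life_hours(rated_life_h: float, rated_temp_c: float,
                         operating_temp_c: float) -> float:
    # Life roughly doubles per 10 degC below the rated temperature.
    return rated_life_h * 2 ** ((rated_temp_c - operating_temp_c) / 10.0)

life_powered = estimated_life_hours(2000, 105, 65)   # ~32,000 h of powered life
hours_on_per_year = 5 * 365                          # assume ~5 h of use per day
print(f"~{life_powered:.0f} h powered life: about {life_powered / hours_on_per_year:.0f} "
      f"years at 5 h/day, versus about {life_powered / 8760:.1f} years if left on 24/7")

On those assumptions the same capacitor lasts well over a decade in a set that is switched off when not in use, but only a few years in one left powered around the clock.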
