When people calculate how much they're going to save with LEDs (or not, as the case may be), they always seem to use the fixture's total rated wattage in their calculations.
My fixtures never hit their max output, so they're never drawing their max wattage, and even the peak output they do hit is only maintained for 4 hours a day. The rest of the time they're drawing significantly less than their listed capacity. My Radions trip my GFCI circuits, so unfortunately they're not plugged into my Apex, but it would be interesting to see what that ramp up and down does to their actual power draw over the course of a day.
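To get a feel for how much the ramp matters, here's a rough back-of-the-envelope calculation. The 190 W rating and the ramp schedule below are made-up example numbers, not actual Radion specs or my real light schedule:

```python
# Estimate daily energy of an LED fixture under a ramp schedule,
# compared to naively assuming rated max draw for the whole photoperiod.
# All numbers here are hypothetical, for illustration only.

RATED_WATTS = 190  # hypothetical fixture rating

# (hours, average fraction of max output) for each part of the day
schedule = [
    (2, 0.5),   # morning ramp up, averaged
    (4, 0.8),   # peak period (fixture never actually hits 100%)
    (2, 0.5),   # evening ramp down, averaged
    (16, 0.0),  # lights off
]

actual_kwh = sum(hours * frac * RATED_WATTS for hours, frac in schedule) / 1000
naive_kwh = 8 * RATED_WATTS / 1000  # 8-hour photoperiod at rated max

print(f"with ramp: {actual_kwh:.2f} kWh/day")
print(f"naive estimate: {naive_kwh:.2f} kWh/day")
```

Even with generous numbers, the ramp knocks the real consumption well below what the label-wattage math predicts, which is the point: anyone plugging the sticker wattage into a savings calculator is overestimating.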
On another note, does anyone know whether it's better, electricity-wise, to run more watts of heater, knowing they'll run for a shorter period of time, or the least wattage possible, knowing the heaters will have to run longer to maintain temp? I've got two 300-watt heaters on my system, and when the Apex turns them on, they're usually on for 20 minutes or more. I'm wondering if I should add the spare 300-watt heater I've got sitting in a drawer as a backup to cut down the total time the heaters are running.
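For what it's worth, since resistive heaters turn essentially all of their electricity into heat, the daily energy used should equal the tank's heat loss no matter how big the heating element is; more wattage just means a shorter duty cycle. A quick sketch with a made-up 150 W average heat-loss figure (not a measurement from any real tank):

```python
# Why heater wattage mostly doesn't change the power bill:
# a resistive heater is ~100% efficient, so energy in = heat lost by
# the tank, regardless of element size. 150 W of loss is hypothetical.

HEAT_LOSS_WATTS = 150  # assumed average heat loss from the tank

results = {}
for heater_watts in (600, 900):          # two vs. three 300 W heaters
    duty_cycle = HEAT_LOSS_WATTS / heater_watts  # fraction of time on
    on_hours = 24 * duty_cycle
    results[heater_watts] = heater_watts * on_hours / 1000  # kWh/day
    print(f"{heater_watts} W bank: on {on_hours:.1f} h/day, "
          f"{results[heater_watts]:.2f} kWh/day")
```

Both configurations land on the same kWh per day; the bigger bank just cycles on for less time each run. The practical arguments for the third heater would be redundancy and less on/off relay wear, not electricity savings.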