As I mentioned in my previous post on lighting, electrical devices generate heat, and in climates that require air conditioning you pay for the air conditioner's electricity on top of the electricity the device itself uses.
For example, take a 100 watt lightbulb. Since very little light escapes the house, we'll assume that all the electricity used by the bulb ends up as heat added to the house (more on this assumption later). To convert watts to BTU/hour you multiply by 3.412, so leaving the light on for 1 hour generates about 341 BTUs. Say your A/C unit is a 10000 BTU unit (meaning it's rated to remove 10000 BTUs/hour) with an energy efficiency ratio (EER) of 8.3, which means it draws about 1200 watts. For now let's assume that with the light bulb on, your A/C unit runs constantly and maintains a stable 75 degree temperature. 341 of the 10000 BTUs it removes each hour are due to the light bulb, so about 3.4% of the work it is doing is cooling the light bulb. 3.4% of 1200 watts is about 41 watts, for a combined bulb-plus-A/C total of about 141 watts. So with the parameters we chose for the air conditioner, you are paying an extra 41% on top of the amount of power you pay for the light bulb directly.
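Here's the arithmetic above as a small sketch (the watt-to-BTU factor and the EER definition are standard; the bulb and A/C numbers are just the example values chosen here):

```python
# 1 watt of continuous power dissipates 3.412 BTU per hour of heat.
BTU_PER_WATT_HOUR = 3.412

def ac_overhead_watts(device_watts, eer):
    """Extra A/C input power (watts) needed to remove a device's heat.

    EER is defined as BTU/hour of cooling delivered per watt of
    electrical input, so dividing the heat load by EER gives watts.
    """
    heat_btu_per_hr = device_watts * BTU_PER_WATT_HOUR
    return heat_btu_per_hr / eer

bulb_watts = 100
eer = 8.3  # the window-unit efficiency from the example

extra = ac_overhead_watts(bulb_watts, eer)
print(f"A/C overhead: {extra:.1f} W ({extra / bulb_watts:.0%} of the bulb's draw)")
# -> A/C overhead: 41.1 W (41% of the bulb's draw)
```

One nice consequence of the algebra: the overhead fraction works out to 3.412/EER regardless of the device's wattage or the size of the A/C unit, so a higher-EER unit shrinks the penalty proportionally.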
So if you left that light on 10 hours per day for 100 days in the summer, how much would it cost at $0.20 per kilowatt-hour? About $28.22!
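The cost figure can be checked the same way; this helper just combines the device's draw with the A/C overhead and multiplies by hours and rate:

```python
BTU_PER_WATT_HOUR = 3.412  # heat output per watt of continuous power

def seasonal_cost(device_watts, eer, hours, price_per_kwh):
    """Cost of running a device plus the A/C power spent removing its heat."""
    ac_watts = device_watts * BTU_PER_WATT_HOUR / eer
    kwh = (device_watts + ac_watts) * hours / 1000
    return kwh * price_per_kwh

# 100 W bulb, EER 8.3, 10 hours/day for 100 days, $0.20/kWh
cost = seasonal_cost(100, 8.3, hours=10 * 100, price_per_kwh=0.20)
print(f"${cost:.2f}")
# -> $28.22
```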
The same calculations apply to computers, where essentially all the electricity is converted to heat. This matters a lot when designing datacenters, server rooms, and offices, where computers can be on all the time and air conditioning becomes a huge factor in the cost. Monitors, TVs, and DVD players likewise convert most of the power they use to heat. Some gets converted to light or sound, but even that energy turns back into heat when it's absorbed by your walls!
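To see why this matters at scale, here's the same arithmetic applied to a single always-on server. The 400 W draw and $0.20/kWh rate are my assumptions for illustration, not figures from the post:

```python
BTU_PER_WATT_HOUR = 3.412  # heat output per watt of continuous power

server_watts = 400   # assumed draw for one server (hypothetical)
eer = 8.3            # same A/C efficiency as the window-unit example
hours = 24 * 365     # on all the time
price = 0.20         # assumed $/kWh

# A/C power needed to remove the server's heat, then total annual energy
ac_watts = server_watts * BTU_PER_WATT_HOUR / eer
kwh = (server_watts + ac_watts) * hours / 1000
print(f"{ac_watts:.0f} W of A/C overhead, ${kwh * price:.2f}/year")
```

One always-on server ends up costing nearly a thousand dollars a year at these rates, over 40% of it cooling, which is why datacenters chase cooling efficiency so aggressively.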
Did I get my math right? Don't agree with my assumptions? Post a comment!