The problem from my book says: a customer uses an average of 750 kVA over a 720-hour month at a power factor of 0.8. What is the monthly charge? Real power = apparent power × PF = 750 × 0.8 = 600 kW. The charge is $0.04/kWh, so 600 kW × 720 hr × $0.04 = $17,280 monthly bill. Now the power factor is improved to 1.0; what will the savings be? I did the first part, I just don't understand how the cost decreases. If the equation for real power is P = V × I × PF (or apparent power × PF), a PF of 1.0 doesn't bring the 750 kVA value down, it makes it 750 kW over the same number of hours. The cost goes up. I'm messing something up here, not sure what.
Thanks for the help
-
The "savings" are mainly for the utility company. With PF = 0.8 they had to generate 750 kVA but could only bill you for 600 kW...
With PF = 1.0, they could bill you for 750 kW...
But you probably still only consume 600 kW of real power on average, so your actual bill will stay about the same. PF correction saves the utility the cost of generating the extra 150 kW of capacity over the 720 hours per month = 108 MWh.
Then you need to know what their generation costs are. Less than $0.04/kWh, of course! Maybe about $0.01/kWh = $10/MWh, saving the utility company $1,080 per month!
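The arithmetic above can be checked with a quick script. Note the $0.01/kWh generation cost is the rough guess from this thread, not a figure from the textbook problem:

```python
# Customer side: bill is based on real power (kW), not apparent power (kVA).
apparent_kva = 750      # average apparent power
pf = 0.8                # power factor before correction
hours = 720             # hours in the billing month
rate = 0.04             # $/kWh billed to the customer

real_kw = apparent_kva * pf          # 750 * 0.8 = 600 kW
bill = real_kw * hours * rate        # 600 * 720 * 0.04 = $17,280

# Utility side: after correcting PF to 1.0 the customer still draws ~600 kW,
# but the utility no longer has to generate the extra capacity.
extra_kw = apparent_kva - real_kw            # 150 kW
extra_mwh = extra_kw * hours / 1000          # 108 MWh per month
gen_cost = 0.01                              # assumed $/kWh generation cost
utility_saving = extra_kw * hours * gen_cost # ~$1,080 per month

print(f"bill = ${bill:,.2f}, extra = {extra_mwh} MWh, utility saves ${utility_saving:,.2f}")
```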