In a home where electrical usage is metered by the typical rotating-disk electric meter, the only cost savings I can see would come from the following factors:
Electric motors are designed to operate at a specific voltage, say 120V, and run less efficiently at lower voltages. So, for example, a refrigerator would use more energy to maintain the same temperature if the voltage at the motor dropped to 110V.
A reactive load draws more current than a purely resistive load delivering the same real power. That extra current will not in itself affect the operation of the refrigerator or its measured power usage, but it will cause the voltage across the motor to be lower, due to the electrical resistance of the wiring between the refrigerator and the power company transformer.
In addition to the reduced efficiency of the motor, energy is wasted as heat generated in the wiring (I²R loss). The part dissipated in the wiring between the electric meter and the refrigerator does count as measured energy usage and is charged to the customer. (Which may not be so bad if you live in a mostly-cold climate, since that wasted energy goes toward heating your house.)
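To make those two effects concrete, here's a rough Python sketch. The numbers (a 300 W load, 120 V supply, 0.2 Ω of round-trip wiring resistance, and the two power factors) are made-up illustrative values, not measurements from any real refrigerator, and the voltage drop uses the simple I·R magnitude approximation, ignoring phase:

```python
# Sketch: how a lower power factor increases the current drawn,
# the heat wasted in the wiring, and the voltage drop at the motor.
# All numeric values below are illustrative assumptions.

def wiring_effects(real_power_w, supply_v, power_factor, wire_ohms):
    """Return (current, wiring loss, voltage drop) for a simple AC load."""
    # Current needed to deliver the same real power at a given PF:
    # P = V * I * PF  =>  I = P / (V * PF)
    current_a = real_power_w / (supply_v * power_factor)
    # Resistive (I^2 * R) loss dissipated as heat in the wiring.
    loss_w = current_a ** 2 * wire_ohms
    # Rough magnitude of the voltage drop across the wiring.
    drop_v = current_a * wire_ohms
    return current_a, loss_w, drop_v

# Assumed: 300 W compressor, 120 V circuit, 0.2 ohm wiring resistance.
for pf in (1.0, 0.7):
    i, loss, drop = wiring_effects(300, 120, pf, 0.2)
    print(f"PF={pf}: current={i:.2f} A, wiring loss={loss:.2f} W, "
          f"voltage drop={drop:.2f} V")
```

With these assumed numbers, dropping the PF from 1.0 to 0.7 raises the current from 2.50 A to 3.57 A, roughly doubles the wiring loss (1.25 W to 2.55 W), and increases the voltage drop at the motor.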
(The cost of energy wasted in the wiring between the power company transformer and the electric meter has to be eaten by the power company, which is why they add a surcharge for reactive power.)
FYI, the amount of reactive power is generally expressed through the quantity "power factor" (PF), which is the ratio of real power to apparent power, i.e., to the product of RMS voltage and RMS current. With a purely resistive load, the PF is 1. With a purely inductive (or purely capacitive) load, the PF is 0.
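For sinusoidal voltage and current, the real power is V·I·cos(θ), where θ is the phase angle between them, so the PF works out to cos(θ). A quick sketch of that relationship (the 120 V, 3 A values are just illustrative assumptions):

```python
import math

# Sketch: power factor as the ratio of real power to apparent power.
# The voltage, current, and phase values below are illustrative.

def power_factor(real_power_w, rms_voltage_v, rms_current_a):
    """PF = real power / apparent power (V * I)."""
    return real_power_w / (rms_voltage_v * rms_current_a)

v, i = 120.0, 3.0               # assumed RMS voltage and current
for phase_deg in (0, 45, 90):   # 0 = purely resistive, 90 = purely reactive
    real_p = v * i * math.cos(math.radians(phase_deg))
    print(f"phase {phase_deg:>2} deg: real power={real_p:6.1f} W, "
          f"PF={power_factor(real_p, v, i):.2f}")
```

At 0° the load looks purely resistive (PF = 1), and at 90°, purely reactive: the meter's disk registers essentially no real power even though 3 A is still flowing, which is exactly the current that heats the wiring.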