Phenomenon Noticed after converting Apt. outlets to screw connectors

I have the following voltmeter:

formatting link
A while ago, I posted a thread about the effect of converting "back-stab" household receptacles to their side screws, or upgrading to pressure-plate back-wired outlets.

I now have three years of data indicating that my consumption (in average daily kilowatt-hours) has not fluctuated much from year to year, i.e., January 2005 > Jan 2006 > Jan 2007, etc. The biggest factors in my bills are the baseboard heat, the fridge, and the AC units in summer. Not to mention that CL&P's generation rate is now twice what it was three Januarys prior! In addition, I'm slowly converting higher-wattage 75-100 W bulbs (kitchen, entryway) to 15-20 W (75 W equivalent) fluorescents.

The phenomenon mentioned in the subject header is simple: pre-conversion, I was getting 121-122 VAC readings on the RadioShack multimeter. Currently, after converting all non-switched receptacles to side-screw or back-wire, I'm getting readings between 124 and 125 VAC!

Is that safe? The meter always did read a little hot (a fresh-out-of-the-sleeve AA battery would read 1.58 to 1.61 V on it). Should I purchase a second multimeter and see if it returns lower or different readings? Still, the voltage readings at the outlets are 2-3 VAC higher than before all this converting went on, so I know my TVs/stereos/appliances are getting the most voltage they can in this apartment.

Your comments on the multimeter or the outlets are welcome.

-ChrisCoaster

Reply to
ChrisCoaster

That meter has a rated accuracy, when new, of +/-1.5% of full-scale deflection (fsd), +/-5 counts on the least significant digit (lsd), when on AC volts.

Assuming that you are using the 200 V range, that is an accuracy worse than +/-3 V. A true voltage of 123 V could read as anything from below 120 V to over 126 V, so worrying about any variation in measurement within that range is pretty pointless with this meter.
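To put numbers on that, here's a quick sketch of the worst-case error band; the 0.1 V-per-count resolution is an assumption for a typical 3.5-digit meter on its 200 V range, not something taken from the manual:

# Rough worst-case error band implied by the spec quoted above
# (+/-1.5% of full scale, +/-5 counts on the least significant digit).
# The 0.1 V-per-count resolution is an assumption for a typical
# 3.5-digit meter on its 200 V AC range.

FULL_SCALE = 200.0   # volts (200 V AC range)
PCT_OF_FSD = 0.015   # +/-1.5% of full scale
LSD_COUNTS = 5       # +/-5 counts on the last digit
RESOLUTION = 0.1     # volts per count (assumed)


def reading_limits(true_volts: float) -> tuple[float, float]:
    """Lowest and highest readings the accuracy spec allows."""
    err = PCT_OF_FSD * FULL_SCALE + LSD_COUNTS * RESOLUTION
    return true_volts - err, true_volts + err


low, high = reading_limits(123.0)
print(f"A true 123.0 V could read anywhere from {low:.1f} V to {high:.1f} V")
# -> 119.5 V to 126.5 V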

Buying a different meter is pretty pointless too, unless you buy one which has a certified accuracy better than the variations that you are trying to measure.

Don't worry about it - the voltage coming out of your sockets seems fine. Worry about something more important, like whether you will actually be able to afford to use electricity in twenty years' time...

Reply to
Palindrome

Is this with a substantial load on the outlet, or with just the meter and nothing else plugged in?

Any resistance added by a backstab receptacle is going to be far less than the internal resistance of the meter, so unloaded outlet measurements are basically just measuring the voltage at your service panel.

These readings may be higher than they were before simply because you have reduced your lighting load over that time, so there's less voltage drop across your feeder wires. It's also possible that your neighbours have reduced their power consumption, so there's less voltage drop in the transformer that feeds all of you, or in the wires that feed the transformer. Or maybe your neighbourhood is drawing *more* current and someone changed a transformer tap at your substation. There's no way for you to determine which of these affected your voltage. (Well, you could swap all your CFLs back to incandescents and see if the voltage drops.)
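To put some illustrative numbers on the feeder-drop idea, here's a rough sketch; the source voltage, resistance, and load figures are made-up example values, not measurements from this apartment:

# Illustrative only: how a smaller lighting load reduces the I*R drop
# between the transformer and an unloaded outlet. The source voltage,
# resistance, and load figures below are made-up example values.

SOURCE_V = 125.0     # assumed voltage at the transformer
SOURCE_OHMS = 0.5    # assumed total resistance, transformer to outlet


def outlet_voltage(load_watts: float) -> float:
    """Approximate outlet voltage with a given load sharing the feeder."""
    current = load_watts / SOURCE_V          # rough current estimate
    return SOURCE_V - current * SOURCE_OHMS  # source minus the I*R drop


print(f"With 600 W of incandescent lighting: {outlet_voltage(600):.1f} V")
print(f"With 120 W of CFL replacements:      {outlet_voltage(120):.1f} V")
# -> roughly 122.6 V vs 124.5 V, a difference of about 2 V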

On the other hand, if you are measuring outlet voltage under load, the better connection between wire and receptacle might account for a couple of volts difference in loaded voltage. Probably not.

125 V is perfectly safe. Everything in your household should tolerate a 10% error above or below 120 V, at least.
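For reference, that +/-10% tolerance around nominal works out as follows:

# The +/-10% tolerance around nominal 120 V, in numbers.
NOMINAL = 120.0
low, high = NOMINAL * 0.9, NOMINAL * 1.1
print(f"Acceptable range: {low:.0f} V to {high:.0f} V")  # 108 V to 132 V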

And fresh alkaline batteries *are* more than 1.5 V, so this doesn't indicate that your meter is inaccurate. Even if it is, the error is likely different between AC and DC modes, and different between the 2 V and 200 V scales.

Dave

Reply to
Dave Martindale
