Earth Continuity meter suitability


Earth Continuity meter suitability

Post by JamieP »

For a long time I've known that a device with a low-ohms function should be used when doing EC tests, but I can't for the life of me find anything to back it up. Can anyone point me in the right direction? I just want to be able to explain to others why a general multimeter is not accurate enough for such tests.

I've looked in 3000, 3017 and 4836 but haven't found anything more specific than "suitable device" for the test, etc etc.

Re: Earth Continuity meter suitability

Post by AlecK »

3017 is the one; but you're right that the current edition doesn't explain.
It's currently being revised; it should be out for Public Comment review soon,
and will include much more info about selection of test equipment.

The issue about suitability WRT earth continuity relates mostly to accuracy.
All meters have inherent error; and we need to ensure that this error isn't large enough to be significant compared to the values being measured.

Analogue meters
If using a true analogue meter - as against a digital meter with pseudo-analogue display - then accuracy is expressed as % of full scale deflection.
So if the error figure is ±2%, then on a 0 - 200 ohm scale the error is 2% of 200 ohms (ie ±4 ohms).
Clearly not much use to have up to ±4 ohms of error if we're trying to determine whether something is < or > 0.5 ohms.
Switch to a scale 0 - 2 ohms, and now the same error figure works out to be 0.04 ohms;
and this is small enough not to be significant when reading values around 0.5 ohms.

My Megger BM 100/3 analogue IR tester has a 0 - 2 ohm scale, with error stated as ±2.5% FSD @ 20 degC.
So for EC measurement, and also for EFLI "dead test" measurement of subcircuits,
the max error of 0.05 ohms is small enough not to worry about.
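
If it helps to see the arithmetic spelt out, here's a minimal Python sketch of the %-of-FSD calculation above (the function name is mine, purely for illustration):

```python
def analogue_error(full_scale_ohms, fsd_error_pct):
    # Worst-case +/- error in ohms for a %-of-full-scale-deflection spec
    return full_scale_ohms * fsd_error_pct / 100

print(analogue_error(200, 2.0))   # 4.0  ohm -- hopeless around 0.5 ohm
print(analogue_error(2, 2.0))     # 0.04 ohm -- fine around 0.5 ohm
print(analogue_error(2, 2.5))     # 0.05 ohm -- the BM 100/3 low-ohms scale
```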

Digital meters
DMMs are what most people buy these days, and they're inherently more robust for field work.
The problem is that most people see readings shown to several decimal places and assume that means the reading is very accurate.
Not true.
Once again, look in the specs for your meter and you'll find that accuracy is stated in 2 parts:
the 1st part is a % of the value being measured, and the 2nd part is a number of digits (also known as 'counts').
And again, accuracy varies slightly with temperature, so it's stated at a particular ambient temp.
Example: Megger BM 120 IR tester; for continuity ranges (auto-ranging) : 0.01 ohm to 9.99 ohm ±3% ±2 digits.
Looking at the % error: 3% of the value measured; so for 0.5 ohm, 3% = 0.015 ohm.
Looking at the digit error: this refers to the "least significant digit", ie the one on the right.
This meter displays to 2 decimal places, so each "digit" of error = 0.01 ohm.
Obviously max error is when both these parts are in the same direction (though we won't know whether that's the case on any particular occasion). So adding them, with a displayed value of 0.50 ohms, the worst case is a total error of (0.5 x 3%) + (2 x 0.01) = ±0.035 ohm.
That's small enough that we can ignore it for the sorts of values we are likely to be measuring for earth continuity or EFLI.
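
For anyone wanting to run the same check against their own meter's spec sheet, here's that two-part calculation as a short Python sketch (again, the function name is just illustrative):

```python
def dmm_worst_error(reading, pct, digits, resolution):
    # Worst-case +/- error: % of the reading plus N counts of the last digit
    return reading * pct / 100 + digits * resolution

# Megger BM 120 continuity range: +/-3% +/-2 digits, displayed to 0.01 ohm
print(f"{dmm_worst_error(0.50, 3.0, 2, 0.01):.3f}")   # 0.035
```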

By contrast, consider an ordinary DMM that displays such measurements to only 1 decimal place - which is the most you are likely to find in a DMM that isn't designed primarily as an IR tester.
Good-quality meters such as Fluke & similar will typically have a tiny % error, which can safely be ignored for purposes of EC measurement.
But their "digits" error is typically 3, sometimes 4 digits. And that's 3 or 4 x 0.1 ohm.

Real-world examples:
Extech 430; resistance 400 ohm range: ±(0.8% + 4 digits)
Fluke 179; for (lowest) resistance range 600 ohm: resolution 0.1 (ie 1 decimal place); accuracy 0.9% + 2 counts

So with a 3-digit error factor, a displayed value of 0.5 ohm means the actual value is somewhere between
0.5 - 0.3 = 0.2 ohm (absolutely fine for an MEC); and
0.5 + 0.3 = 0.8 ohm - absolutely not OK.

Even for the Fluke, with only a 2-digit error (ignoring the % error 'cos it's tiny), we're still looking at 0.5 ohm ± 0.2 ohm.
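
Plugging those quoted specs into the same sketch shows the spread (printed to 2 decimal places, since the % part barely moves the result):

```python
def dmm_worst_error(reading, pct, digits, resolution):
    # Worst case: % of the reading plus N counts of the last displayed digit
    return reading * pct / 100 + digits * resolution

# Generic 1-decimal-place DMM, 3-digit error (% part ignored, as above)
err = dmm_worst_error(0.5, 0.0, 3, 0.1)
print(f"{0.5 - err:.2f} to {0.5 + err:.2f} ohm")   # 0.20 to 0.80 ohm

# Fluke 179 spec: 0.9% + 2 counts, 0.1 ohm resolution
err = dmm_worst_error(0.5, 0.9, 2, 0.1)
print(f"{0.5 - err:.2f} to {0.5 + err:.2f} ohm")   # ~0.30 to ~0.70 ohm
```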

Trouble is, we don't - can't - know any more than that the actual value is somewhere between these extremes.
That makes normal DMMs, even reputable brands of good quality, pretty hopeless for earth continuity.

Worse still are those "fork" type devices that give no decimal places at all.
The "digits" part of the error will be ± several ohms, making these devices absolutely useless for earth continuity.
They're OK for telling you whether a switch contact is open or closed, but not much more than that.

These error calcs change slightly for every reading, because the % relates to % of reading
(unlike analogue, where it's % of FSD, so it only changes when we change scales).
And the specific error values may well be different for eg voltage than for continuity.

But if you think about it, you'll see that when measuring large values, the % error is more significant; while when measuring small values, the digits error matters most. That's regardless of what sort of measurement (amps, volts, ohms, etc); and regardless of scale used.
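
A quick loop makes that crossover easy to see; this sketch reuses the Fluke 179 figures quoted above:

```python
# Fluke 179-style spec: 0.9% of reading + 2 counts at 0.1 ohm resolution
for reading in (0.5, 5.0, 50.0, 500.0):
    pct_part = reading * 0.9 / 100   # grows with the reading
    digit_part = 2 * 0.1             # fixed by the display resolution
    print(f"{reading:>6} ohm: % part {pct_part:.4f}, digits part {digit_part:.1f}")
# At 0.5 ohm the digits part (0.2) swamps the % part (0.0045);
# at 500 ohm the % part (4.5) swamps the digits part (0.2).
```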

Bottom line - to avoid having to do all this calculation for every reading - is that for EC measurement we need a meter that gives readings to at least 2 decimal places, so that the "digits" error is a few hundredths of an ohm instead of several tenths of an ohm.
For all practical purposes, that means using the low-ohms range of an IR tester, not a DMM; because DMMs simply do not give resistance readings to 2 decimal places of an ohm.
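
If you'd rather not redo the sums every time, a simple pass/fail helper along these lines would do; note the 10% margin is my own rule of thumb, not from any standard:

```python
def suitable_for_ec(pct, digits, resolution, limit=0.5, margin=0.1):
    # True if the worst-case error at the limit stays under margin * limit
    worst = limit * pct / 100 + digits * resolution
    return worst <= limit * margin

print(suitable_for_ec(3.0, 2, 0.01))   # True  -- IR tester, 2 decimal places
print(suitable_for_ec(0.9, 2, 0.1))    # False -- typical DMM, 1 decimal place
```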

And if you get into medical installations, and especially "electrical medical equipment", even an IR tester is not accurate enough, and we need something better again. Typically several thousand dollars' worth.

Re: Earth Continuity meter suitability

Post by JamieP »

Thank you, once again much appreciated