
How to determine equipment accuracy for test method specs




Abstract:

Often, published test methods give guidance as to the type of test equipment needed to perform a test, but they do not always clearly specify the accuracy requirements of that equipment. One might go on a research expedition to determine whether other references exist that provide guidance on the needed test equipment specifications. So how do we know the appropriate test equipment accuracy for use in a test? For example, a test method might specify an instrument “with readability to 0.01 units”. Does that mean the instrument needs to be accurate to 0.01 units? No! And it can be expensive to buy an instrument that meets an accuracy tighter than you really need. Conversely, test equipment that is not accurate enough is also a waste of money and imparts too much risk.


Learn how to interpret test method specifications, determine the most appropriate test equipment, and subsequently establish the calibration needs of that equipment.


 

1. INTRODUCTION

You’ve been tasked with determining the test equipment needed for a new test. The test method might or might not provide tolerance specifications. If it does, how do you interpret them to determine the accuracy needed for the test equipment? If it doesn’t, how do you determine an appropriate accuracy?


If your lab is accredited to or conforms to ISO/IEC 17025, then ISO/IEC 17025:2017 contains clauses that dictate test equipment conformity & accuracy:

  • 6.4.4 “The laboratory shall verify that equipment conforms to specified requirements before being placed or returned into service.”

  • 6.4.5 “The equipment used for measurement shall be capable of achieving the measurement accuracy and/or measurement uncertainty required to provide a valid result.”


How does one know when their test equipment is “capable of achieving the measurement accuracy”? There are a few sources of specification tolerances.


2. SOURCES OF SPECIFICATION TOLERANCES

a. Published Test Methods (and other published documents that may be cited therein which include test equipment specifications)

b. Equipment Manufacturer Specifications

c. IEC/IECEE OD-5014; IECEE Operational Document, Edition 1.1, 2019-06-01



3. FORMULAS

Whether they come from a test method or from equipment manufacturer specifications, tolerances may be presented in a variety of formats. What are the different accuracy formats, and how are they interpreted and applied?


Percent Accuracy (Percent Error) = ((V_T - V_UUT) / V_T) * 100

V_T = Target Value

V_UUT = Unit Under Test Value

Scenario: VT = 50 units, VUUT = 48 units

= ((50 - 48) / 50) * 100

= (2 / 50) * 100

= 0.04 * 100 = 4%
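
A minimal Python sketch of this calculation (the function name percent_error is our own, for illustration):

```python
def percent_error(target: float, uut: float) -> float:
    """Percent error of a unit-under-test (UUT) reading relative to the target value."""
    return (target - uut) / target * 100

# Scenario from above: target = 50 units, UUT reads 48 units
print(percent_error(50, 48))  # 4.0 (%)
```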


Parts Per Million (ppm): often used for chemical calculations or for combining different units of measurement in an uncertainty budget

1) Expressed as ppm

Scenario: 2 ppm = 2/1,000,000 = 0.000002

2) Expressed as a percent

Scenario: 2 ppm = 0.0002%

2/1,000,000 = 0.000002, and 0.000002 * 100% = 0.0002%
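
The same conversions in Python (helper names are ours, purely illustrative):

```python
def ppm_to_fraction(ppm: float) -> float:
    """Convert parts per million to a dimensionless fraction."""
    return ppm / 1_000_000

def ppm_to_percent(ppm: float) -> float:
    """Convert parts per million to a percentage."""
    return ppm_to_fraction(ppm) * 100

print(ppm_to_fraction(2))  # 2e-06
print(ppm_to_percent(2))   # 0.0002 (%)
```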


Percent of Full-Scale or Percent of Span

The allowable error calculation results in a fixed value based on the full-scale of an instrument.

Scenario: Full scale = 100 °C, Tolerance = 1% full-scale

1% of 100 °C = ±1 °C


Percent of Range

An instrument may have multiple ranges over the span of its capability. Different ranges may have the same or different accuracies. Instrument resolution or sensitivity can play a part in these differences.


Percent of Reading

The allowable error calculation results in different values based on the current reading or applied value.

1% of reading will have different values at different readings:

1% of 50 °C = 0.5 °C

1% of 200 °C = 2 °C
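
The contrast between the two formats is easy to see in a short sketch (function names are ours; percentages are entered in percent, not fractions):

```python
def tol_percent_full_scale(full_scale: float, pct: float) -> float:
    """Allowable error as a percent of full scale: a fixed value at any reading."""
    return full_scale * pct / 100

def tol_percent_reading(reading: float, pct: float) -> float:
    """Allowable error as a percent of reading: scales with the applied value."""
    return reading * pct / 100

print(tol_percent_full_scale(100, 1))  # 1.0 °C, regardless of reading
print(tol_percent_reading(50, 1))      # 0.5 °C
print(tol_percent_reading(200, 1))     # 2.0 °C
```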


Standard Deviation: a measure of how spread out numbers are; the square root of the variance.

Variance = the average of the squared differences from the mean

1) Population Standard Deviation

2) Sample Standard Deviation
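
Python’s standard library computes both flavors; the readings below are made-up repeat measurements for illustration:

```python
import statistics

readings = [49.8, 50.1, 50.0, 49.9, 50.2]  # hypothetical repeat readings

# Population standard deviation: variance averaged over N
print(statistics.pstdev(readings))  # ~0.141

# Sample standard deviation: variance averaged over N - 1 (Bessel's correction)
print(statistics.stdev(readings))   # ~0.158
```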


Floor Spec: ± (gain error + offset error)

Note: 1 count = 1 visible resolution increment


Types of Floor Specs:

1) ± (% reading + counts)

Scenario: Instrument has resolution of 0.01 unit.

· Specification = 5% of reading + 10 counts

· Reading = 15 units

= ± (5% of 15 units + 10 counts of 0.01 units)

= ± ((0.05 * 15) + (10 * 0.01)) units

= ± (0.75 + 0.10) units

= ± 0.85 units


2) ± (% reading + % full-scale)

Scenario: Instrument has resolution of 0.01 unit, full scale of 100 units.

· Specification = 5% of reading + 0.25% full-scale

· Reading = 15 units

= ± (5% of 15 units + 0.25% of 100 units)

= ± ((0.05 * 15) + (0.0025 * 100)) units

= ± (0.75 + 0.25) units

= ± 1.0 units


3) ± (% full-scale + counts)

Scenario: Instrument has resolution of 0.01 unit, full scale of 100 units.

· Specification = 0.5% of full-scale + 10 counts

· Reading = 15 units

= ± (0.5% of 100 units + 10 counts of 0.01 units)

= ± ((0.005 * 100) + (10 * 0.01)) units

= ± (0.5 + 0.1) units

= ± 0.6 units
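
All three floor-spec types reduce to gain terms plus offset terms, so one hedged sketch covers them (the function and its keyword arguments are our own, not from any standard; percentages are entered in percent):

```python
def floor_spec(reading: float, resolution: float = 0.0, full_scale: float = 0.0,
               pct_reading: float = 0.0, pct_full_scale: float = 0.0,
               counts: int = 0) -> float:
    """Allowable error (±) for a floor spec: gain error plus offset error.

    1 count = 1 visible resolution increment.
    """
    gain = reading * pct_reading / 100
    offset = full_scale * pct_full_scale / 100 + counts * resolution
    return gain + offset

# The three scenarios above (reading = 15 units, resolution = 0.01 unit):
print(floor_spec(15, resolution=0.01, pct_reading=5, counts=10))           # 0.85
print(floor_spec(15, full_scale=100, pct_reading=5, pct_full_scale=0.25))  # 1.0
print(floor_spec(15, resolution=0.01, full_scale=100,
                 pct_full_scale=0.5, counts=10))                           # 0.6
```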



4. DETERMINING ACCURACY BASED ON TEST METHOD SPECIFICATIONS

There are two possibilities for when a test method provides specifications:


Possibility 1: The test method provides the equipment accuracy requirements.

In this case, follow the test method’s equipment accuracy requirements. Be aware that when a test method states, “…using equipment with resolution (or readability) of 0.001 units”, this does NOT mean the equipment must be accurate to 0.001 units; only that it must be readable, or have resolution/sensitivity, to 0.001 units. Resolution is not the same as accuracy, and in very few cases does accuracy equal resolution.


For example, a digital balance has resolution or sensitivity to 0.001 g. As this balance is calibrated, factors contributing to its calibration uncertainty include at least some of the following: reference standard calibration uncertainty, reference standard drift, balance resolution, balance eccentricity, environmental conditions, repeatability, buoyancy correction, etc. When all of these contributors are factored in, there is no possibility that the balance will be accurate to ±0.001 g. The accuracy will more likely be approximately ±0.010 g.
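
To see why the contributors push the achievable accuracy well past the resolution, here is a minimal sketch that combines made-up contributor values by root-sum-square, as is typical in an uncertainty budget (the values are illustrative only, not from any real balance):

```python
import math

# Hypothetical standard-uncertainty contributors for a 0.001 g balance, in grams:
contributors = {
    "reference standard calibration": 0.002,
    "reference standard drift":       0.001,
    "balance resolution":             0.001 / math.sqrt(12),  # rectangular distribution
    "eccentricity":                   0.002,
    "repeatability":                  0.003,
}

combined = math.sqrt(sum(u ** 2 for u in contributors.values()))
expanded = 2 * combined  # k = 2, ~95% level of confidence
print(f"Expanded uncertainty ≈ ±{expanded:.3f} g")  # ≈ ±0.009 g, well above 0.001 g
```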


Possibility 2: The test method provides a tolerance specification for the test, but not for the test equipment.

In this case, one can use the guidance of a 4:1 TUR in determining the appropriate test equipment. In no case should the TUR be less than 1:1, because a TUR below 1:1 means that the calibration uncertainty is larger than the accuracy specification. In that instance, one cannot have confidence that the instrument will ever be in tolerance.


TUR (Test Uncertainty Ratio) = Test Accuracy Specification / Equipment Measurement Uncertainty

NOTE: The equipment measurement uncertainty is the expanded uncertainty, where k = 2 for an approximately 95% level of confidence.


EX 1: Test Accuracy Specification = ±2 units, Equipment Measurement Uncertainty = 0.4 units. TUR = 2 / 0.4 = 5

This TUR is 5:1


EX 2: Test Accuracy Specification = ±2 units, Equipment Measurement Uncertainty = 1.3 units. TUR = 2 / 1.3 = 1.54

This TUR is 1.54:1


EX 3: Test Accuracy Specification = ±2 units, Equipment Measurement Uncertainty = 1.7 units. TUR = 2 / 1.7 = 1.18

This TUR is 1.18:1
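
The three examples in a few lines of Python (the tur function is ours, for illustration):

```python
def tur(accuracy_spec: float, expanded_uncertainty: float) -> float:
    """Test Uncertainty Ratio: accuracy spec over equipment expanded uncertainty (k = 2)."""
    return accuracy_spec / expanded_uncertainty

for u in (0.4, 1.3, 1.7):
    print(f"TUR = {tur(2, u):.2f}:1")  # 5.00:1, 1.54:1, 1.18:1
```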



NOTE: TUR is different from TAR (Test Accuracy Ratio): TAR is based solely on accuracy and does not consider potential error sources during calibration. TUR is more often used than TAR.


TAR (Test Accuracy Ratio) = Test Accuracy Specification / Test Equipment Accuracy



5. APPLICATION of IEC/IECEE OD-5014; IECEE Operational Document, Edition 1.1, 2019-06-01

IEC/IECEE OD-5014 “provides default instrument accuracy requirements where the test standard does not provide criteria.”

6. SUMMARY

Regardless of where accuracy specifications come from, it is important to know how to interpret and apply them. Whether the test method provides test equipment accuracy requirements or one has to rely on IEC/IECEE OD-5014, one must make certain that the test equipment is accurate enough to ensure valid results, and that it is calibrated before use.



7. REFERENCES

IECEE OD-5014; IECEE Operational Document, Edition 1.1, 2019-06-01. (Replaces CTL Decision Sheet (DSH) 251, including CTL Decision Sheet No. DSH 251E (2014).)

ISO/IEC 17025:2017, General Requirements for the Competence of Testing and Calibration Laboratories.

