White Paper: A Calibration Primer

May 12, 2006

Published Courtesy of Omega Engineering Inc.

The most sophisticated industrial equipment will not be very useful unless it is calibrated. Through calibration, adjustments made to a piece of equipment ensure that it performs as expected — that it can be relied on to deliver predictable, accurate results that meet quality standards. This white paper from Omega Engineering explains what calibration is, why it is important, and how it works. NIST-traceability is defined and discussed, and there is a step-by-step description of a basic calibration. This paper also discusses in-house vs. laboratory calibration, and it describes major types of calibration devices.

What Is Calibration?
Simply defined, calibration is the process of adjusting a device to meet manufacturer’s specifications. Calibration is sometimes also defined as the issuing of data, including a report or certificate of calibration, that assures an end user of a product’s conformance with specifications, and perhaps also with external guidelines, such as those of the International Organization for Standardization, whose ISO 9001 standards, for example, set worldwide specifications for business sectors. A company follows these standards to ensure that its products and/or services gain acceptance among suppliers and customers. This second definition of calibration is more properly referred to as certification.

A company with equipment needing calibration may send it to a metrology/calibration laboratory, where a skilled technician will either bring it up to specifications or confirm that it already meets them, using measurement/test instruments that must themselves meet strict calibration requirements. Any or all of the components used in an industrial process can be calibrated. A temperature calibration, for example, could involve a probe alone, an instrument alone, or a probe connected to an instrument (a system calibration).

Typically, the accuracy of a calibration device, or calibrator, is at least four times greater than that of the equipment being calibrated. The equipment usually has a calibration range over which the technician will, at various points, check specifications. As Mike Cable explains in Calibration: A Technician’s Guide, "the calibration range is defined as ‘the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.’ The limits are defined by zero and span values. The zero value is the lower end of the range. Span is defined as the algebraic difference between the upper and lower range values."

Adjustments made during calibration must fall within certain tolerances. Such tolerances represent very small, acceptable deviations from the equipment’s specified accuracy.
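To make the range, span, and tolerance arithmetic concrete, the short sketch below checks a few calibration points against a tolerance expressed as a percentage of span. The instrument range, readings, and tolerance are hypothetical, not taken from any particular device specification.

    # A minimal sketch of the range, span, and tolerance arithmetic described above.
    # All values are illustrative, not from a real device specification.

    lower_range_value = 0.0    # zero value, e.g. 0 degC
    upper_range_value = 400.0  # upper range value
    span = upper_range_value - lower_range_value

    tolerance = 0.005 * span   # assumed acceptable deviation: +/-0.5% of span

    # Compare the unit under test with the calibrator at several points in the range.
    check_points = [
        # (calibrator value, reading from the unit under test)
        (0.0, 0.3),
        (200.0, 202.9),
        (400.0, 399.1),
    ]

    for reference, reading in check_points:
        error = reading - reference
        status = "within tolerance" if abs(error) <= tolerance else "adjust"
        print(f"ref {reference:6.1f}  reading {reading:6.1f}  error {error:+5.1f}  {status}")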

The manufacturer usually does the initial calibration on its equipment. Subsequent calibrations may be done in-house, by a third-party lab, or by the manufacturer. The frequency of recalibration will vary with the type of equipment. Deciding when to recalibrate a flowmeter, for example, depends mainly on how well the meter performs in the application. If liquids passing through the flowmeter are abrasive or corrosive, parts of the meter may deteriorate in a very short time. Under favorable conditions, the same flowmeter might last for years without requiring recalibration.

As a rule, however, recalibration should be performed at least once a year; in critical applications, of course, the frequency will be much greater.

What Are Some Types of Calibrators?
Calibrators vary in form and function with the equipment they are designed to calibrate. A blackbody calibrator, used to calibrate infrared pyrometers (high-temperature thermometers), typically has a target plate with very high emissivity, the temperature of which can be controlled to very narrow tolerances. To calibrate a pyrometer, readings of the target plate taken by the pyrometer are compared with the target plate’s known, controlled temperature. The pyrometer is then adjusted until any difference is so minimal as to be insignificant.
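As a rough illustration of that comparison step, the sketch below derives a simple offset from a single hypothetical blackbody setpoint; a real pyrometer calibration typically covers several setpoints and follows the manufacturer's adjustment procedure.

    # A small sketch of the comparison described above: the pyrometer's reading of
    # the blackbody target plate is compared with the plate's controlled temperature.
    # All values are hypothetical.

    target_temperature = 500.0   # controlled blackbody target plate, degC (assumed)
    pyrometer_reading = 503.2    # reading taken by the pyrometer under test

    error = pyrometer_reading - target_temperature
    print(f"Observed error: {error:+.1f} degC")

    # After adjustment, a repeat reading should differ from the target by an amount
    # small enough to be insignificant for the application.
    acceptable = 0.5             # assumed threshold, degC
    print("adjustment needed" if abs(error) > acceptable else "acceptable")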

The block calibrator, used for temperature probes, contains a metal block that can be heated to precise temperatures. These temperatures are compared to those taken by temperature probes inserted into the block. Temperature probes generally cannot be adjusted, so this process is one of verification rather than true calibration.

To calibrate equipment such as panel meters and temperature controllers, a device known as a signal reference is often used. It is a type of calibrator that can generate a known electrical signal. There are voltage, current, and frequency signal references. Once a signal from one of these calibrators is fed into the equipment in question, the display or output value of the equipment can be adjusted until it matches the known signal. The simulator, a special kind of signal reference, generates sensor output. Signal references and simulators can often read as well as generate signals.
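The sketch below illustrates, with invented numbers, how readings taken while a signal reference injects two known signals can be turned into a linear gain-and-offset correction; the actual adjustment procedure depends on the instrument.

    # Hedged sketch of using a signal reference: two known signals are injected, the
    # instrument's displayed values are noted, and a gain/offset correction is derived
    # so the display matches the known inputs.  Signal levels and readings are hypothetical.

    # (known injected signal in mA, value displayed by the instrument)
    low_point = (4.0, 4.1)
    high_point = (20.0, 19.7)

    (known_lo, shown_lo), (known_hi, shown_hi) = low_point, high_point

    # Linear correction: known = gain * shown + offset
    gain = (known_hi - known_lo) / (shown_hi - shown_lo)
    offset = known_lo - gain * shown_lo

    print(f"gain = {gain:.4f}, offset = {offset:+.4f} mA")

    # Applying the correction to a new displayed value:
    displayed = 12.0
    corrected = gain * displayed + offset
    print(f"display {displayed} mA -> corrected {corrected:.2f} mA")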

Because fluidized baths provide safe, rapid heat transfer and accurate temperature control, they are useful in calibrating temperature-sensitive instruments. The fluidized sand bath temperature probe calibrator employs the principle of fluidization that occurs when a gas — usually low-pressure air or nitrogen — flows up through a partially filled chamber or retort containing dry, inert particles of aluminum oxide. The gas flows at low velocity, which sets the particles in motion, separates them, and suspends them to a stable level. This gives the particles an appearance of turbulence similar to that of a boiling liquid. Not only do the fluidized solids circulate and flow like liquids; they also exhibit excellent heat-transfer characteristics. Temperature probes inserted into the bath come to a stable temperature very quickly, which facilitates calibration.

The thermoelectric cooling elements in an ice point calibration reference chamber produce a very precise, stable temperature of 0°C. Although reference chambers are commonly used to calibrate or verify temperature probes, their ability to simulate a thermocouple signal makes them useful for calibration or verification of instruments that read thermocouples.

Omega Engineering’s metrology lab has a 25-foot wind tunnel with chillers, pumps, and condensers that keep recirculating air at a constant temperature and flow rate. This very large machine is used for anemometer and vane-type sensor calibrations. Benchtop wind tunnels operate on the same principle, producing a highly uniform flow rate across a greatly reduced test section.

Why Is Calibration Important?
On April 12, 1934, an anemometer on the summit of Mount Washington in New Hampshire measured the highest surface wind speed ever recorded: 231 miles per hour. The Mount Washington anemometer, which had been calibrated in 1933, was recalibrated after the world record measurement, and it proved to be accurate. When a typhoon hit Guam in December of 1997, the record seemed to be broken — an anemometer at a United States Air Force base recorded a wind gust of 236 mph. That reading, however, did not stand because the National Climate Extremes Committee judged the Guam anemometer unreliable. To metrology experts, the survival of the old record was an object lesson in the importance of calibration. (A footnote: wind speeds greater than 231 mph are believed to have occurred in tornadoes, but no measuring device has yet survived such extremes to document them. Contrary to rumor, the Mount Washington anemometer did not blow away during the 1934 storm. Today, it resides in the mountain’s Observatory Summit Museum.)

For a more down-to-earth example of what happens when calibration gets short shrift, one need look no further than the "oven temperatures may vary" statement on the cooking instructions of frozen food. The reason for the caveat is that most household ovens are never recalibrated after they leave the factory. Consumers apparently can live with this margin of error, but it is a different story for manufacturers. For a company to be ISO 9001-compliant, equipment calibration is required. Rules alone, however, do not explain why the best companies recognize the importance of doing calibration correctly and frequently.

Take a company that fabricates metal rods used in its customer’s process. Suppose the customer has specified a rod diameter of one inch — to within a tolerance of plus or minus five hundredths of an inch. If the fabricator verifies the diameter of the rod by measuring it with uncalibrated calipers, can he say with total confidence that he is delivering the rod that the customer has ordered?
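The sketch below puts rough numbers on the rod example. The readings, the assumed caliper uncertainties, and the 4:1 rule of thumb applied here (echoing the accuracy ratio mentioned earlier) are illustrative only.

    # A hedged sketch of the rod example: with a tolerance of +/-0.05 in, the
    # fabricator can only claim conformance with confidence if the calipers' own
    # error is known (through calibration) and small relative to the tolerance.
    # All numbers are illustrative.

    nominal = 1.00          # specified rod diameter, inches
    tolerance = 0.05        # +/- tolerance from the customer spec
    measured = 1.03         # caliper reading, inches

    def conforms_with_confidence(measured, nominal, tolerance, caliper_uncertainty):
        """True only if the reading plus the worst-case caliper error stays in spec."""
        worst_case_error = abs(measured - nominal) + caliper_uncertainty
        return worst_case_error <= tolerance

    # Calibrated calipers with a known, small uncertainty (roughly a 4:1 ratio):
    print(conforms_with_confidence(measured, nominal, tolerance, tolerance / 4))   # True

    # Uncalibrated calipers: the error could plausibly be as large as the tolerance
    # itself, so the same reading no longer guarantees a conforming rod.
    print(conforms_with_confidence(measured, nominal, tolerance, tolerance))       # False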

Other industries must confront the same facts of life. If a plastics manufacturer does not ensure, through calibration, that liquid plastic enters his injection molding machines at precisely the right temperature, there may be gaps or other defects in the finished product. In the HVAC industry, AC units have some sort of temperature-control instrumentation. If the manufacturer does not periodically calibrate this instrumentation against actual output of the ACs, he will never know whether his equipment fulfills its purpose.

One might ask why sophisticated 21st century technology would ever need to be calibrated. The answer is that virtually all equipment degrades in some fashion over time, and electronic equipment — a mainstay of the manufacturing process — is no exception. As components age, they lose stability and drift from their published specs. Even normal handling can adversely affect calibration, and rough handling, of course, can throw a piece of equipment completely out of whack (though it may appear perfectly fine).

Considering the benefits in quality, productivity, and revenue that a well-designed calibration program can realize, costs are reasonable. Although small calibration tasks can be adequately performed in-house using off-the-shelf calibrators, most medium-to-large companies that view calibration as a high priority opt for the services of an independent metrology/calibration lab. A very large company might consider investing in automated equipment intended for conducting many daily calibrations, but such equipment — the same kind used by independent labs — is expensive, and it requires skilled technicians. If gaining certification from some outside calibration authority is also necessary, this adds to costs.

What Is NIST Traceability?
The National Institute of Standards and Technology (NIST), part of the U.S. Department of Commerce, oversees the development of measurement standards and technology consistent with the International System of Units (SI). NIST is also charged with imparting these standards to the American system of measurements through calibrations and other services. To help U.S. industry meet international standards, NIST provides, among other programs, laboratory accreditation that enables a business to establish traceability of measurement results.

As NIST defines it, traceability "requires the establishment of an unbroken chain of comparisons to stated references." The links in this chain consist of documented comparisons comprising the values and uncertainties of successive measurement results. The values and uncertainties of each measurement in the chain can be traced through intermediate reference standards all the way to the highest reference standard for which traceability is claimed. According to NIST, the term "traceable to NIST" is shorthand for "results of measurements that are traceable to reference standards developed and maintained by NIST."
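The sketch below illustrates the idea of a documented chain with invented uncertainty values. Combining the link uncertainties in quadrature is a common simplification and is an assumption here, not something prescribed by this paper.

    # A small sketch of the "unbroken chain of comparisons": each link records the
    # reference used and the uncertainty contributed by that comparison.  The values
    # are invented for illustration.

    import math

    # (link description, uncertainty contributed by this comparison, degC)
    chain = [
        ("NIST reference standard",              0.001),
        ("lab working standard vs. NIST",        0.005),
        ("shop calibrator vs. working standard", 0.02),
        ("process probe vs. shop calibrator",    0.05),
    ]

    combined = math.sqrt(sum(u**2 for _, u in chain))
    for name, u in chain:
        print(f"{name:38s} +/- {u} degC")
    print(f"Combined (root-sum-of-squares) uncertainty: +/- {combined:.3f} degC")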

The provider of NIST-traceable measurements may be NIST itself or another organization. NIST reports that each year more than 800 companies tie their measurement standards to NIST. They may then follow these standards in providing measurement services to their customers, in meeting regulations, and in improving quality assurance.

NIST performs calibrations of weights, and though it is not the only provider of this service, it has the distinction of maintaining the national standard for mass. This standard, known as national prototype kilogram K20, resides in a safe at NIST and is virtually an exact copy, allocated to the United States in 1889, of the international prototype, which is kept in a vault at the Bureau International des Poids et Mesures (International Bureau of Weights and Measures) in France. The prototypes are 90% platinum, 10% iridium.

The kilogram is unique among SI base units in that it is still defined by an artifact — the international prototype — while the definitions of other base units refer to basic physical properties. The kilogram’s original definition — the mass of one liter of pure water at 4°C and standard atmospheric pressure — proved too difficult to duplicate in practice.

What Is Involved in a Typical Calibration?
A weighing system serves well in illustrating general principles of calibration. Archimedes and Leonardo da Vinci used the positioning of calibrated counterweights on a mechanical lever to balance and thereby determine unknown weights. A variation of this device uses multiple levers, each of a different length and balanced with a single standard weight. Later, calibrated springs replaced standard weights.

The introduction of hydraulic and electronic (strain gage-based) load cells represented the first major design change in weighing technology. In today’s processing plants, electronic load cells are preferred in most applications. To check if transducers and load cells are functioning properly, the user must answer the following: Does the weight indication return to zero when the system is empty or unloaded? Does the indicated weight double when the weight doubles? Does the indicated weight remain the same when the location of the load changes (uneven loading)? If the answers are yes, the cells and transducers are probably in good condition.
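As a rough illustration, the sketch below turns the three questions into pass/fail checks on hypothetical indicator readings; the tolerance used is assumed for illustration, not a load cell specification.

    # A minimal sketch of the three functional questions listed above, assuming the
    # user has recorded indicator readings for each condition.  Threshold and
    # readings are illustrative (units: kg).

    def load_cell_quick_check(zero_reading, single_load, double_load,
                              readings_at_positions, tol=0.5):
        """Rough pass/fail on the three questions posed in the text."""
        return {
            "returns to zero when unloaded": abs(zero_reading) <= tol,
            "indication doubles when load doubles": abs(double_load - 2 * single_load) <= tol,
            "indication unaffected by load position": (
                max(readings_at_positions) - min(readings_at_positions) <= tol
            ),
        }

    result = load_cell_quick_check(
        zero_reading=0.2,
        single_load=100.1,
        double_load=200.4,
        readings_at_positions=[100.1, 100.3, 99.9],
    )
    for check, ok in result.items():
        print(f"{check}: {'yes' if ok else 'NO'}")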

Before calibration, the mechanical system should be examined and the cell installation checked as follows:

  • Inspect load cell cables, and coil and protect any excess.
  • If the weighing system involves multiple load cells, the load should be equally distributed among them. If differences between cells exceed 10%, the load should be rebalanced and adjusted with shims (see the sketch after this list).
  • During calibration, the user should be able to lift the vessel without unloading or overloading the other cells. The system design should provide for jacking and for the horizontal removal of cells.
  • Calibration of the vessel requires hangers or shelves to support the calibration weights. For an ASME vessel, these must be added when the vessel is fabricated.
  • Calibration to an accuracy of 0.25% full scale or better is usually performed with dead weights and is the only calibration method recognized by weights and measures agencies.
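The sketch below illustrates the 10% load-sharing check from the list above, interpreting "differences between cells" as each cell's deviation from the mean output; both the interpretation and the readings are assumptions for illustration.

    # A sketch of the 10% load-sharing rule: individual cell outputs are compared
    # with their mean, and any cell off by more than 10% suggests the load should
    # be rebalanced with shims.  Readings are hypothetical.

    cell_outputs = [25.4, 24.8, 25.1, 20.9]   # e.g. kg indicated per cell

    mean = sum(cell_outputs) / len(cell_outputs)
    for i, output in enumerate(cell_outputs, start=1):
        deviation = abs(output - mean) / mean
        flag = "rebalance/shim" if deviation > 0.10 else "ok"
        print(f"cell {i}: {output:5.1f}  deviation {deviation:5.1%}  {flag}")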

Calibration starts by zeroing the system:

  • During deadweight calibration, evenly load the vessel to 10% of live load capacity using standard weights. Record the weight indication and remove the weights. Next, add process material to the vessel until the weight indicator registers the same 10% weight it did with the calibration weights. Reload the vessel with the calibration weights and record the reading (now about 20%). Repeat these steps until 100% of capacity is reached (the sketch after this list walks through the sequence).
  • Live weight calibration, a novel and faster method, uses pre-weighed people instead of calibration weights. The procedure is identical to deadweight calibration. Never use this method if there is any risk of injury.
  • The "material transfer" calibration method uses some other scale to verify weight. This method is limited by the accuracy of the reference scale and risks error due to loss of material in transfer.
  • A master cell can be used for calibration as long as it is three to four times more accurate than the accuracy expected from the calibrated system. This procedure involves incremental loading and the evaluation, at each step, of the output signals of both the calibrated weighbridge and of the master load cell. The number of divisions used and the method of applying the force (hydraulic or servomotor) are up to the user.
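The sketch below walks through the deadweight substitution sequence from the first bullet above with a hypothetical vessel capacity; it simply prints the actions a technician would take at each 10% increment.

    # A hedged sketch of the deadweight substitution sequence.  The vessel capacity
    # and 10% step are illustrative.

    live_load_capacity = 1000.0            # kg, hypothetical vessel live-load capacity
    step = live_load_capacity / 10         # 10% increments, per the procedure above

    indication = 0.0
    while indication < live_load_capacity:
        indication += step
        # Load calibration weights so the indicator reaches the next 10% point; record it.
        print(f"Weights loaded: record indicator reading at {indication:.0f} kg")
        # Remove the weights and add process material until the indicator shows the same
        # value; the weights are then reapplied for the next increment.
        print(f"Substitute process material until the indicator again reads {indication:.0f} kg")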

If a load cell is causing problems, four tests can be conducted:

  • Mechanical Inspection: Check the load cell for damage. If deformed — bent, stretched, or compressed from its original shape — it must be replaced. Look for distortion or cracks on all metal surfaces. Flexure surfaces must be parallel to each other and perpendicular to both end surfaces. Check the entire length of all cables. Nicked or abraded cables can short out a load cell.
  • Zero Balance (No Load): Residual stresses in the sensing area are the usual cause of shifts in zero balance. These stresses result from overloading the cell or from repeated operation cycles. With a voltmeter, measure the load cell’s output when the cell carries no weight. This reading should be within 0.1% of the specified zero output. If the reading falls outside the zero balance tolerance band, the cell is damaged but possibly correctable (see the sketch after this list).
  • Bridge Resistance: Measure the resistance across each pair of input/output leads. Compare these readings against the load cell’s specifications. Failure of one or more elements—typically from electrical transients or lightning strikes— is the principal cause of out-of-tolerance readings.
  • Resistance to Ground: Connect together all the input, output, sense, and ground leads, then use an ohmmeter to measure the resistance between the load cell body and the connected leads. The reading should be at least 5,000 megohms. If the load cell fails this test, repeat it without the ground wire. If it fails again, the cell needs repair; if it passes, the problem may lie in the cable. Infiltration of moisture may have created a current path between the cell’s electronics and the cell body.
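The sketch below summarizes the pass/fail logic of the three electrical checks described above. The bridge-resistance tolerance and specification values are assumed for illustration; real limits come from the load cell's data sheet.

    # A minimal sketch of the electrical checks (zero balance, bridge resistance,
    # resistance to ground), assuming the readings have already been taken with a
    # voltmeter/ohmmeter.  Specification values are hypothetical.

    def evaluate_load_cell(zero_output_pct, bridge_ohms, spec_ohms,
                           insulation_megohm, bridge_tol_pct=1.0):
        results = {}
        # Zero balance: no-load output should sit within 0.1% of the specified zero output.
        results["zero balance"] = abs(zero_output_pct) <= 0.1
        # Bridge resistance: compare the measured resistance with the specification.
        results["bridge resistance"] = (
            abs(bridge_ohms - spec_ohms) / spec_ohms * 100 <= bridge_tol_pct
        )
        # Resistance to ground: leads-to-body resistance should be at least 5,000 megohms.
        results["resistance to ground"] = insulation_megohm >= 5000
        return results

    checks = evaluate_load_cell(
        zero_output_pct=0.05,     # no-load output, % of rated output
        bridge_ohms=352.0,        # measured across a pair of leads
        spec_ohms=350.0,          # data-sheet value (hypothetical)
        insulation_megohm=6500,   # tied leads vs. cell body
    )
    for test, ok in checks.items():
        print(f"{test}: {'pass' if ok else 'investigate further'}")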

References

  1. Basic Metrology for ISO 9000 Certification, G. M. S. de Silva, Butterworth-Heinemann, 2002.
  2. Calibration: A Technician’s Guide, Mike Cable, Instrumentation, Systems, and Automation Society, 2005.
  3. "Calibration Trends in the 21st Century," Thomas Johnson, Evaluation Engineering, September 2002.
  4. Guide for the Use of the International System of Units (SI), U.S. Department of Commerce, 1995.
  5. An Introduction to Measurement and Calibration, Paul D. W. Campbell, Industrial Press, 1995.
  6. Measurement and Calibration Requirements for Quality Assurance to ISO 9000, Alan S. Morris, Wiley, 1998.
  7. The Metrology Handbook, J. L. Bucher, ASQ Quality Press, 2004.
  8. Traceable Temperatures: An Introduction to Temperature Measurement and Calibration, J. V. Nicholas and D. R. White, Wiley, 2001.

This white paper is published by Flow Control magazine with permission from Omega Engineering Inc. Republication in any form is not permitted without prior written consent from Omega Engineering Inc. (www.omega.com).
