Overview of Instrument Calibration Process
Many people do a field comparison check of two instruments, and call them “calibrated” if they give the same reading. This isn’t calibration. It’s simply a field check. It can show you if there’s a problem, but it can’t show you which meter is right. If both meters are out of calibration by the same amount and in the same direction, it won’t show you anything. Nor will it show you any trending – you won’t know your instrument is headed for an “out of cal” condition. Instrument calibration is a quality assurance process. You know the value of testing electrical equipment, or you wouldn’t have test instrumentation to begin with. Just as electrical equipment needs testing, so do your instruments.
For an effective calibration, the calibration standard must be more accurate than the instrument under test. Most of us have a microwave oven or other appliance that displays the time in hours and minutes. Most of us live in places where we change the clocks at least twice a year, plus again after a power outage. When you set the time on that appliance, what do you use as your reference timepiece? Do you use a clock that displays seconds? You probably set the time on the “digits challenged” appliance when the reference clock is at the “top” of a minute (e.g., zero seconds). A metrology lab follows the same philosophy. They see how closely your “whole minutes” track the correct number of seconds. And they do this at multiple points on the measurement scales.
What Is the Calibration Process?
Instrument measurements need control. In fact, calibration itself can be regarded as a process that establishes control over instruments and builds confidence in their readings.
Calibration is the set of operations that establishes, under specified conditions, the relationship between the values indicated by a measuring instrument or system (or the values represented by a material measure or reference material) and the corresponding values realized by a reference standard.
The calibration process involves checking the operational integrity of test or measuring equipment, or of a measurement standard of unverified accuracy, in order to detect, correlate, report, or eliminate (by adjustment) any deviation in accuracy, capability, or any other required performance. Calibration can be carried out for three possible purposes:
1. Determining whether a particular instrument or standard is within an established tolerance with respect to its deviation from a reference standard.
2. Reporting of deviations in measurements from nominal values.
3. Repairing / adjusting the instrument or standard to bring it back within tolerance.
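The first two purposes amount to comparing instrument readings against a reference standard at several test points and reporting the deviations. A minimal sketch in Python illustrates the idea; the test points, readings, and tolerance value are illustrative assumptions, not figures from any actual standard or procedure:

```python
# Hypothetical calibration check: compare instrument readings against
# reference-standard values at several test points and flag any point
# whose deviation exceeds an assumed acceptance tolerance.

TOLERANCE_PCT = 0.5  # assumed acceptance limit, in percent of reading

def check_calibration(points):
    """points: list of (reference_value, instrument_reading) pairs.
    Returns (reference, reading, deviation_%, within_tolerance) tuples."""
    results = []
    for ref, reading in points:
        deviation_pct = (reading - ref) / ref * 100.0
        results.append((ref, reading, deviation_pct,
                        abs(deviation_pct) <= TOLERANCE_PCT))
    return results

# Example: a DMM checked at three voltage points (illustrative data)
report = check_calibration([(10.000, 10.02), (100.00, 100.6), (250.00, 249.9)])
for ref, reading, dev, ok in report:
    print(f"ref {ref:>8.3f}  read {reading:>8.3f}  dev {dev:+.2f}%  "
          f"{'PASS' if ok else 'OUT OF TOLERANCE'}")
```

In practice a metrology lab checks multiple points across each measurement range (as with the clock analogy above), since an instrument can be within tolerance at one point and out at another.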
Why the Calibration Process Is Essential
* By controlling the process, the relationship between inputs and outputs can be controlled
* To control the process, we need to know its status
* Measurement gives us that status information
* Control of a process can never be better than the measurements made in that process
* The more accurate the data obtained from the process, the more accurately the process can be controlled
Causes of Calibration Problems
What knocks a digital instrument “out of cal”? First, the major components of test instruments (e.g., voltage references, input dividers, current shunts) can simply shift over time. This shifting is minor and usually harmless if you keep a good calibration schedule, and it is typically what calibration finds and corrects.
But suppose you drop a current clamp hard. How do you know that clamp will measure accurately now? You don’t. It may well have gross calibration errors. Similarly, exposing a DMM to an overload can throw it off. Some people think this has little effect, because the inputs are fused or breaker-protected. But those protection devices may not trip on a transient. Also, a large enough voltage input can jump across the input protection device entirely. This is far less likely with higher quality DMMs, which is one reason they are more cost-effective than the less expensive imports.