What is the difference between “accuracy” and “tolerance”?
Accuracy is a specification that describes the precision of an instrument for the purpose of comparing one instrument to another. (Ex: ±0.1% of span)
Tolerance is a function of accuracy. If we think of the accuracy spec as a formula, we can plug in numbers to determine the precision of a specific instrument in terms of its unit of pressure measurement. For example, a 0-1000 PSIG Heise dial-mechanical pressure gauge has an accuracy of ±0.1% of span. If we convert that to a mathematical equation:
Tolerance = 0.001 x span = 0.001 x 1000 = ±1 PSI
Now when reading the gauge, you can be confident that the reading agrees with a more accurate pressure standard within one PSI above or below the indicated value. If the pointer is at midscale (500 PSI, the 12:00 position on the dial), the actual pressure could be anywhere from 499 to 501 PSI.
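The tolerance arithmetic above can be sketched in a few lines of Python. The span and accuracy values are taken from the 0-1000 PSIG example; the function name is ours, not a Heise formula:

```python
def tolerance(accuracy_pct_of_span, span):
    """Tolerance in pressure units from a %-of-span accuracy spec."""
    return (accuracy_pct_of_span / 100.0) * span

span_psi = 1000.0                  # 0-1000 PSIG gauge
tol = tolerance(0.1, span_psi)     # ±0.1% of span -> 1.0 PSI
reading = 500.0                    # pointer at midscale
low, high = reading - tol, reading + tol
print(f"actual pressure is between {low} and {high} PSI")
```

Running this prints a band of 499.0 to 501.0 PSI, matching the worked example.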
What is the difference between a vacuum and an absolute pressure instrument, and when do I select one vs the other?
A vacuum or “straight vacuum” instrument starts at gauge-pressure zero (the 5 o’clock position on a dial gauge) because, before connection, both sides of the elastic element are open to atmosphere. The pointer travels counterclockwise as atmosphere is removed, and the scale reads in ascending negative numbers (the higher the value, the more atmosphere has been removed). Choose a vacuum instrument when you want to know how much atmosphere has been removed.
An absolute instrument starts at one atmosphere. That’s because the atmosphere surrounding the elastic element has been removed (called an evacuated reference), so the elastic element is deflected by the full barometric pressure. Once connected to a vacuum source, the pointer on a dial gauge travels counterclockwise through descending numbers toward zero. When a “complete” vacuum has been achieved (matching the evacuated reference), the pointer indicates zero absolute pressure. An absolute gauge tells you how far you are from a “complete” vacuum, which is especially useful in vacuum chambers.
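The relationship between the two reference points can be sketched as simple arithmetic. The 14.7 PSI figure below is a nominal sea-level barometric pressure used for illustration; actual barometric pressure varies with altitude and weather:

```python
ATM_PSI = 14.7  # nominal sea-level barometric pressure (illustrative)

def absolute_from_gauge(gauge_psi, atm_psi=ATM_PSI):
    """Absolute pressure = gauge pressure + barometric pressure."""
    return gauge_psi + atm_psi

def gauge_from_absolute(abs_psi, atm_psi=ATM_PSI):
    """Gauge pressure = absolute pressure - barometric pressure."""
    return abs_psi - atm_psi

# Open to atmosphere: a gauge instrument reads 0, an absolute one ~14.7
print(absolute_from_gauge(0.0))   # 14.7
# At a "complete" vacuum: absolute reads 0, gauge reads about -14.7
print(gauge_from_absolute(0.0))   # -14.7
```

This is why the absolute pointer starts at one atmosphere and counts down toward zero, while the vacuum pointer starts at zero and counts up in negative numbers.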
What is “terminal point” calibration vs “best fit straight line” (BFSL) or “root sum of the squares” (RSS)?
The “terminal point” method means that every measurement point falls within the tolerance band. BFSL and RSS are statistically derived schemes that average the accuracy of the measurement points but do not guarantee that every point falls within the tolerance band.
All Heise accuracy specs are based on the terminal point method.
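The difference can be illustrated with hypothetical calibration data (the points below are invented for illustration, not Heise data). Terminal point checks raw errors against the applied values; BFSL checks residuals against a least-squares line, which can absorb a zero offset that would fail the terminal-point test:

```python
# Hypothetical (applied, indicated) PSI pairs with a ~1.2 PSI zero offset
points = [(0, 1.2), (250, 251.1), (500, 501.3), (750, 751.2), (1000, 1001.2)]
tol = 1.0  # ±0.1% of a 1000 PSI span

# Terminal point: every point's error vs the applied value must be in band
terminal_ok = all(abs(ind - app) <= tol for app, ind in points)

# BFSL: fit a least-squares line, then check residuals against the band
n = len(points)
sx = sum(a for a, _ in points);   sy = sum(i for _, i in points)
sxx = sum(a * a for a, _ in points); sxy = sum(a * i for a, i in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
bfsl_ok = all(abs(ind - (slope * app + intercept)) <= tol
              for app, ind in points)

print(terminal_ok, bfsl_ok)  # False True
```

Here the instrument fails terminal point (every error is about 1.2 PSI) yet passes BFSL, because the fitted line soaks up the offset. That is why a terminal-point spec is the stricter guarantee.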
Heise is a trusted source for precise pressure measurement instruments around the world. Learn more about how we can help verify your most challenging product requirements where precision matters most.