
Test indicator calibration

The fast, economical and accurate way to calibrate a quantity of test indicators is to invest in a Dial Indicator Calibrator with the Test Indicator attachment. These mechanical devices are available in inch or metric models from several manufacturers. In effect, they are a micrometer head with a large 3.5" diameter dial, .00005" accuracy and a 0-1" range. The test indicator is positioned above the spindle using the test indicator attachment; the micrometer head is then rotated and the readings are compared. The unit itself must be regularly calibrated by a calibration lab to maintain traceability. Ideally, readings should be taken at every numeral printed on the test indicator dial, or as your quality manual requires.
If you need to calibrate large quantities of analog and/or digital indicators, you may want to invest in the electronic i-Checker, which connects to a computer and generates inspection certificates. E-mail us for information on this rather costly apparatus ($8,900.00 without the computer).
Test indicators can also be calibrated on a surface plate using certified gage blocks. The indicator is securely fastened to a stand and the contact point is brought into contact with a gage block of a given size. For most manufacturers, the contact point must be parallel with the surface of the block; Interapid test indicators are an exception and should be held at approximately a 12-degree angle. The gage block can then be removed and replaced a number of times to check for repeatability. Be certain that discrepancies in repeatability are not due to poorly tightened clamps, flimsy stands or other factors. Usually one quarter of a graduation of repeatability is allowable, but check the manufacturer's calibration specs for your particular model.
Errors in repeatability indicate a need for cleaning and, possibly, repair. Do not attempt this without experience.
Accuracy in travel is checked by replacing the gage block with a larger one. Very small size increments are required: ideally you would check the travel at every half revolution, or better. During this procedure, be certain that the gage blocks are properly wrung to each other and to the surface. In general, accuracy should not vary more than one graduation per dial revolution on .0005" indicators.
If an incremental error occurs - one which increases regularly over the entire travel - then the contact point is the wrong length or the angle of the point relative to the surface is incorrect. First verify that the correct length point is being used. You can also make small adjustments by changing the contact point angle: make repeated calibration attempts at varying angles until you find one which gives correct results. Obviously, it will then be necessary to recreate this same angle when the indicator is used in actual test situations. Some indicators (Girodtast, for example) allow you to make small adjustments in length with a set screw.
One final method requires a certified height master. This takes the place of gage blocks. The one we use has an accuracy of .00002". The test indicator is firmly fastened to a test stand and the contact point is positioned (at the proper angle) over one of the height master's test surfaces. Comparison readings are now taken at half-revolution intervals - or better - in both directions.
A note on cosine error (for test indicators other than Interapid models): if the contact point cannot be kept parallel to the work surface, you will have to make a mathematical adjustment to the dial reading.
Contact point angle    Correction factor
10°                    reading times 0.98
15°                    reading times 0.97
20°                    reading times 0.94
30°                    reading times 0.87
40°                    reading times 0.77
50°                    reading times 0.64
60°                    reading times 0.50
From this chart you will notice that a contact point held at a 60-degree angle results in one-half the dial reading. Once you determine the angle, simply multiply the dial reading by the corresponding correction factor.
For example, an indicator reading of .0085" at an angle of 30-degrees is equivalent to
.0085" x .87 = .0074"
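The correction factors in the chart are simply the cosine of the contact point angle rounded to two places, so any angle can be handled with a short calculation. A minimal sketch in Python (the function name `corrected_reading` is illustrative, not from any manufacturer's software):

```python
import math

def corrected_reading(dial_reading, angle_degrees):
    """Apply the cosine correction for a tilted contact point."""
    return dial_reading * math.cos(math.radians(angle_degrees))

# The chart values above are cos(angle) rounded to two places:
for angle in (10, 15, 20, 30, 40, 50, 60):
    print(f"{angle:2d} deg  factor = {math.cos(math.radians(angle)):.2f}")

# Worked example from the text: a .0085" reading at a 30-degree angle
print(f'{corrected_reading(0.0085, 30):.4f}"')  # prints 0.0074"
```

Using the exact cosine rather than the rounded chart factor gives the same result here to four decimal places (.0085" x 0.8660 = .0074").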