How to Calibrate a Starrett Micrometer
Calibrating your Starrett micrometer is essential when you are working on very precise parts in a machine shop. A tolerance notes the maximum deviation allowed from a blueprint dimension, and on precision work it can be very small; a dimension of 1.000 inch with a tolerance of plus or minus .001 of an inch, for example, must measure between .999 and 1.001 inches. If your measuring tool is off by more than that amount and you are not aware of it, you may be wasting your time and the shop's money on bad parts. Calibration lets the user know that the Starrett micrometer is accurate and acceptable for proofing parts that come out of the machine.
Check how accurately the micrometer is reading before calibrating. Close the spindle of the Starrett micrometer by turning the thumbwheel clockwise until the face of the spindle touches the anvil of the micrometer. Check the reading on the sleeve and thumbwheel; the line on the sleeve should line up with the zero line on the thumbwheel. Before adjusting anything, check for gaps or debris between the anvil and spindle and wipe the surfaces as necessary.
Adjust the nut on the micrometer using the C-spanner wrench, turning it in either direction until the sleeve line lines up with the zero line on the thumbwheel. Do not force it; if the reading is off by more than .002 of an inch, the micrometer will have to be repaired. Once you have adjusted it to read zero, open the spindle by turning the thumbwheel counterclockwise and close it back to zero three times to make sure that the zero repeats.
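If you like to log your checks, the short Python sketch below mirrors the adjust-or-repair decision described above. The .002-of-an-inch repair limit comes from this article; the function name, variable names and sample readings are illustrative assumptions, not part of any Starrett procedure.

```python
# Minimal sketch of the zero-check decision; the .002 in. repair limit is from
# the article, everything else (names, sample readings) is illustrative.

ADJUSTMENT_LIMIT_IN = 0.002  # zero error beyond this means the micrometer needs repair


def zero_check(readings_at_zero):
    """Classify the readings taken each time the spindle is closed on the anvil."""
    worst_error = max(abs(r) for r in readings_at_zero)
    if worst_error > ADJUSTMENT_LIMIT_IN:
        return "send out for repair"          # do not force the adjusting nut
    if worst_error > 0:
        return "adjust the nut and re-check"  # zero did not repeat on every closing
    return "zero repeats"


# Example: the zero reading drifted by +.0005 in. on one of three closings.
print(zero_check([0.0, 0.0005, 0.0]))  # -> "adjust the nut and re-check"
```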
Use a .25-of-an-inch gage block to calibrate the micrometer. Open the micrometer spindle just past .25 of an inch, slide the block between the spindle and anvil, and close the spindle until it touches the gage block. Look at the reading. If it is dead on at .25 of an inch, move to Step 4; if not, use the C-spanner wrench to adjust the micrometer so that it reads .25 of an inch accurately.
Repeat the measurement of the gage block two more times. If the measurement is off by more than .0005 of an inch more than once, the micrometer cannot be calibrated in this manner and must be repaired. Make sure that the measuring faces of the micrometer and the gage block are clean and free of debris that could cause a false reading.
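As a worked example of the repeatability rule in Steps 3 and 4, this Python sketch counts how many of the three gage-block readings fall outside the .0005-of-an-inch limit given above. The block size and limit are from the article; the readings and the function name are made up for illustration.

```python
# Minimal sketch of the gage-block repeatability check; the .25 in. block size
# and .0005 in. limit are from the article, the sample readings are invented.

NOMINAL_IN = 0.25          # gage block size being measured
REPEAT_LIMIT_IN = 0.0005   # maximum deviation allowed from the block size


def repeatability(readings, nominal=NOMINAL_IN, limit=REPEAT_LIMIT_IN):
    """Return 'repair' if more than one reading misses the block size by over the limit."""
    out_of_tolerance = sum(1 for r in readings if abs(r - nominal) > limit)
    return "repair" if out_of_tolerance > 1 else "acceptable"


print(repeatability([0.2500, 0.2507, 0.2500]))  # one bad reading  -> "acceptable"
print(repeatability([0.2507, 0.2493, 0.2500]))  # two bad readings -> "repair"
```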
Measure .5- and .75-inch gage blocks as in Steps 3 and 4 to ensure that measurements along the entire inch of travel are accurate. A micrometer can sometimes read .25 of an inch accurately but have problems closer to the one-inch mark, often from damage or years without calibration. Properly calibrating a measuring tool includes checks at several points along its entire measuring range.
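To record a full range check like the one in Step 5, a short sketch such as the one below can tabulate the error at each block. The .25, .5 and .75 block sizes come from the article; reusing the .0005-of-an-inch limit as the pass/fail line here, along with the sample readings, is an assumption for illustration only.

```python
# Minimal sketch of a range check with several gage blocks. Block sizes are from
# the article; the sample readings and the reuse of the .0005 in. limit are assumptions.

LIMIT_IN = 0.0005  # assumed pass/fail line at every block

# Hypothetical readings recorded for each block (nominal size -> observed reading).
readings = {0.25: 0.2500, 0.50: 0.5002, 0.75: 0.7507}

for nominal, observed in readings.items():
    error = observed - nominal
    status = "OK" if abs(error) <= LIMIT_IN else "out of tolerance"
    print(f"{nominal:.2f} in. block: error {error:+.4f} in. -> {status}")
```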
Writer
Christian Mullen is a graduate from the University of Central Florida with a bachelor's degree in finance. He has written content articles online since 2009, specializing in financial topics. A professional musician, Mullen also has expert knowledge of the music industry and all of its facets.