7 Steps to Ensuring Your Measured Results Are In Spec

April 2, 2019 · Joshua Jablons, Ph.D.


Is Calibrated Measuring Consistent Measuring?

Across our industry, it’s a given that customers, suppliers of materials, and parts manufacturers all use calibrated devices to take measurements. A device is calibrated in order to:

  • Ensure that readings are consistent with other measurements — that is, the device’s measurements are compared with and traceable to a known and accepted standard
  • Determine the accuracy of the device’s readings
  • Establish the reliability of the device

But just because we all use calibrated tools does not guarantee that everyone measuring the same part will get the same results.

How can this be? For starters, even when you are using NIST traceable, calibrated measuring devices, within each device there is a tolerance, indicating accuracy within plus or minus X amount. What’s more, the tool used to calibrate that device — for example, a pin — also has its own tolerance. Add to that the tolerance of the tool used to calibrate the pin … You get the idea.
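To see how these tolerances stack up, here is a minimal Python sketch (all tolerance values are illustrative, not figures from this article) comparing a worst-case sum against a root-sum-square estimate for a three-link calibration chain:

```python
# Hypothetical calibration chain (illustrative values, in inches):
# the measuring device, the pin used to calibrate it, and the master
# gage used to calibrate the pin, each with its own ± tolerance.
chain = [0.0002, 0.00005, 0.00002]

# Worst-case stack-up: assume every link errs in the same direction.
worst_case = sum(chain)

# Root-sum-square stack-up: a common statistical estimate of the
# combined uncertainty, assuming the errors are independent.
rss = sum(t ** 2 for t in chain) ** 0.5

print(f"worst case: ±{worst_case:.5f} in")
print(f"RSS:        ±{rss:.5f} in")
```

The worst-case figure is always the larger of the two, which is why a conservative manufacturer plans to the worst case when deciding how much of the drawing tolerance the measurement chain itself consumes.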

The good news is, any manufacturer worth its salt takes tool tolerances into account when determining how to achieve dimensions that are within your specifications. Tolerances aside, there are other factors — easily controllable ones — that can have an impact on calibrated measuring and whether the end results meet your specs.

Success is in the details.

There are some simple “rules” that can help to ensure that calibrated measuring yields accurate and consistent results — especially with very close tolerances, where a slight discrepancy in measurement can mean the difference between in spec and out of spec.

It may seem elementary, but it bears mentioning:

1. Make sure you and your supplier or manufacturer are using the same type of device to do the measuring.

For example, if a manufacturer verifies a specified ID using optical measurement but you inspect the finished parts using a pin gage, there may be discrepancies in the results. If you are measuring length, will you be using a caliper, a micrometer, or a ruler?

2. Make sure that you and your supplier/manufacturer are using measuring devices that are correctly cross-calibrated.

Otherwise, the two devices may measure a part differently. A Class Z pin gage versus a Class ZZ pin gage will not yield the same results. If you are using a digital micrometer, be sure you are using the same device calibrated in the same units to the same standard, and rounded to the same number of decimal places.

3. If possible, provide your manufacturer with the measuring gage or other device you plan to use to verify the dimensions of your parts.

For example, for a functional test requiring a go/no go gage, you can send a copy of your calibrated pin or screw gage to your vendor. Where more complex testing is required for a high volume of work, your vendor may be willing to purchase the same calibrated measuring device to have on-site for use with your production runs.

4. The method of measurement, not just the device to be used, should also be specified.

Have a pre-production discussion with your vendor and provide detailed instructions on how measurements should be done. For instance:

  • For an ID, at what point of the diameter will you be measuring — the high point? The low point? An average?
  • If you are using a tool such as a handheld digital micrometer, to how many decimal places are you both measuring? How many of those decimal places are valid (i.e., all 0–9 values)? And finally, is the last decimal place rounding (i.e., cycling between 0 and 5)?
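The decimal-place question is not academic: the same part can pass on a three-place display and fail on a four-place one. A short Python sketch, using a hypothetical reading and spec limit:

```python
# Hypothetical spec: upper limit 0.125 in. Illustrative values only.
upper_limit = 0.125

reading_4dp = 0.1254                  # device displaying four places
reading_3dp = round(reading_4dp, 3)   # same part on a three-place device

in_spec_4dp = reading_4dp <= upper_limit  # 0.1254 > 0.125 -> fails
in_spec_3dp = reading_3dp <= upper_limit  # rounds to 0.125 -> passes

print(f"4 places: {reading_4dp} -> in spec? {in_spec_4dp}")
print(f"3 places: {reading_3dp} -> in spec? {in_spec_3dp}")
```

Unless both parties agree up front on the number of decimal places and the rounding rule, the two inspection reports can honestly disagree about the same part.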

5. Make sure all your measurement requirements are included in your drawing — and that the details of your drawing don’t conflict with your measurement method.

For example, if you provide a go/no go gage but your drawing also calls for a certain pitch in a screw part, your manufacturer may have to reconcile parts that pass the go/no go test but, when checked against the drawing specifications, fail the pitch test — or that pass when measured from one point but not from another.

6. Carefully consider what you really need to measure.

Avoid the risks of over-engineering — such as more rejected parts, increased waste, and higher cost — by distinguishing between critical and non-critical dimensions. Ask yourself, do you really need a very tight tolerance on dimension X? Or will any measurement that fits within a highest point and lowest point (± tolerance) suffice?
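The "highest point and lowest point (± tolerance)" acceptance rule above amounts to a simple band check, sketched here in Python with a hypothetical dimension:

```python
def in_spec(measured, nominal, tol):
    """Return True if a measured value falls within nominal ± tol."""
    return nominal - tol <= measured <= nominal + tol

# Hypothetical dimension: 0.500 in nominal, ±0.002 in tolerance.
print(in_spec(0.5015, 0.500, 0.002))  # inside the band
print(in_spec(0.5025, 0.500, 0.002))  # outside the band
```

If any value inside that band is functionally acceptable, specifying a tighter tolerance only adds cost and rejects without adding quality.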

Equally important, in the quoting and specification stage, be sure to identify for your vendor your truly critical dimensions and how they will be measured, so you will all be on the same page from day one.

7. Where practical for your inspection needs, use a functional test.

A simple go/no go test with a pin or screw gage may be a perfectly acceptable measure of quality and consistency, and be more cost-effective than specifying a particular tolerance.

What can we do if our measurements still don’t agree?

When there is a discrepancy between a manufacturer's measurement results and what you get when you inspect the finished parts, often the issue can be resolved simply by re-measuring. For instance, a measurement error — such as parts being turned as they were initially measured, causing their diameters to read out of round — might be detected when re-measuring finds those same parts to all be within spec.

If the source of a measurement discrepancy cannot be identified, a formal correlation study can be performed, using the best practices of Gage R&R methodologies. This generally involves taking a sample of 30 numbered pieces and having three individuals at the manufacturer’s end measure and record each piece’s dimensions, and then repeating the process with the same sample at the customer’s end.
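A minimal sketch of the comparison step in such a study, assuming each party records three operators' readings per numbered piece (the data, threshold, and two-piece sample below are illustrative; a full AIAG-style Gage R&R also includes repeat trials and variance-component analysis):

```python
def piece_means(readings):
    """readings: {piece_id: [op1, op2, op3]} -> {piece_id: mean}."""
    return {pid: sum(vals) / len(vals) for pid, vals in readings.items()}

def flag_offsets(mfr, cust, threshold):
    """Return ids of pieces whose manufacturer and customer means
    differ by more than an agreed threshold."""
    m, c = piece_means(mfr), piece_means(cust)
    return sorted(pid for pid in m if abs(m[pid] - c[pid]) > threshold)

# Hypothetical readings, in inches, for two numbered pieces.
mfr_data  = {1: [0.5001, 0.5002, 0.5001], 2: [0.4999, 0.5000, 0.5001]}
cust_data = {1: [0.5001, 0.5001, 0.5002], 2: [0.5005, 0.5006, 0.5004]}

flagged = flag_offsets(mfr_data, cust_data, threshold=0.0002)
print(flagged)  # piece 2 shows a systematic offset between the parties
```

A consistent offset on flagged pieces points to a device or method difference between the two sites, while random disagreement points to repeatability problems.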

If a correlation study does not reveal a measurement device problem or method error as the cause of the discrepancy, the manufacturer may need to cut the parts at a closer tolerance than what the drawing specifies, in order to achieve the required results according to the customer's calibrated measuring device and method.

Fortunately, at Metal Cutting Corporation we find that the need to perform a correlation study rarely arises. By keeping tight control over tolerances across the production and inspection processes and, as much as possible, by manufacturing every part to its nominal dimension, we are able to reliably achieve measurements that are within specifications.

For your part, as a customer you can also help to ensure success by following the rules we set out above. By working up front with your supplier or parts manufacturer to define how the dimensions of your parts should be measured and agree on the calibrated device and measurement procedure to be used, you can greatly improve the odds that your parts will be in spec.

