Calibration terms – Coating thickness gauge

The following discussion provides definitions, interpretations, limitations, and practical examples of metrological terms related to the DeFelsko coating thickness gauge. The resources used to prepare this document mainly include technical articles and standards published by international organizations such as SSPC, ISO, ANSI, and ASTM. The aim was to develop a common reference platform for DeFelsko documents, including literature, manuals, technical articles, communications, and web materials.

Type 1 – Pull-off thickness gauge
In a Type 1 pull-off (PosiTest or PosiPen) thickness gauge, a permanent magnet is placed in direct contact with the coated surface. The force required to pull the magnet off the surface is measured and interpreted as the coating thickness displayed on a scale or meter. The force holding the magnet to the surface is inversely proportional to a nonlinear function of the distance between the magnet and the steel, that is, the dry coating thickness: less force is required to remove the magnet from a thicker coating.


Type 2 – Electronic thickness gauge
A Type 2 gauge, such as the PosiTector, uses electronic circuitry to convert a reference signal into coating thickness. Electronic ferrous gauges operate on one of two magnetic principles. Some use a permanent magnet that, when brought near steel, experiences an increase in the magnetic flux density at its pole face. Coating thickness is determined by measuring this change in flux density, which is inversely proportional to the distance between the magnet and the steel substrate. Hall elements and reluctance elements located at the pole face are common means of measuring such flux density changes. Because the response of these elements is temperature dependent, temperature compensation is required.

Other ferrous electronic thickness gauges operate on the principle of electromagnetic induction. A coil containing a soft iron rod is driven by an alternating current, which creates a varying magnetic field at the probe. As with permanent magnets, the flux density in the rod increases when the probe is brought close to the steel substrate. This change is detected by a second coil, whose output is related to the coating thickness. These gauges also require temperature compensation because the coil parameters are temperature dependent.

Characterization is the process by which an instrument is taught to associate the signal received from its probe tip with actual coating thickness values. The result of the characterization process is the calibration curve built into the instrument. Depending on the complexity of the curve, it may also include allowances for other effects, such as ambient temperature.

Each DeFelsko instrument is individually characterized using measurements made with traceable calibration standards that cover the full range of the instrument. It is this feature that enables the DeFelsko instrument to make meaningful measurements directly for most applications.
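The characterization idea, mapping a raw probe signal to thickness through a stored curve, can be sketched as follows. The signal values, thicknesses, and the piecewise-linear lookup (standing in for a real fitted curve with temperature compensation) are all illustrative assumptions, not DeFelsko's actual characterization data:

```python
import numpy as np

# Hypothetical probe signals (arbitrary units, ascending) recorded on
# traceable standards of known thickness during characterization.
signal = np.array([0.19, 0.27, 0.38, 0.52, 0.71, 0.95])
thickness_um = np.array([125.0, 100.0, 75.0, 50.0, 25.0, 0.0])

def signal_to_thickness(s):
    """Look up a thickness estimate from a raw probe signal.

    np.interp needs ascending x values; here thickness falls as the
    signal rises, so the thickness array is listed in reverse order.
    """
    return float(np.interp(s, signal, thickness_um))

print(signal_to_thickness(0.52))   # at a characterized point -> 50.0
```

A real gauge stores the fitted curve in firmware and evaluates it for every reading, which is what allows it to measure meaningfully out of the box.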

Reference standard
A reference standard is a sample of known thickness against which users can verify the accuracy of their gauges. The reference standard is usually a coating thickness standard or a shim. If the contracting parties agree, a portion of the sample with known (or agreed) thickness may be used as the thickness standard for a particular job.


Coating thickness standard
For most instruments, a coating thickness standard is a smooth metal substrate with a non-magnetic (epoxy) coating of known thickness, traceable to a national standard (NIST). The substrate is ferrous (steel) for magnetic gauges and non-ferrous (aluminum) for eddy-current gauges. High-tolerance coating thickness standards are used to characterize and calibrate gauges as part of the manufacturing process. Customers can purchase the same standards for use as calibration standards in a calibration laboratory or as check standards in the field or on the factory floor.

The coating thickness standards used with ultrasonic thickness gauges are solid plastic (polystyrene) blocks machined to a flat, smooth surface. In addition to a known thickness traceable to national standards, these blocks also have a known speed of sound.

Calibration standards are offered as accessories to help meet customers' growing ISO/QS-9000 and internal quality-control requirements. Many customers find it more practical to calibrate their gauges in-house rather than use DeFelsko's calibration service. For their convenience, sets of calibration standards with nominal values covering the full range of each DeFelsko gauge are available. All standards come with calibration certificates showing traceability to NIST.

Shims
Shims are flat, non-magnetic (plastic) pieces of known thickness. Although a shim can usually conform to the shape of the substrate being tested, its thickness tolerance is wider than that of a coating thickness standard. Therefore, when calibrating and adjusting a Type 2 (electronic) gauge with a shim, the shim's tolerance must be combined with the gauge's tolerance when determining the overall measurement accuracy.

Use of shims with Type 1 (magnetic pull-off) thickness gauges is not recommended. Shims are often quite stiff and curved and will not lie perfectly flat even on a smooth steel test surface. Near the pull-off point of a mechanical gauge, the shim often springs away from the steel surface, causing the magnet to lift prematurely and produce false readings.

Calibration is the controlled and documented process of measuring traceable calibration standards and verifying that the results are within the stated accuracy of the gauge. Calibration is typically performed by the gauge manufacturer or by a qualified laboratory in a controlled environment using a documented process. The coating thickness standards used in the calibration must be such that the combined uncertainty of the resulting measurements is less than the stated accuracy of the gauge.

Calibration interval
The calibration interval is the stated time period between instrument recalibrations. In accordance with ISO 17025, DeFelsko does not include calibration intervals on the calibration certificates issued for our PosiPen, PosiTest, PosiTector 6000 and PosiTector 100 coating thickness gauges.

For customers seeking guidance in setting their own calibration intervals, we share the following experience. Factors unrelated to shelf life matter most in determining a calibration interval: chiefly frequency of use, the application involved, and the degree of care in use, handling, and storage. For example, customers who use a gauge frequently, measure on abrasive surfaces, or treat the gauge roughly (dropping it, failing to replace the probe-tip cover before storage, or routinely tossing it into a toolbox) may require relatively short calibration intervals. Both theoretical analysis and practical experience show that temperature and humidity have very little influence on the gauges. In addition, the manufacturing process is designed to minimize post-calibration changes in gauge performance. Even when drift does occur, it is usually linear and is therefore compensated by the gauge's "zero" function before use.

Although DeFelsko advises customers to establish gauge calibration intervals based on their own experience and work environment, customer feedback suggests one year as a typical starting point. Our experience also suggests that customers who purchase a new instrument can safely use the purchase date as the start of the first calibration interval; because shelf life has so little effect, the actual date on the calibration certificate is of secondary importance.


Certificate of calibration
A calibration certificate is a document that records the actual measurement results and all other relevant information for a successful instrument calibration. DeFelsko includes a calibration certificate, clearly indicating traceability to a national standard, with each new, recalibrated, or repaired instrument.


Traceability
Traceability is the ability to relate the result of a measurement, through an unbroken chain of comparisons, back to a fixed international or national standard that is accepted as correct. The chain typically consists of several intermediate measurement standards, each more accurate and with lower uncertainty than the one that follows it.

Recalibration (recertification)

Recalibration, also known as recertification, is the calibration of an instrument that has been in use. Periodic recalibration is required throughout the life of the instrument because the probe surface is subject to wear, which can affect measurement linearity.

In principle, customers who have traceable thickness reference standards and a copy of the calibration procedure (available from the DeFelsko website) can recalibrate their own gauges. Whether this is practical depends on their own quality system requirements and on their ability to control the recalibration conditions.

Verification (calibration verification)

Calibration verification is an accuracy check performed by the instrument user with known reference standards covering the expected coating thickness range. Its purpose is to verify that the gauge is still operating as expected.

Verification is usually performed to guard against measuring with an inaccurate gauge: at the beginning or end of a shift, before critical measurements are taken, after the instrument is dropped or damaged, or whenever a false reading is suspected. The contracting parties may agree in advance on the details and frequency of these accuracy checks if they see fit. If a reading does not agree with the reference standard, all measurements taken since the last accuracy check are suspect. In the event of physical damage, wear, heavy use, or after the established calibration interval has elapsed, the gauge should be removed from service and returned to the manufacturer for repair or calibration. The use of check standards is not a substitute for regular calibration and verification of the instrument.
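A verification check of this kind amounts to a simple acceptance test: the mean of several readings on a reference standard must fall within the combined tolerance of the standard and the gauge. The worst-case tolerance sum used below is one common convention, and all values are invented for the example:

```python
def verify(readings_um, standard_um, standard_tol_um, gauge_tol_um):
    """Return True if the mean reading on a reference standard falls
    within the combined (worst-case sum) tolerance of standard and gauge.
    """
    limit = standard_tol_um + gauge_tol_um
    mean = sum(readings_um) / len(readings_um)
    return abs(mean - standard_um) <= limit

# Illustrative: three readings on a 50 um standard (+/-0.5 um) taken
# with a +/-1 um gauge; mean error ~0.83 um is within the 1.5 um limit.
print(verify([50.8, 51.2, 50.5], 50.0, 0.5, 1.0))
```

If the check fails, every measurement since the previous successful check is suspect, which is why frequent, quick verifications are worthwhile.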


Calibration adjustment (adjustment, optimization)

A calibration adjustment is the alignment of a gauge's thickness readings (removal of bias) to match those of a known sample, in order to improve the gauge's accuracy on a specific surface or within a specific portion of its measurement range.

In most cases it is only necessary to check zero on the uncoated substrate and begin measuring. However, substrate characteristics (composition, magnetic properties, shape, roughness, edge effects), coating characteristics (composition, surface roughness), and ambient and surface temperatures may require the instrument to be adjusted.

Most Type 2 gauges can be adjusted to a known reference standard, such as a coated part or a shim. However, Type 1 gauges such as the PosiPen and PosiTest have nonlinear scales while their adjustment mechanism is linear, so they should not be adjusted. Instead, users should take a base metal reading (BMR).

For Type 2 gauges where no specific calibration adjustment method is prescribed, a 1-point calibration adjustment is usually performed first. If inaccuracies are still encountered, a 2-point calibration adjustment should be performed.

1-point calibration adjustment

A 1-point calibration adjustment fixes the instrument's calibration curve at a single point established from multiple readings of a known sample or reference standard. If desired, a shim placed on the bare substrate can be used to establish this thickness. The adjustment point can lie anywhere within the instrument's measuring range, but for best results it should be close to the expected coating thickness.

Zeroing is a simple form of 1-point adjustment. It involves measuring an uncoated sample or plate. In a simple zero adjustment, one measurement is taken and the reading is set to zero. In an average zero adjustment, multiple measurements are taken; the gauge then calculates the mean reading and automatically sets that value to zero.
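An average zero adjustment can be sketched as follows; the function names and readings are invented for illustration and do not represent DeFelsko firmware logic:

```python
def zero_offset(uncoated_readings_um):
    """Average-zero adjustment: average several readings taken on the
    bare substrate and use that mean as the offset to subtract."""
    return sum(uncoated_readings_um) / len(uncoated_readings_um)

# Hypothetical: the gauge reads about 1 um on bare steel.
offset = zero_offset([1.0, 1.5, 0.5])

def adjusted(reading_um):
    """Apply the stored zero offset to a raw reading."""
    return reading_um - offset

print(adjusted(51.0))   # a 51 um raw reading reports 50.0 after zeroing
```

Averaging several readings rather than relying on a single one reduces the influence of probe placement and surface variation on the stored zero point.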

2-point calibration adjustment

A 2-point calibration adjustment is similar to the 1-point adjustment, except that the instrument's calibration curve is fixed at two known points, each established from multiple readings of a known sample or reference standard. The two thicknesses must lie within the instrument's measuring range and are usually chosen to bracket the expected coating thickness. One advantage of the PosiTector 6000 is its accuracy across its entire measurement range, which generally allows zero (uncoated) to serve as one of the two points.
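A 2-point adjustment amounts to fitting a gain and an offset through the two known points. The simple linear model and all readings below are illustrative assumptions; actual gauge firmware may apply the correction differently:

```python
def two_point_adjust(raw_lo, true_lo, raw_hi, true_hi):
    """Build a linear correction from two known points, e.g. zero on
    the uncoated substrate and a traceable standard near the expected
    coating thickness. Returns a function that corrects raw readings."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# Hypothetical: gauge reads 1.0 um on bare steel (true 0) and 102.0 um
# on a traceable 100 um standard.
correct = two_point_adjust(1.0, 0.0, 102.0, 100.0)
print(round(correct(52.0), 2))   # corrected mid-range reading
```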


Base metal reading

A base metal reading (BMR) is a zeroing technique used with Type 1 (magnetic pull-off) gauges on rough surfaces. The adjustment of a Type 1 gauge is linear, but its scale is nonlinear; the gauge should therefore not be adjusted to read zero on the bare substrate. Instead, a BMR should be determined on an uncoated area and subtracted from readings taken on the coated part. The BMR is a representative value (average) of multiple measurements taken at several locations on the bare substrate.
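The BMR procedure, averaging bare-substrate readings and subtracting the result from coated readings, can be sketched as follows (all values are illustrative):

```python
def base_metal_reading(bare_readings_um):
    """BMR: average of several readings taken at different locations
    on the bare, roughened substrate."""
    return sum(bare_readings_um) / len(bare_readings_um)

def coating_thickness(coated_reading_um, bmr_um):
    """Thickness above the profile peaks = coated reading minus BMR."""
    return coated_reading_um - bmr_um

# Hypothetical: blast profile makes bare steel read ~12 um on average.
bmr = base_metal_reading([10.0, 14.0, 12.0, 13.0, 11.0])
print(coating_thickness(87.0, bmr))   # -> 75.0 um above the peaks
```

Taking readings at several locations matters because the roughness, and hence the apparent reading on bare steel, varies across the surface.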

Surface roughness

If the steel surface is smooth and flat, the surface itself is the effective magnetic surface. If the steel has been roughened, for example by abrasive blasting, the "apparent" or effective magnetic surface sensed by the gauge is an imaginary plane lying between the peaks and valleys of the surface profile. The gauge reads the thickness above this imaginary magnetic plane. With a Type 1 gauge, the coating thickness above the peaks is obtained by subtracting the base metal reading; with a properly adjusted Type 2 gauge, the reading indicates the coating thickness directly.
