How Do I Test and Verify a LabJack is Calibrated?

Is My Calibration Good?

A quick way to check calibration is to take readings from the internal ground channel AIN15, or from any channel jumpered to GND.  Ground is the midpoint of the AIN system, so if the device has drifted out of calibration the error will likely show up at this convenient voltage.  Use Kipling or LJControlPanel to watch the readings as you change Range and Resolution Index (a scripted version of this check is sketched after the list below).  The accuracy specifications are the same for the U6 and T7, and for the -Pro versions of either: LabJack T7 Noise and Resolution Specs

  • You only need to check one channel as they are all multiplexed to the same AIN circuitry.
  • You do want to check each range of interest (±10V, ±1V, ±0.1V and ±0.01V), as each has unique errors.
  • You do want to check both converters on a -Pro, so ResIndex 8 and 9 on a U6-Pro or T7-Pro.
  • This should be done with the device stabilized at room temperature.
  • To avoid confusion, this should be done with no other connections except comm/power.
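For reference, here is a minimal sketch of that ground check scripted against a T7 with the LJM Python library (pip install labjack-ljm).  The U6 uses the older UD driver, so this sketch applies to T-series devices; on a non-Pro T7, drop Resolution Index 9 from the loop.

```python
from labjack import ljm

handle = ljm.openS("T7", "ANY", "ANY")  # first T7 found on any connection

# Check each range of interest; on a T7-Pro also compare both converters
# (Resolution Index 8 uses the 16-bit ADC, 9 uses the 24-bit ADC).
for rng in (10.0, 1.0, 0.1, 0.01):
    for res_index in (8, 9):
        ljm.eWriteName(handle, "AIN15_RANGE", rng)
        ljm.eWriteName(handle, "AIN15_RESOLUTION_INDEX", res_index)
        volts = ljm.eReadName(handle, "AIN15")  # AIN15 is internal GND
        print(f"Range +/-{rng} V, ResIndex {res_index}: {volts * 1e6:+.1f} uV")

ljm.close(handle)
```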

If that ground test shows readings within specifications, you can then proceed to check some other voltages.  Ideally use voltages around 10% and 90% of the range you are testing (e.g. -8 and +8 volts for the ±10 volt range), but a couple of voltages from DAC0 (e.g. +2 volts and +4 volts) are fine also.  Use any stable source to provide the test voltage, use a reference voltmeter (which must be substantially more accurate than the LabJack specs) to measure the actual voltage, and then use Kipling or LJControlPanel to note the voltage measured by the U6/T7.  If all errors are within spec, you (or your calibration lab) can issue a new calibration certificate based on this verification of the current calibration.  A 2-point verification is considered sufficient, as that is the minimum needed to notice a shift in slope or offset.
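Continuing the T7/LJM sketch above, a 2-point check using DAC0 might look like the following.  This assumes DAC0 is jumpered to AIN0 and that a reference meter is also connected to DAC0; the DAC setpoint itself is not accurate enough to serve as the reference.

```python
from labjack import ljm

handle = ljm.openS("T7", "ANY", "ANY")
ljm.eWriteName(handle, "AIN0_RANGE", 10.0)
ljm.eWriteName(handle, "AIN0_RESOLUTION_INDEX", 8)

for setpoint in (2.0, 4.0):
    ljm.eWriteName(handle, "DAC0", setpoint)
    # Record the reference meter's reading of DAC0 at this point; compare
    # the AIN reading below against the meter, not against the setpoint.
    measured = ljm.eReadName(handle, "AIN0")
    print(f"DAC0 set to {setpoint} V -> AIN0 reads {measured:.6f} V")

ljm.close(handle)
```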

Note that verifying / calibrating the smaller AIN ranges can be very difficult, because the noise of many voltage sources is substantial compared to the error band of the U6/T7.  ResolutionIndex can be used to reduce noise, but often additional oversampling and averaging have to be used to determine the average value of the source signal.  Use enough oversampling that the noise (the difference between repeated averaged values) is reduced as needed, but average over the minimum required time period so that you are not folding actual changes in the source signal into the average.  The noise testing applications can be handy for this.
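As a sketch of that oversampling idea, again assuming a T7 and the LJM Python library: take a block of readings, average them, and repeat until successive averages agree to the level you need.  The resolution index of 12 assumes a T7-Pro; use 8 on a non-Pro device.

```python
import statistics
from labjack import ljm

handle = ljm.openS("T7", "ANY", "ANY")
ljm.eWriteName(handle, "AIN0_RANGE", 0.1)
ljm.eWriteName(handle, "AIN0_RESOLUTION_INDEX", 12)  # high-res converter (Pro)

N = 100  # oversampling count; increase until repeated averages agree
readings = [ljm.eReadName(handle, "AIN0") for _ in range(N)]
avg = statistics.fmean(readings)
spread = max(readings) - min(readings)
print(f"average: {avg * 1e6:+.2f} uV, peak-to-peak noise: {spread * 1e6:.2f} uV")

ljm.close(handle)
```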

U3, U12, T4 verification:  Same idea as above, with a couple of details.  For unipolar ranges (e.g. 0-2.4 volts), 0.0 volts might not be a valid input, so you need 2 voltages in the 10% to 90% region of the range.  Every channel on the U12, and the high-voltage channels on the U3 & T4, have per-channel signal conditioning, so each of those channels must be verified individually (see the loop sketched below).
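For example, on a T4 the high-voltage channels are AIN0-AIN3, so a per-channel spot check with the LJM Python library might look like this minimal sketch (apply your test voltage to each channel in turn):

```python
from labjack import ljm

handle = ljm.openS("T4", "ANY", "ANY")

# AIN0-AIN3 are the ±10 V high-voltage channels on the T4, and each has
# its own signal conditioning, so each must be verified individually.
for ch in range(4):
    volts = ljm.eReadName(handle, f"AIN{ch}")
    print(f"AIN{ch}: {volts:+.6f} V")

ljm.close(handle)
```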

U12 calibration:  The U12 is calibrated against its own reference voltage, a 2.5V signal that appears at the CAL terminal.  The end of Section 3.7 of the U12 Datasheet describes how to do a self-calibration in the field.  To call this an absolute calibration, you just need to measure the CAL voltage with a proper reference meter and confirm it is 2.49375 to 2.50625 volts (2.5 volts ±0.25%).

How Often is Calibration Required?

The industry standard for calibration is yearly.  The actual requirement will be dictated by your policies or the specific regulations/requirements you are following, but short of that, yearly can be used as the industry-standard interval.

Do I Even Need to Calibrate the LabJack?

Usually not.  Calibration of the entire system is usually important, but the specific calibration of the LabJack analog I/O is often not needed.  There are usually other errors that are part of the measurement, so the best practice is to do a single calibration of the entire signal chain in-situ, rather than calibrating each part separately.  For example, here are some (but not all) of the error sources that combine when doing a quarter-bridge measurement with a strain gauge:

  • Accuracy of the excitation voltage (output is proportional to excitation).
  • Accuracy specification of the strain gauge.
  • Accuracy of the bridge completion resistors.
  • Accuracy of the math relating strain to voltage.
  • Errors due to resistance of connections and wires.
  • Errors due to mounting of the strain gauge.
  • Accuracy of the LabJack A/D conversion.

Note that only 1 of these 7 prominent error sources is from the LabJack.  You could calibrate this system by individually calibrating each of the 7 and then doing a statistical analysis to combine all those errors, but whenever possible it is much better to calibrate the system end-to-end, which in this case means relating applied force to the voltage read in software.  Say this strain gauge measurement is used to measure force applied to a beam.  A great way to handle this is to apply 2 known forces (0 pounds is one easy force to use), note the 2 associated digitized voltages, and then compute a slope & offset that relate volts to force, applying them in software as sketched below.
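A minimal sketch of that 2-point fit, with hypothetical force/voltage pairs standing in for your own measurements:

```python
# Hypothetical calibration points: (applied force in pounds, digitized volts).
point1 = (0.0, 0.000120)   # 0 lb is an easy first point
point2 = (50.0, 0.002620)  # a known 50 lb load

slope = (point2[0] - point1[0]) / (point2[1] - point1[1])  # pounds per volt
offset = point1[0] - slope * point1[1]

def volts_to_force(volts):
    """Apply the 2-point calibration: force = slope * volts + offset."""
    return slope * volts + offset

print(volts_to_force(0.001370))  # 25.0 lb for this example data
```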

What about the U3 and T4?

The U3 and T4 are calibrated at our facilities to within the specs found in their datasheets, but we do not offer a NIST traceable calibration service for these devices at this time.  A couple of reasons, in addition to the information above:

  • The cost of the calibration service is high compared to the cost of the device, so it would not make sense for most customers.
  • The current high-efficiency process used to manufacture and calibrate these lower-cost devices does not lend itself to cost-efficient generation of NIST traceable certificates.