Monday, April 29, 2013

When should you calibrate your test equipment?

Today I'd like to talk about calibration intervals and how often you actually need to calibrate your equipment. If you go by the datasheet, most instruments offer 1-year or maybe 2-year specifications. Is that the recommended cal interval? Why calibrate at all? What the heck is calibration? We'll discuss these questions in today's post. The last three products that I worked on were DMMs, function generators, and counters, so my calibration knowledge is skewed towards those types of instruments.

Technically, calibration is defined (per Wikipedia) as a comparison between measurements: one of known magnitude or correctness made or set with one device, and a second measurement made in as similar a way as possible with a second device. In the test instrument world, this boils down to checking your instrument against a traceable standard.

In a more practical sense, I've always considered calibration as having two parts: verification and adjustment. Verification is checking a measurement against a specification, i.e., taking your results and comparing them to a datasheet spec. Adjustment is the act of changing the performance of your instrument to bring it within specification. You don't need an adjustment if your unit verifies within specifications, but you might want one depending on your situation.
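To make the verification half concrete, here is a minimal sketch of checking a reading against a "% of reading + % of range" style accuracy spec, the form many DMM datasheets use. The numbers below are illustrative assumptions, not actual datasheet values, and the function names are mine.

```python
# Sketch of a verification step: compare a DMM reading against the
# acceptance limits implied by a "% of reading + % of range" spec.
# All numbers here are illustrative, not from any real datasheet.

def spec_limits(nominal, rng, pct_reading, pct_range):
    """Return (low, high) acceptance limits for a nominal applied value."""
    tol = nominal * pct_reading / 100.0 + rng * pct_range / 100.0
    return nominal - tol, nominal + tol

def verify(reading, nominal, rng, pct_reading, pct_range):
    """True if the reading falls within the spec limits."""
    low, high = spec_limits(reading and nominal, rng, pct_reading, pct_range)
    return low <= reading <= high

# Example: 10 V applied from a calibrator on the 10 V range, with a
# hypothetical spec of 0.0035% of reading + 0.0005% of range (±0.4 mV).
print(verify(10.00003, 10.0, 10.0, 0.0035, 0.0005))  # prints True
```

If verification fails, that is when the adjustment step comes into play; if it passes, you can simply record the result and move on.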

We're going to discuss two typical use cases when deciding whether or not to calibrate your instrument:
A) You have the instruments in your control (i.e., on your bench) and absolute accuracy isn't your main concern.
B) The instrument is part of a production system and you rely on its results for manufacturing.

Let's examine the simplest use case first: bench instruments. I'm sure that many of you have equipment on your bench or desk that has not been calibrated for years. It seems to work just fine, and you're probably thinking "why mess with a good thing?" You're probably right. Instruments like a DMM stabilize the longer they have been powered on; the mechanical stresses that contribute to reference drift (like thermal expansion and annealing) settle over time. So how do you determine whether you need to calibrate? In my experience, the accuracy issue shows up when you compare measurements between benches. I measure an output on my bench, but my colleague's measurement is different enough to make us question both of our instruments. At that point it's probably time to calibrate (adjust) one of them.

Let's take it a step further and try to figure out which one to calibrate. The easiest answer is to compare the measurements of both instruments against something known to be more accurate than either of them. If we have a 34401A, we can compare its results against the output of a Fluke 5720A, the calibrator of choice for 6.5-digit DMMs. If you don't have a calibrator available, you might compare the readings to a 3458A. If you don't have one of those, perhaps you can compare against a third DMM that has been calibrated recently. It's all about comparison of measurements in this world: you want to compare against something known to be more accurate than your instrument. That can be tough to find, but you need a guiding light to show you the way, and comparing against something better will get you there.
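Before hunting for a better reference, it's worth checking whether the two benches actually disagree: if both readings could be in spec for the same input, the difference is just the combined tolerance at work. A rough sketch of that check, with hypothetical tolerances rather than datasheet values:

```python
# Hedged sketch: decide whether two instruments genuinely disagree, or
# whether their readings are consistent within their combined accuracy
# specs. The tolerances below are hypothetical, not from a datasheet.

def disagree(reading_a, tol_a, reading_b, tol_b):
    """True if the two readings cannot both be in spec for the same input."""
    return abs(reading_a - reading_b) > (tol_a + tol_b)

# Two meters measuring the same 5 V output, each specified to ±0.5 mV:
# they are 1.2 mV apart, more than the 1.0 mV their specs allow combined.
print(disagree(5.0003, 0.0005, 4.9991, 0.0005))  # prints True
```

If the difference fits inside the combined tolerance, neither instrument is necessarily out of spec and calibration may not buy you anything for that measurement.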

There are bench users who do need a regularly calibrated instrument for their work, typically highly accurate work that will end up in a report or determine a datasheet. In these cases you need to ensure that your measurements are accurate, and calibration is the only way to check your accuracy. In our lab, we do a lot of relative measurements. Calibration affects absolute accuracy but, in most cases, not resolution. Our engineers can do quite a bit of work without calibration since most of their measurements only depend on relative accuracy. For final qualification, we use a calibrated system to determine final accuracy and traceability.

Now let's tackle the more complicated production test use case. In one sense it is simpler: you need to calibrate your instrument on a regular cycle. Calibration is the only way to ensure that your instrument is providing the accuracy it is specified to deliver. The interesting wrinkle is that calibration only confirms the instrument is in specification at the time of that measurement. It does not guarantee that your instrument will stay in spec until the next calibration cycle. The only way to ensure that your instrument keeps performing to spec is to compare results between calibrations. It is entirely possible for an instrument to drift out of specification shortly after calibration, and you wouldn't find out until the next calibration, which would put every measurement made in between into question.

Good instrument manufacturers will have characterized the drift of their references and done their homework on the design to ensure that it performs to specification or better. Less reputable designers might not have put in the time and engineering to characterize their design long term. In either case, a manufacturing mistake could cause a latent failure or drift a few months after the instrument is put into use. Calibration is really the only way to ensure that your instrument is performing to your expectations.

Determining the calibration interval depends on your needs. Most of you will not want to calibrate every 90 days due to the cost. As the user of the product, you need to determine which set of specifications will allow you to test to the accuracy that you need. I recommend setting up a shorter-interval check for yourself to ensure that your instrument has not gone out of specification between calibration cycles. The worst-case scenario is having to recall all of the devices you tested between calibrations because you don't know when the instrument went out of spec.
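One way to run that shorter-interval check is to periodically measure a stable in-house reference and flag the instrument when the readings creep past a guard band tighter than the full spec. The values and the 80% guard band below are assumptions for illustration, not a standard practice you must follow:

```python
# Sketch of a shorter-interval self-check between calibrations: log
# periodic readings of a stable in-house reference and flag the first
# one that exceeds a guard band set inside the full spec limit.
# Reference value, tolerance, and guard band are all illustrative.

REFERENCE_VALUE = 10.0      # nominal value of the in-house reference
SPEC_TOLERANCE = 0.0004     # full datasheet tolerance (hypothetical)
GUARD_BAND = 0.8            # act at 80% of spec to leave some margin

def check_drift(readings):
    """Return the index of the first reading past the guard band, or None."""
    limit = SPEC_TOLERANCE * GUARD_BAND
    for i, r in enumerate(readings):
        if abs(r - REFERENCE_VALUE) > limit:
            return i
    return None

weekly_log = [10.0001, 10.0002, 10.00025, 10.00035]
print(check_drift(weekly_log))  # prints 3: last reading exceeds the band
```

Catching the drift at the guard band, before the instrument is actually out of spec, is what lets you schedule a calibration instead of a recall.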

Calibration is an essential detail of instrumentation. Understanding the term and figuring out what actions you should take will make your instrumentation life easier in the long run.

In my next post I'll discuss some different types of calibration methods including shorter interval testing.

Wednesday, April 3, 2013

Video Review of 33500B Series Waveform Generators

Hi there, I'm David. I'm a new contributor to this blog, and I hope to help Neil update content on this blog fairly regularly. I've been a test engineer for about 13 years now, and I figured that I could help others understand the interesting features, issues, and tribulations that come with test equipment.

For my first post I'm going to direct you to an independent video review of Agilent's 33500B Series Waveform Generators. The 33500B series was introduced in 2012, and people are starting to get their hands dirty with them. This review was published by Watt Circuit Blog, which is run by a couple of engineers in England. In this long-running video review, the blogger uses the 33522B in his demonstration. He covers many aspects of the 33500B series, ranging from unboxing the unit to using it to characterize an IC. A write-up with pictures is published on Element 14. He likes it.

Since it's a long video, I'm going to link directly to a few points that might be of interest to you.

Unboxing the unit - First impressions and a front and rear panel walk-through.
Powering up the unit - Using the front panel to set up a multi-tone frequency-modulated signal.
Adding noise to a signal - Using summing modulation and visualizing the result on a scope.
Replacing a pulse generator - Setting up fast edge times, phase control between two channels, and pulse bursts.
Setting up a frequency sweep - Using a scope to check the frequency response of a filter with the generator's frequency sweep function.
Using a function generator to characterize circuit performance - Testing a level shifter IC with a PRBS signal, characterizing the IC by sweeping frequency and adding noise.
Benchlink SW - Discusses and demonstrates the free Agilent Benchlink software for generating your own waveforms.
Importing a .CSV file - Demonstrates how to import a comma-separated (.csv) file using a USB stick.
LAN connectivity - Using the LAN and web user interface to control the generator.
Summary - Gives a nice summary, discusses Trueform technology and the robustness of the unit, covers upgrades, and mentions that the review only scratches the surface of these generators' features.