An approach to optimizing the testing of analog and mixed-signal devices is presented. Once an accurate model has been developed, simple algebraic operations on the model can be used to: select an optimal set of test points that minimizes test effort and maximizes test confidence; estimate the model parameters from measurements made at the selected test points; predict the device response at all candidate test points as a basis for accepting or rejecting units; calculate the accuracy of the parameter estimates and response predictions from the random measurement error; and test the validity of the model online, so that changes in the manufacturing process can be monitored continuously and the model updated accordingly. The authors show how each of these procedures can be performed with simple calls to routines available in both public-domain and commercial linear-algebra software packages. The approach is quite general and has been applied experimentally to measuring the frequency response of an amplifier-attenuator network, to fault diagnosis of a bandpass filter using time-domain measurements, and to efficient linearity tests of A/D and D/A converters.
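The estimation, prediction, and accuracy steps described above reduce to ordinary least squares on a linear model. The following is a minimal sketch of that workflow, assuming a linear model y = Xθ + e and using NumPy's `lstsq` as a stand-in for the linear-algebra library routines the abstract refers to; the model dimensions, data, and test-point selection rule here are hypothetical illustrations, not the authors' actual procedures or test cases.

```python
import numpy as np

# Hypothetical linear device model: y = X @ theta + e, where each row of X
# gives the parameter sensitivities at one candidate test point.
rng = np.random.default_rng(0)
X_all = rng.standard_normal((50, 4))        # 50 candidate test points, 4 parameters
theta_true = np.array([1.0, -0.5, 2.0, 0.3])

# Select a subset of test points. (Here simply the first rows; an optimal
# selection criterion would choose rows that minimize the parameter covariance.)
sel = np.arange(8)
X = X_all[sel]
y = X @ theta_true + 0.01 * rng.standard_normal(len(sel))  # noisy measurements

# Estimate the model parameters with an ordinary least-squares routine.
theta_hat, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)

# Predict the response at every candidate test point for accept/reject decisions.
y_pred = X_all @ theta_hat

# Accuracy of the estimates from the random measurement error:
# Cov(theta_hat) = sigma^2 * (X^T X)^{-1}, with sigma^2 estimated from residuals.
dof = len(sel) - rank
sigma2 = residuals[0] / dof if residuals.size else np.nan
cov_theta = sigma2 * np.linalg.inv(X.T @ X)
print("estimates:", theta_hat)
print("std errors:", np.sqrt(np.diag(cov_theta)))
```

The same covariance matrix drives the test-point selection step: rows of X_all that shrink (Xᵀ X)⁻¹ most yield tighter parameter estimates for the same number of measurements, which is the sense in which a small, well-chosen set of test points can both reduce test effort and raise test confidence.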