The need for high-current transducers featuring wide bandwidth and high insulation levels is becoming ever more pressing in the electric power network, as current distortion increases due to the proliferation of non-linear, time-variant loads. Hall-effect current transducers offer a good compromise between cost and metrological performance. On the other hand, they may introduce an unacceptable non-linearity error when the input current occupies only a small portion of the transducer's full-scale range, and their gain drifts considerably with temperature. This paper proposes a simple method for reducing the non-linearity error when the input signal is an unbiased sinewave, and for automatically calibrating the gain. Experimental results showing the effectiveness of the method are also reported.
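The abstract does not detail the calibration procedure itself; as a minimal illustrative sketch only (not the paper's method), automatic gain calibration can be pictured as injecting a known reference sinewave current and comparing the RMS of the transducer output against the reference RMS. The function names `rms` and `calibrate_gain` and all numerical values below are hypothetical:

```python
import math

def rms(samples):
    """Root-mean-square of a sampled waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def calibrate_gain(ref_amplitude_a, measured_samples):
    """Estimate the transducer gain (V/A) from a known unbiased
    reference sinewave of amplitude ref_amplitude_a, by comparing
    the measured output RMS against the reference current RMS.
    (Hypothetical procedure, for illustration only.)"""
    ref_rms = ref_amplitude_a / math.sqrt(2)  # RMS of an unbiased sinewave
    return rms(measured_samples) / ref_rms

# Simulated example: a 10 A reference sinewave read by a transducer
# whose true gain has drifted to 0.104 V/A (nominal 0.100 V/A).
true_gain = 0.104
n = 1000
samples = [true_gain * 10.0 * math.sin(2 * math.pi * k / n) for k in range(n)]

estimated_gain = calibrate_gain(10.0, samples)
correction = 0.100 / estimated_gain  # factor that restores the nominal gain
```

Multiplying subsequent readings by `correction` would compensate the simulated temperature drift of the gain; the paper's actual calibration scheme may differ.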