This thesis introduces the implementation of mixed-signal building blocks of an artificial neural network, namely the neuron and the synaptic multiplier. It also investigates the nonlinear dynamic behavior of a single artificial neuron and presents a Distributed Arithmetic (DA)-based Finite Impulse Response (FIR) filter. All of the introduced structures are designed with custom layouts.

A novel VLSI implementation of a reconfigurable neuron is proposed, based on a minimum-selection operator realized with a winner-take-all circuit. The neuron approximates the sigmoid activation function using piecewise-linear approximation and achieves adaptability by exploiting the body effect of PMOS transistors. The structure covers a variety of activation functions, such as rectified linear, hard-limit, and sigmoid functions of different precision, which aims to improve the generalization ability of neural networks.

An area- and power-efficient synaptic multiplier is proposed that combines digital gates with weighted current mirrors. A 4-3-2 neural network built from the modular synapse-neuron building blocks is successfully tested for pattern recognition. The proposed artificial neural network addresses area efficiency, given the inevitable growth in the size of current networks.

Moreover, the nonlinear behavior of a single sigmoidal neuron is investigated to discuss its oscillatory behavior and its possible applications in future generations of oscillators.

The proposed FIR filter targets an efficient VLSI implementation based on distributed arithmetic. There is a trade-off between the computational efficiency of DA-based processing and the area efficiency of multiply-and-accumulate (MAC)-based approaches. The proposed filter reduces the area required for a DA-based implementation by employing a mixed-signal approach.
An 8-bit, 16-tap FIR filter is designed and successfully tested as a band-pass filter (BPF) at 10 MHz and a low-pass filter (LPF) at 48 kHz.
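The DA technique referred to above can be illustrated with a minimal software sketch. Instead of multiplying each coefficient by each sample (the MAC form), DA precomputes a lookup table of all partial coefficient sums and accumulates one LUT lookup per input bit-plane using only shifts and adds. The function names, tap count, and sample values below are illustrative, not taken from the thesis; the sketch assumes unsigned inputs for simplicity.

```python
def mac_fir(h, x):
    """Direct multiply-and-accumulate reference: y = sum(h[k] * x[k])."""
    return sum(hk * xk for hk, xk in zip(h, x))

def da_fir(h, x, bits=8):
    """Distributed-arithmetic form of the same inner product.

    Precomputes a LUT of all 2**K partial tap sums, then accumulates
    one LUT lookup per bit-plane, weighted by 2**b (shift-add only).
    Assumes unsigned `bits`-bit input samples.
    """
    K = len(h)
    # LUT[addr] = sum of h[k] over the set bits k of addr
    lut = [sum(h[k] for k in range(K) if (addr >> k) & 1)
           for addr in range(1 << K)]
    acc = 0
    for b in range(bits):
        # Address is formed from bit b of every input sample
        addr = sum(((x[k] >> b) & 1) << k for k in range(K))
        acc += lut[addr] << b  # weight this bit-plane by 2**b
    return acc

h = [3, -1, 4, 2]     # example 4-tap coefficients
x = [17, 250, 3, 99]  # example 8-bit input samples
assert da_fir(h, x) == mac_fir(h, x)
```

The trade-off mentioned above is visible here: the LUT grows as 2^K with the tap count K, which is the area cost a hardware DA filter must manage (e.g. by partitioning taps across smaller LUTs), while the per-output work drops to one lookup and add per input bit.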