In this paper we propose feedforward neural networks (NNs for short) that tolerate multiple-valued stuck-at faults of connection weights. To improve fault tolerance against faults with small false absolute values, we employ an activation function with a relatively gentle gradient in the last layer and steepen the gradient of the activation function in the intermediate layer. For faults with large false absolute values, a function acting as a filter suppresses their influence by clamping the products of inputs and faulty weights to allowable values. Experimental results show that our NN outperforms, in both fault tolerance and learning time, other NNs based on approaches such as fault injection and forcible weight limitation.
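The mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the gain values, the clipping limit, and the function names (`filtered_sum`, `forward`) are assumptions introduced for illustration. It shows a sigmoid whose slope differs per layer, and a filter that clips each input-weight product so a weight stuck at a large false value cannot dominate a unit's sum.

```python
import numpy as np

def sigmoid(x, gain):
    # Sigmoid with adjustable slope; a larger gain gives a steeper gradient.
    return 1.0 / (1.0 + np.exp(-gain * x))

def filtered_sum(x, W, limit):
    # Clip each input-weight product to [-limit, limit] before summing,
    # so a stuck-at fault with a large false absolute value is inhibited.
    return np.clip(W * x, -limit, limit).sum(axis=1)

def forward(x, W_hid, W_out, hid_gain=4.0, out_gain=1.0, limit=2.0):
    # Intermediate layer: steep activation (large gain), filtered products.
    h = sigmoid(filtered_sum(x, W_hid, limit), hid_gain)
    # Last layer: gentle activation (small gain) to tolerate small-value faults.
    return sigmoid(filtered_sum(h, W_out, limit), out_gain)
```

Because every product is clamped before summation, a single faulty weight can shift a unit's net input by at most `2 * limit`, which bounds the fault's effect regardless of the false value's magnitude.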