To address catastrophic forgetting in artificial neural networks, a simple, effective, and novel solution called neuron-level plasticity control (NPC) is proposed. While learning new tasks, the proposed method preserves existing knowledge by controlling the plasticity of the network at the neuron level rather than at the connection level. NPC consolidates important neurons by evaluating the importance of each neuron and applying a low learning rate to the important ones. An extension of NPC called scheduled NPC (SNPC) is also proposed; it uses training-schedule information to protect important neurons more explicitly. Experimental results on the incremental MNIST (iMNIST) and incremental CIFAR100 (iCIFAR100) datasets show that NPC and SNPC are significantly more effective than connection-level consolidation approaches, with SNPC in particular showing excellent performance on both datasets.
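The core mechanism, per-neuron learning-rate scaling, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `npc_update`, the linear interpolation between learning rates, and the importance values are all assumptions made here for clarity; the paper's actual importance measure and update rule differ in detail.

```python
import numpy as np

def npc_update(W, grad, importance, base_lr=0.1, min_scale=0.01):
    """One gradient step with neuron-level plasticity control (sketch).

    Each output neuron i owns row i of the weight matrix W. Neurons
    judged important receive a lower effective learning rate, so the
    knowledge they encode is preserved while unimportant neurons stay
    plastic for the new task.

    importance: array in [0, 1], one value per output neuron
                (hypothetical values here; the paper defines its own
                importance measure).
    """
    imp = np.clip(importance, 0.0, 1.0)
    # Interpolate: importance 0 -> full base_lr, importance 1 -> min_scale * base_lr.
    per_neuron_lr = base_lr * (1.0 - imp * (1.0 - min_scale))
    # Scale each neuron's gradient row by its own learning rate.
    return W - per_neuron_lr[:, None] * grad

# Two neurons: neuron 0 is important (protected), neuron 1 is plastic.
W = np.ones((2, 3))
grad = np.ones((2, 3))
W_new = npc_update(W, grad, importance=np.array([1.0, 0.0]))
# Neuron 0 barely moves (step 0.001); neuron 1 takes the full 0.1 step.
```

Note the contrast with connection-level consolidation (e.g. per-weight penalties): here the entire incoming weight row of a neuron shares one plasticity level, which is the distinction the abstract draws.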