One of the major open issues in neural network research is the Network Designing Problem (NDP): find a polynomial-time procedure that produces minimal structures (the minimum intermediate size, thresholds, and synapse weights) of multilayer threshold feedforward networks whose outputs are consistent with given sample sets of input-output data. The NDP includes as a subproblem the Network Training Problem (NTP), in which the intermediate size is given. The NTP has been studied mainly through iterative network training algorithms. This paper, making use of both rate distortion theory from information theory and linear algebra, gives a mathematically rigorous solution to the NDP. On the basis of this solution, it furthermore develops a solution procedure that computes the minimal structure directly from the sample set. The procedure attains exactly the minimum intermediate size, although its computational time complexity can be of nonpolynomial order in the worst case. The paper also presents a polynomial-time shortcut of the procedure for practical use that reaches an approximate minimum intermediate size with a measurable error. When the intermediate size is prespecified, the shortcut also serves as a promising alternative to current network training algorithms for the NTP.
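To make the objects in the abstract concrete, the following sketch shows a one-hidden-layer threshold feedforward network (hard-threshold activations) and a consistency check against a sample set. The weights and thresholds here are hand-chosen illustrative values for the XOR sample set, whose minimum intermediate size is 2; this is not the paper's solution procedure, only an example of the kind of structure it produces.

```python
import numpy as np

def step(x):
    # Hard threshold activation used in threshold networks.
    return (x >= 0).astype(int)

def threshold_net(x, W1, b1, W2, b2):
    """One-hidden-layer threshold feedforward network:
    output = step(W2 @ step(W1 @ x + b1) + b2)."""
    h = step(W1 @ x + b1)          # intermediate (hidden) layer
    return step(W2 @ h + b2)       # output layer

# Sample set: XOR, for which the minimum intermediate size is 2.
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# Hand-chosen synapse weights and thresholds realizing XOR
# with 2 intermediate units (illustrative, not computed by the
# paper's procedure).
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])        # h1 = x1 OR x2, h2 = x1 AND x2
W2 = np.array([[1.0, -1.0]])
b2 = np.array([-0.5])              # y = h1 AND NOT h2

# Consistency with the sample set, as required of a solution to the NDP.
consistent = all(
    threshold_net(np.array(x), W1, b1, W2, b2)[0] == y
    for x, y in samples
)
print(consistent)  # True
```

The consistency check above is exactly the constraint the NDP imposes; the NDP additionally asks that the intermediate size (here, 2) be minimal over all networks satisfying it.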