Typical concepts concerning the memorization capability of multilayer neural networks are the statistical capacity and the Vapnik-Chervonenkis (VC) dimension. These are defined differently from each other according to their intended applications. Although several tighter upper bounds on the VC dimension have been proposed in the literature, even if limited to networks with linear threshold elements, upper bounds on the statistical capacity are available only up to the order of magnitude. We argue first that the proposed or ordinary formulation of the upper bound on the statistical capacity depends strongly on, and thus can possibly be expressed by, the number of first-hidden-layer units.