The (k,s)-perceptrons partition the input space {0, ..., k-1}^n into s+1 regions using s parallel hyperplanes. This paper examines their learning abilities. The previously studied homogeneous (k, k-1)-perceptron learning algorithm is generalized to the permutably homogeneous (k,s)-perceptron learning algorithm, which has a guaranteed convergence property. We also introduce a powerful learning method that learns any permutably homogeneously separable k-valued logic function given as input.
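The partition described above can be sketched as follows: a single weight vector defines a family of s parallel hyperplanes via s ordered thresholds, and an input's region is the number of thresholds its activation exceeds. This is a minimal illustration of the geometry only; the function name and the particular weights/thresholds are assumptions for the example, not taken from the paper.

```python
import numpy as np

def ks_perceptron_region(x, w, thresholds):
    """Map input x to one of s+1 regions cut out by the s parallel
    hyperplanes w . x = t_1 < ... < t_s.
    (Illustrative helper, not the paper's algorithm.)"""
    activation = float(np.dot(w, x))
    # Region index = number of thresholds the activation meets or exceeds.
    return int(np.sum(activation >= np.asarray(thresholds, dtype=float)))

# Example: k = 3, n = 2, inputs drawn from {0, 1, 2}^2,
# s = 2 hyperplanes giving 3 regions (thresholds chosen arbitrarily).
w = np.array([1.0, 1.0])
ts = [1.0, 3.0]
print(ks_perceptron_region(np.array([0, 0]), w, ts))  # activation 0 -> region 0
print(ks_perceptron_region(np.array([1, 1]), w, ts))  # activation 2 -> region 1
print(ks_perceptron_region(np.array([2, 2]), w, ts))  # activation 4 -> region 2
```

Note that with s = k-1 and suitably placed thresholds, the region index itself serves as a k-valued output, which is the setting of the homogeneous (k, k-1)-perceptron mentioned above.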