Neural networks and handwritten signature verification.

Abstract

In this thesis, various neural networks are trained to detect true signatures of several subjects and to reject forgeries of these same signatures. The forgeries presented to the network are unpracticed, nonprofessional, casual forgeries. Casual forgeries are by far the most prevalent type of forgery, resulting in large monetary losses nationwide.

A small number of training signatures is required to teach a neural network to perform the signature classification. Forgeries are required in the training set, but it is shown that true signatures of other subjects suffice as forgeries. In this way, actual forgeries need not be collected to train a network. The trained neural networks are tested on new signatures, resulting in an error rate of 3% rejection of true signatures as forgeries and 3% acceptance of forgeries as true signatures.

The training sets for the signature verification networks are small, finite sets. The dynamic behavior of a neural network trained on an infinite input set is well known, while it is relatively unknown for finite training sets. New theoretical results are presented here describing the behavior of a neural network trained on a finite-size training set. Under special conditions, the optimal weights of a neural network are equivalent to the optimal weights of the synthetic discriminant function, another signal processing tool for performing classification.

The backpropagation learning algorithm for neural networks is examined in greater detail to reveal limits on the learning parameters. The mean square error performance surface is examined for a single nonlinear neuron and shown to be similar to the performance surface of a single linear neuron, although there are choices of inputs and outputs that give rise to a performance surface very different from that of a single linear neuron. Lastly, an alternative to the backpropagation learning algorithm is presented which uses a step output function for binary classification. This modified algorithm is applied to the exclusive-or problem, which tests the network's ability to nonlinearly partition the input space. The modified algorithm learns the exclusive-or classification much faster and with fewer difficulties than the conventional backpropagation algorithm.
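The exclusive-or comparison at the end of the abstract can likewise be sketched. The snippet below trains a 2-2-1 sigmoid network on XOR with conventional backpropagation; the architecture, learning rate, and initialization are assumptions chosen for brevity, and the thesis's modified algorithm, which substitutes a step output function for binary classification, is not reproduced here.

```python
# Minimal sketch of conventional backpropagation on the exclusive-or problem,
# using an assumed 2-2-1 sigmoid network trained on mean square error.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR training set: inputs and target outputs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

# Small random weights for the hidden (2->2) and output (2->1) layers, with biases.
W1, b1 = rng.normal(0, 1, (2, 2)), np.zeros(2)
W2, b2 = rng.normal(0, 1, (2, 1)), np.zeros(1)

eta = 0.5  # learning rate (an assumed value)
for epoch in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)          # hidden activations, shape (4, 2)
    Y = sigmoid(H @ W2 + b2)          # network outputs, shape (4, 1)

    # Backward pass: deltas for the mean square error.
    dY = (Y - T) * Y * (1 - Y)        # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)    # hidden-layer delta

    # Gradient-descent weight updates.
    W2 -= eta * H.T @ dY
    b2 -= eta * dY.sum(axis=0)
    W1 -= eta * X.T @ dH
    b1 -= eta * dH.sum(axis=0)

# Threshold the trained outputs to obtain the binary XOR classification.
print((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel())
```

Depending on the initial weights, plain backpropagation on this problem can stall in a flat region of the error surface, which is the kind of difficulty the abstract reports the modified step-output algorithm reduces.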
