In this work a fast and efficient training method for block-diagonal recurrent neural networks is proposed. The method modifies and extends the Simulated Annealing RPROP algorithm, originally developed for static models, by taking into consideration the architectural characteristics and the temporal nature of this category of recurrent neural models. The performance of the proposed algorithm is evaluated through a comparative analysis with a series of algorithms and recurrent models.
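To give a concrete flavor of the base algorithm being extended, the following is a minimal sketch of an RPROP-style per-weight update with an annealed noise term in the spirit of SARPROP. The constants (`eta_plus`, `eta_minus`, the noise scale `k`, and the temperature schedule) and the exact placement of the noise are assumptions for illustration, not the paper's method; the paper's contribution is the adaptation of such a scheme to block-diagonal recurrent architectures, which this sketch does not cover.

```python
import numpy as np

def sarprop_step(w, grad, prev_grad, step, epoch,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=1.0,
                 k=0.1, T=0.01, rng=None):
    """One RPROP-style update with simulated-annealing noise (hedged sketch).

    w, grad, prev_grad, step are arrays of the same shape; returns the
    updated (w, step, grad_to_carry) triple for the next iteration.
    """
    rng = np.random.default_rng() if rng is None else rng
    same = grad * prev_grad                 # >0: sign kept, <0: sign flipped
    temp = 2.0 ** (-T * epoch)              # annealing schedule (assumed form)

    # Sign unchanged: grow the step size, capped at step_max.
    step = np.where(same > 0, np.minimum(step * eta_plus, step_max), step)
    # Sign flipped: shrink the step, perturbed by annealed noise (assumed
    # placement), floored at step_min.
    noisy_shrink = np.maximum(
        step * eta_minus + k * rng.random(step.shape) * temp ** 2, step_min)
    step = np.where(same < 0, noisy_shrink, step)

    # Classic RPROP: suppress the update right after a sign flip.
    carried = np.where(same < 0, 0.0, grad)
    w = w - np.sign(carried) * step
    return w, step, carried
```

Applied iteratively, the step sizes grow along directions with a stable gradient sign and shrink (with decaying random perturbation) where the sign oscillates.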