The Echo State Network (ESN) is a recurrent neural network model of temporal learning, primarily used to approximate dynamical systems from incomplete state observations. The ESN is characterized by a static, stochastically generated recurrent topology that projects partial state input onto a rich dynamic basis. We isolate ESN topology and investigate the use of Next Ascent (NA) local search to find topologies yielding rich dynamic bases. We define richness as the minimization of error when linear regression maps the mass-spring-damper dynamical system onto a basis constructed from partial state inputs. Compared to stochastic topology generation, NA-optimized topologies yield mappings with 50% lower mean squared prediction error. Further, we propose and evaluate an algorithm that constrains ESN topological density and reduces the number of evaluations needed for NA search to converge by 15%.
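To make the setting concrete, the following is a minimal sketch of the baseline pipeline the abstract describes: a fixed, randomly generated reservoir topology driven by partial state input (position only) of a mass-spring-damper system, with a linear readout fit by regression. All parameter values (reservoir size, sparsity, spectral radius, damping constants) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mass-spring-damper: m*x'' + c*x' + k*x = 0, simulated with Euler steps.
# Constants are illustrative, not taken from the paper.
m, c, k, dt = 1.0, 0.1, 1.0, 0.01
x, v = 1.0, 0.0
pos = []
for _ in range(3000):
    a = (-c * v - k * x) / m
    v += a * dt
    x += v * dt
    pos.append(x)
pos = np.array(pos)

# Static, stochastically generated reservoir topology (sparse random
# matrix), rescaled so the spectral radius is below 1 (echo state property).
n = 100
W = rng.standard_normal((n, n)) * (rng.random((n, n)) < 0.1)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, n)

# Drive the reservoir with the partial state (position only) and
# collect the reservoir states as the dynamic basis.
states = np.zeros((len(pos), n))
h = np.zeros(n)
for t, u in enumerate(pos):
    h = np.tanh(W @ h + W_in * u)
    states[t] = h

# Linear readout: ridge regression from reservoir states to the next
# position, discarding an initial washout period.
washout = 200
X, y = states[washout:-1], pos[washout + 1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n), X.T @ y)
mse = np.mean((X @ W_out - y) ** 2)
print(f"one-step prediction MSE: {mse:.2e}")
```

The NA local search studied in the paper would then perturb the nonzero pattern of `W` and keep changes that reduce this readout error; the sketch above corresponds only to the stochastic-generation baseline.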