SIAM Journal on Scientific Computing

Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part I: Seeking one eigenvalue

Abstract

Large, sparse, Hermitian eigenvalue problems are still some of the most computationally challenging tasks. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has surfaced as a clear winner. In this research we approach the eigenproblem from the nonlinear perspective, which helps us develop two nearly optimal methods. The first extends the recent Jacobi-Davidson conjugate gradient (JDCG) method to JDQMR, improving robustness and efficiency. The second method, generalized-Davidson+1 (GD+1), utilizes the locally optimal conjugate gradient recurrence as a restarting technique to achieve almost optimal convergence. We describe both methods within a unifying framework and provide theoretical justification for their near optimality. A choice between the most efficient of the two can be made at runtime. Our extensive experiments confirm the robustness, the near optimality, and the efficiency of our multimethod over other state-of-the-art methods.
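The locally optimal recurrence that the abstract refers to builds, at every step, the Rayleigh-Ritz minimizer over the three-dimensional subspace spanned by the current iterate, the preconditioned residual, and the previous iterate. The sketch below is not the authors' implementation; it is a minimal single-vector NumPy illustration of that recurrence for seeking one (smallest) eigenvalue, with the matrix A, the optional preconditioner M, the starting vector x0, and the tolerance all chosen as placeholders.

    import numpy as np

    def lopcg_smallest(A, x0, M=None, tol=1e-8, maxiter=500):
        """Smallest eigenpair of Hermitian A via the locally optimal
        preconditioned recurrence (single-vector, LOBPCG-style sketch)."""
        apply_prec = (lambda r: r) if M is None else (lambda r: M @ r)
        x = x0 / np.linalg.norm(x0)
        x_prev = None
        for _ in range(maxiter):
            Ax = A @ x
            theta = np.real(np.vdot(x, Ax))          # Rayleigh quotient
            r = Ax - theta * x                       # eigenresidual
            if np.linalg.norm(r) <= tol * np.linalg.norm(Ax):
                break
            w = apply_prec(r)                        # preconditioned residual
            # Locally optimal subspace: current iterate, preconditioned
            # residual, and the previous iterate (the "+1" direction).
            basis = [x, w] if x_prev is None else [x, w, x_prev]
            V, _ = np.linalg.qr(np.column_stack(basis))
            H = V.conj().T @ (A @ V)                 # Rayleigh-Ritz projection
            evals, evecs = np.linalg.eigh((H + H.conj().T) / 2)
            y = evecs[:, 0]                          # smallest Ritz pair
            x_prev, x = x, V @ y
            x /= np.linalg.norm(x)
        return theta, x

In GD+1, as described in the abstract, this recurrence is not run on its own: the previous approximate eigenvector is carried into the restarted generalized Davidson basis, which is what lets the method retain nearly optimal convergence under a fixed memory budget.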