International Journal for Numerical Methods in Engineering

Fixed-precision randomized low-rank approximation methods for nonlinear model order reduction of large systems


Abstract

Many model order reduction (MOR) methods employ a reduced basis V ∈ R^(m×k) to approximate the state variables. For nonlinear models, V is often computed using the snapshot method. The associated low-rank approximation of the snapshot matrix A ∈ R^(m×n) can become very costly as m and n grow larger. Widely used conventional singular value decomposition methods have an asymptotic time complexity of O(min(mn², m²n)), which often makes them impractical for the reduction of large models with many snapshots. Different methods have been suggested to mitigate this problem, including iterative and incremental approaches. More recently, the use of fast and accurate randomized methods was proposed. However, most work so far has focused on fixed-rank approximations, where the rank k is assumed to be known a priori. In the case of nonlinear MOR, stating a bound on the precision is usually more appropriate. We extend existing research on randomized fixed-precision algorithms and propose a new heuristic for accelerating reduced-basis computation by predicting the rank. Theoretical analysis and numerical results show good performance of the new algorithms, which can be used to compute a reduced basis from large snapshot matrices, up to a given precision ε.
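The fixed-precision setting described in the abstract can be illustrated with a generic blocked randomized QB factorization: the orthonormal basis Q is grown block by block until the Frobenius-norm residual drops below ε·‖A‖_F, so the rank is discovered adaptively rather than fixed a priori. This is a minimal sketch of the general technique, not the paper's specific algorithm; the function name, block size, and seed are illustrative choices.

```python
import numpy as np

def randomized_qb(A, eps, block=10, max_rank=None, seed=0):
    """Adaptive randomized QB factorization (generic sketch).

    Grows an orthonormal basis Q in blocks until
    ||A - Q @ Q.T @ A||_F <= eps * ||A||_F, then returns Q and B = Q.T @ A,
    so that A ≈ Q @ B with Q.shape[1] chosen adaptively.
    """
    m, n = A.shape
    if max_rank is None:
        max_rank = min(m, n)
    rng = np.random.default_rng(seed)
    Q = np.empty((m, 0))
    norm_A = np.linalg.norm(A)   # Frobenius norm of the full matrix
    E = A.copy()                 # residual A - Q @ Q.T @ A
    while Q.shape[1] < max_rank:
        # sketch the residual with a Gaussian test matrix
        Omega = rng.standard_normal((n, block))
        Qi, _ = np.linalg.qr(E @ Omega)
        # re-orthogonalize the new block against the existing basis
        Qi, _ = np.linalg.qr(Qi - Q @ (Q.T @ Qi))
        Q = np.hstack([Q, Qi])
        E = E - Qi @ (Qi.T @ E)
        if np.linalg.norm(E) <= eps * norm_A:
            break
    return Q, Q.T @ A
```

For an exactly low-rank input the loop terminates after the first block that covers the range of A, which is the behavior the fixed-precision formulation is designed to exploit.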
