IMA Journal of Numerical Analysis

Parametric PDEs: sparse or low-rank approximations?



Abstract

We consider adaptive approximations of the parameter-to-solution map for elliptic operator equations depending on a large or infinite number of parameters, comparing approximation strategies of different degrees of nonlinearity: sparse polynomial expansions, general low-rank approximations separating spatial and parametric variables, and hierarchical tensor decompositions separating all variables. We describe corresponding adaptive algorithms based on a common generic template and show their near-optimality with respect to natural approximability assumptions for each type of approximation. A central ingredient in the resulting bounds for the total computational complexity is a new operator compression result in the case of infinitely many parameters. We conclude with a comparison of the complexity estimates based on the actual approximability properties of classes of parametric model problems, which shows that the computational costs of optimized low-rank expansions can be significantly lower or higher than those of sparse polynomial expansions, depending on the particular type of parametric problem.
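To fix notation, a minimal sketch of the kind of parametric model problem and the two approximation formats being compared might look as follows; the affine diffusion coefficient below is an assumed illustration, not a formula taken from this abstract.

% Illustrative sketch (assumed model problem): an elliptic equation whose
% coefficient depends affinely on a large or infinite number of parameters y_j.
\[
  -\nabla \cdot \bigl(a(y)\,\nabla u(x,y)\bigr) = f(x),
  \qquad
  a(y) = \bar a(x) + \sum_{j \ge 1} y_j\,\psi_j(x),
  \quad y = (y_j)_{j \ge 1} \in [-1,1]^{\mathbb{N}}.
\]
% Sparse polynomial expansion: a sparse index set \Lambda and tensorized
% polynomials P_\nu in the parametric variables,
\[
  u(x,y) \approx \sum_{\nu \in \Lambda} u_\nu(x)\, P_\nu(y),
  \qquad
  P_\nu(y) = \prod_{j} P_{\nu_j}(y_j).
\]
% Low-rank approximation: a rank-r separation of spatial and parametric variables,
\[
  u(x,y) \approx \sum_{k=1}^{r} u_k(x)\, v_k(y),
\]
% which hierarchical tensor decompositions refine further by also separating
% the individual parametric variables y_1, y_2, ... from one another.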