Annual American Control Conference

Solving Large-scale Quadratic Programs with Tensor-train-based Algorithms with Application to Trajectory Optimization



Abstract

In this paper, we develop an efficient interior point method (IPM) for convex quadratic programming (QP). The proposed algorithm keeps all relevant variables in a compressed format enabled by the tensor-train (TT) decomposition. The algorithm, called TT-IPM, requires much less memory, even compared to IPMs that utilize sparse arrays, and is able to solve large-scale QPs. Under certain assumptions, we prove that TT-IPM inherits the superlinear convergence property of traditional IPMs. Furthermore, TT-IPM uses storage that scales polylogarithmically with the number of variables and the TT-ranks. Finally, we illustrate the computation-time and storage savings of TT-IPM in a trajectory optimization problem. We show that even in fairly sparse optimization problems, in which the fraction of nonzero elements in the problem data is close to 0.0002 percent, the tensor-train representation allows TT-IPM to outperform state-of-the-art IPMs that use sparse arrays.
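The tensor-train compression underlying the abstract can be illustrated with a minimal sketch. The following is the generic TT-SVD procedure (sequential truncated SVDs of unfoldings), not the paper's TT-IPM algorithm; the tensor, ranks, and function names are illustrative assumptions:

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way array into tensor-train (TT) cores via
    sequential truncated SVDs (the standard TT-SVD procedure)."""
    shape = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(len(shape) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        mat = (S[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full array (for verification)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape([c.shape[1] for c in cores])

# A 3-way tensor that is a sum of two rank-1 terms, so its TT-ranks are <= 2
# and a rank-2 truncation is lossless.
rng = np.random.default_rng(0)
a, b, c, d, e, f = (rng.standard_normal(8) for _ in range(6))
t = np.einsum('i,j,k->ijk', a, b, c) + np.einsum('i,j,k->ijk', d, e, f)

cores = tt_svd(t, max_rank=2)
print(np.allclose(tt_to_full(cores), t))       # exact recovery
print(t.size)                                  # full storage: 8*8*8 = 512 entries
print(sum(core.size for core in cores))        # TT storage: 16 + 32 + 16 = 64 entries
```

The storage count shows the effect the paper exploits at much larger scale: the full tensor grows exponentially in the number of dimensions, while the TT cores grow only with the mode sizes and the TT-ranks.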


