Home > Foreign Journals > Computers & Chemical Engineering > ReLU networks as surrogate models in mixed-integer linear programs

ReLU networks as surrogate models in mixed-integer linear programs


Abstract

We consider the embedding of piecewise-linear deep neural networks (ReLU networks) as surrogate models in mixed-integer linear programming (MILP) problems. A MILP formulation of ReLU networks has recently been applied by many authors to probe for various model properties subject to input bounds. The formulation is obtained by programming each ReLU operator with a binary variable and applying the big-M method. The efficiency of the formulation hinges on the tightness of the bounds defined by the big-M values. When ReLU networks are embedded in a larger optimization problem, the presence of output bounds can be exploited in bound tightening. To this end, we devise and study several bound tightening procedures that consider both input and output bounds. Our numerical results show that bound tightening may reduce solution times considerably, and that small-sized ReLU networks are suitable as surrogate models in mixed-integer linear programs. (C) 2019 Elsevier Ltd. All rights reserved.
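The big-M encoding described above can be illustrated for a single ReLU unit. For a pre-activation value v with known bounds L <= v <= U, a standard formulation (a sketch consistent with the abstract, not the paper's exact notation) introduces a binary variable z and the constraints y >= v, y >= 0, y <= v - L*(1 - z), and y <= U*z. The helper below is hypothetical; it enumerates both values of z and reports the interval of outputs y the constraints admit, showing that the encoding pins y to max(0, v) exactly:

```python
def relu_bigm_feasible_outputs(v, L, U):
    """For a fixed pre-activation v with bounds L <= v <= U, enumerate the
    output intervals admitted by the big-M ReLU encoding:
        y >= v,  y >= 0,  y <= v - L*(1 - z),  y <= U*z,  z in {0, 1}.
    The big-M values are the bounds L and U themselves."""
    outs = set()
    for z in (0, 1):
        lo = max(v, 0.0)                       # lower bound: y >= v and y >= 0
        hi = min(v - L * (1 - z), U * z)       # upper bound from the big-M cuts
        if lo <= hi:                           # keep only feasible choices of z
            outs.add((lo, hi))
    return outs

# Each feasible interval collapses to the single point max(0, v),
# i.e. the encoding represents the ReLU exactly.
print(relu_bigm_feasible_outputs(2.0, -3.0, 5.0))   # positive pre-activation
print(relu_bigm_feasible_outputs(-1.0, -3.0, 5.0))  # negative pre-activation
```

Tighter values of L and U shrink the linear relaxation of these constraints, which is why the bound-tightening procedures studied in the paper can reduce solution times.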
