
Markov Decision Process Model for the Optimal Dispatch of Military Medical Evacuation Assets.

Abstract

We develop a Markov decision process (MDP) model to examine military medical evacuation (MEDEVAC) dispatch policies in a combat environment. The problem of deciding which aeromedical asset to dispatch to which service request is complicated by the threat conditions at the service locations and the priority class of each casualty event. We assume MEDEVAC requests arrive sequentially, with the location and priority of each casualty known upon arrival. The United States military uses a 9-line MEDEVAC request system that classifies casualties into three priority levels, and an armed escort may be required depending on the threat level indicated by the request. The proposed MDP model indicates how to optimally dispatch MEDEVAC helicopters to casualty events in order to maximize steady-state system utility. The utility depends on the number of casualties, their priority classes, and the locations of the MEDEVAC units and casualty events. Instances of the dispatching problem are solved using a value iteration dynamic programming algorithm. Computational examples investigate optimal dispatch policies under different threat situations and potential armed escort delays.
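The abstract states that problem instances are solved with a value iteration dynamic programming algorithm. The sketch below is a minimal, generic value iteration routine in Python, not the paper's actual formulation: the toy state space, transition probabilities, rewards, and the use of a discounted criterion (rather than the steady-state utility objective described above) are all illustrative assumptions.

```python
import numpy as np

# Minimal value iteration sketch for a toy dispatch MDP (illustrative only;
# the state space, transition model, and rewards are hypothetical and are
# not taken from the paper, which maximizes steady-state system utility).

n_states = 8    # hypothetical small state space (unit statuses x pending request type)
n_actions = 3   # e.g. dispatch unit 1, dispatch unit 2, or queue the request
gamma = 0.95    # discount factor, used here purely for simplicity

rng = np.random.default_rng(0)

# Hypothetical transition probabilities P[a, s, s'] (rows sum to 1) and rewards R[a, s]
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.uniform(0.0, 1.0, size=(n_actions, n_states))

def value_iteration(P, R, gamma, tol=1e-8, max_iter=10_000):
    """Standard value iteration: V(s) <- max_a [ R(a,s) + gamma * sum_s' P(a,s,s') V(s') ]."""
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    for _ in range(max_iter):
        Q = R + gamma * (P @ V)        # action-value estimates, Q[a, s]
        V_new = Q.max(axis=0)          # Bellman optimality update
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = Q.argmax(axis=0)          # greedy dispatch decision in each state
    return V, policy

V, policy = value_iteration(P, R, gamma)
print("Approximate optimal values:", np.round(V, 3))
print("Greedy dispatch policy (action index per state):", policy)
```

In a fuller model along the lines the abstract describes, the reward of each action would encode the utility of serving a request of a given priority from a given threat location, and the transition probabilities would capture request arrivals and helicopter availability.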
