IEEE Transactions on Smart Grid

Markov Decision Process-Based Resilience Enhancement for Distribution Systems: An Approximate Dynamic Programming Approach



Abstract

Because failures in distribution systems caused by extreme weather events directly result in consumer outages, this paper proposes a state-based decision-making model that mitigates loss of load to improve distribution system resilience throughout an unfolding event. System topologies, including the on/off states of feeder lines, are modeled as Markov states, and the transition probabilities between Markov states during the unfolding event are determined by the component failures it causes. A recursive optimization model based on Markov decision processes (MDP) is developed to take state-based actions, i.e., system reconfiguration, at each decision time. To overcome the curse of dimensionality caused by the enormous numbers of states and actions, an approximate dynamic programming (ADP) approach based on post-decision states and iteration is used to solve the proposed MDP-based model. The IEEE 33-bus and IEEE 123-bus systems are used to validate the proposed model.
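The ADP scheme the abstract outlines — Markov states for line on/off topologies, a reconfiguration action at each decision time, and value estimates kept on post-decision states — can be sketched on a toy problem. Everything below (the 3-line system, loads, per-step failure probability, and the one-switch-per-step reconfiguration rule) is an illustrative assumption, not the paper's actual model:

```python
import random
from collections import defaultdict

random.seed(0)

FAIL_PROB = 0.2            # assumed per-step failure probability per energized line
LOAD = [40.0, 30.0, 30.0]  # load (kW) served by each feeder line when energized
HORIZON = 4                # decision times during the unfolding event
ITERATIONS = 2000
ALPHA = 0.1                # smoothing step size for value updates

# Approximate value of a post-decision state, keyed by (time, topology).
V = defaultdict(float)

def shed(state):
    """Loss of load in a topology: sum of loads on de-energized lines."""
    return sum(LOAD[i] for i, s in enumerate(state) if s == 0)

def candidate_actions(state):
    """Reconfiguration: keep the topology, or close (re-energize) one open line."""
    acts = [state]
    for i, s in enumerate(state):
        if s == 0:
            acts.append(state[:i] + (1,) + state[i + 1:])
    return acts

def sample_weather(state):
    """Exogenous Markov transition: each energized line fails independently."""
    return tuple(0 if s == 1 and random.random() < FAIL_PROB else s
                 for s in state)

def run_iteration():
    """One forward simulation plus a backward value-smoothing pass."""
    state = (1, 1, 1)
    path, costs = [], []
    for t in range(HORIZON):
        # Greedy action against the current value approximation:
        # immediate loss of load plus estimated cost-to-go of the post-state.
        post = min(candidate_actions(state), key=lambda a: shed(a) + V[(t, a)])
        path.append((t, post))
        costs.append(shed(post))
        state = sample_weather(post)   # weather acts after the decision
    # Backward pass: smooth the observed cost-to-go into the value estimates.
    ctg = 0.0
    for (t, post), c in zip(reversed(path), reversed(costs)):
        ctg += c
        V[(t, post)] = (1 - ALPHA) * V[(t, post)] + ALPHA * ctg

for _ in range(ITERATIONS):
    run_iteration()
```

After training, the learned values drive a sensible reconfiguration policy: from a state with line 0 failed, the greedy rule re-closes it because the avoided 40 kW shed outweighs the estimated future cost of the fully energized topology. Iterating over sampled trajectories rather than enumerating all topology/action pairs is what sidesteps the curse of dimensionality in the full-scale model.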
