This paper describes preliminary research on applying Markov Decision Processes (MDPs) to Real Option Valuation (ROV) and the optimisation of mine scheduling. The MDP framework is a novel approach to option valuation and scheduling in mining operations. A learning agent is introduced into the valuation of an open pit in which prices and block ore grades are uncertain: prices are modelled with a mean-reverting diffusion process and block grades with sequential Gaussian simulation. The agent learns which production parameters should be used to maximise the overall value of the project. Introducing the agent permits a real-option approach to mine valuation, so that the value associated with a design's robustness to uncertainty can be measured. In a simulated example, ten blocks are extracted under conditions of grade and price uncertainty; policy iteration generates an optimal policy and yields the value of the production options. The potential financial gains from applying MDPs to mine valuation and optimisation are substantial and warrant further investigation.
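The core mechanism described above can be sketched in miniature: price uncertainty as a mean-reverting process, discretised into a small Markov chain, and policy iteration choosing between mining and deferring extraction. All numbers below, the three-state price chain, and the two-action space (`wait`/`mine`) are illustrative assumptions, not the paper's actual model, which uses a continuous mean-reverting diffusion, sequentially simulated block grades, and a ten-block pit.

```python
import numpy as np

# Hypothetical toy parameters for illustration only.
GAMMA = 0.95                                 # discount factor
PRICES = np.array([0.8, 1.0, 1.2])           # low / mean / high price states
# Mean reversion, discretised: from either extreme the price tends back
# towards the mean state; the chain is the same whichever action is taken.
P_PRICE = np.array([[0.60, 0.35, 0.05],
                    [0.20, 0.60, 0.20],
                    [0.05, 0.35, 0.60]])
GRADE, COST = 1.0, 0.9                       # expected block grade, mining cost
ACTIONS = ["wait", "mine"]

def reward(s, a):
    # Mining earns grade * price - cost; deferring incurs a small holding cost.
    return GRADE * PRICES[s] - COST if a == "mine" else -0.01

def policy_iteration():
    n = len(PRICES)
    policy = np.zeros(n, dtype=int)          # start by always waiting
    while True:
        # Policy evaluation: solve (I - gamma * P) v = r for the current policy.
        r = np.array([reward(s, ACTIONS[policy[s]]) for s in range(n)])
        v = np.linalg.solve(np.eye(n) - GAMMA * P_PRICE, r)
        # Policy improvement: act greedily with respect to the evaluated values.
        q = np.array([[reward(s, a) + GAMMA * P_PRICE[s] @ v for a in ACTIONS]
                      for s in range(n)])
        new_policy = q.argmax(axis=1)
        if np.array_equal(new_policy, policy):
            return policy, v                 # converged to an optimal policy
        policy = new_policy

policy, values = policy_iteration()
print([ACTIONS[a] for a in policy])          # → ['wait', 'mine', 'mine']
```

The resulting policy defers extraction in the low-price state and mines otherwise; the gap between the value of this policy and that of a fixed always-mine schedule is the real-option value that the agent-based approach makes measurable.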