Dealing with Model Uncertainty in Markov Decision Processes for Chronic Diseases
Systems Conversation with Brian Denton, Chair of the Department of Industrial and Operations Engineering, University of Michigan
Optimization of sequential decision-making under uncertainty is important in many contexts, including chronic diseases, but ambiguity in the underlying models introduces significant challenges. In the context of chronic disease management, Markov decision processes (MDPs) have been used to optimize the delivery of medical interventions in a way that balances the immediate harms and costs against the uncertain future health benefits of those interventions. Unfortunately, the treatment recommendations that result from an MDP can depend heavily on the model of the chronic disease, and there are often multiple plausible models due to conflicting data sources or differing opinions among medical experts. To address this problem, Denton and his collaborators introduce a new framework in which a decision-maker considers multiple models of the MDP's ambiguous parameters and seeks a strategy that maximizes the weighted performance across these models. They establish connections to other models in the stochastic optimization literature, derive complexity results, and develop solution methods for these problems. They illustrate their approach in the context of preventive treatment for cardiovascular disease, and end with a summary of the most important conclusions of their study.
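To make the weighted-performance idea concrete, the sketch below evaluates a fixed policy under each of several candidate transition models and searches for the deterministic policy that maximizes the weight-averaged value. This is a minimal illustration of the general concept, not the authors' actual formulation or solution method; the toy two-state disease models, the weights, and all function names are hypothetical, and brute-force enumeration is used only because the example is tiny.

```python
import numpy as np
from itertools import product

def policy_value(P, R, policy, gamma=0.95):
    """Discounted value of a stationary deterministic policy in one MDP model.
    P[a, s, s'] = transition probability; R[s, a] = immediate reward."""
    n = P.shape[1]
    # Transition matrix and reward vector induced by the policy
    Ppi = np.array([P[policy[s], s] for s in range(n)])
    Rpi = np.array([R[s, policy[s]] for s in range(n)])
    # Solve (I - gamma * Ppi) v = Rpi for the value function
    return np.linalg.solve(np.eye(n) - gamma * Ppi, Rpi)

def weighted_best_policy(models, weights, R, gamma=0.95):
    """Enumerate deterministic policies; return the one maximizing the
    weighted average (over models) of the value at initial state 0."""
    n, m = R.shape  # n states, m actions
    best, best_val = None, -np.inf
    for policy in product(range(m), repeat=n):
        val = sum(w * policy_value(P, R, policy, gamma)[0]
                  for P, w in zip(models, weights))
        if val > best_val:
            best, best_val = policy, val
    return best, best_val

# Hypothetical example: two health states, two interventions, and two
# conflicting disease models that disagree on progression probabilities.
R = np.array([[0.0, 1.0],
              [2.0, 0.0]])
P1 = np.array([[[0.9, 0.1], [0.2, 0.8]],   # model 1, action 0
               [[0.5, 0.5], [0.5, 0.5]]])  # model 1, action 1
P2 = np.array([[[0.6, 0.4], [0.1, 0.9]],   # model 2, action 0
               [[0.8, 0.2], [0.3, 0.7]]])  # model 2, action 1
pi, v = weighted_best_policy([P1, P2], [0.5, 0.5], R)
```

The returned `pi` hedges against model ambiguity: it need not be optimal for either model alone, but no other deterministic policy achieves a higher weighted value at the initial state.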