Multi-armed Bandit Models for the Optimal Design of Clinical Trials: Benefits and Challenges

Multi-armed bandit problems (MABPs) are a special type of optimal control problem well suited to model resource allocation under uncertainty in a wide variety of contexts. Since the first publication of the optimal solution of the classic MABP by a dynamic index rule, the bandit literature quickly d...
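As an illustration of the resource-allocation idea the abstract describes, the sketch below simulates a two-armed Bernoulli bandit with Thompson sampling, framed as sequentially assigning patients to one of two treatment arms. This is a generic bandit heuristic, not the index rule discussed in the paper; the arm success probabilities and patient count are made-up illustrative values.

```python
import random

def thompson_trial(true_probs, n_patients, seed=0):
    """Allocate patients to treatment arms via Thompson sampling
    on a Beta-Bernoulli model with uniform Beta(1, 1) priors.
    `true_probs` are hypothetical per-arm success rates used only
    to simulate patient outcomes."""
    rng = random.Random(seed)
    successes = [0] * len(true_probs)
    failures = [0] * len(true_probs)
    for _ in range(n_patients):
        # Draw one sample from each arm's posterior; treat the
        # arm with the highest draw.
        draws = [rng.betavariate(1 + s, 1 + f)
                 for s, f in zip(successes, failures)]
        arm = draws.index(max(draws))
        # Observe the (simulated) binary patient outcome.
        if rng.random() < true_probs[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures

succ, fail = thompson_trial([0.3, 0.6], n_patients=200)
print(succ, fail)
```

Because the posterior for the better arm concentrates as outcomes accrue, most of the 200 simulated patients end up allocated to the arm with the higher true success rate, which is the exploration-exploitation trade-off the bandit framing captures.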


Bibliographic Details
Published in: Stat Sci
Main Authors: Villar, Sofía S., Bowden, Jack, Wason, James
Format: Article
Language: English
Published: 2015
Online Access: https://ncbi.nlm.nih.gov/pmc/articles/PMC4856206/
https://ncbi.nlm.nih.gov/pubmed/27158186
http://dx.doi.org/10.1214/14-STS504