Multi-armed Bandit Models for the Optimal Design of Clinical Trials
Multi-armed bandit problems (MABPs) are a special type of optimal control problem well suited to model resource allocation under uncertainty in a wide variety of contexts. Since the first publication of the optimal solution of...
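The abstract frames MABPs as sequential resource allocation under uncertainty. As a minimal, purely illustrative sketch (not the optimal solution the article refers to), the snippet below allocates rounds across Bernoulli-reward arms with Thompson sampling; all names, parameters, and the 3-arm example are assumptions introduced here.

```python
import random


def thompson_sampling(true_success_probs, n_rounds=1000, seed=0):
    """Allocate rounds across arms via Thompson sampling on Bernoulli rewards.

    Each arm keeps a Beta(successes + 1, failures + 1) posterior; every round
    we sample once from each posterior and play the arm with the largest draw.
    """
    rng = random.Random(seed)
    n_arms = len(true_success_probs)
    successes = [0] * n_arms
    failures = [0] * n_arms
    for _ in range(n_rounds):
        # Draw one posterior sample per arm and pick the most promising arm.
        draws = [rng.betavariate(successes[a] + 1, failures[a] + 1)
                 for a in range(n_arms)]
        arm = max(range(n_arms), key=lambda a: draws[a])
        # Observe a Bernoulli outcome from the chosen arm and update its counts.
        if rng.random() < true_success_probs[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures


if __name__ == "__main__":
    # Hypothetical 3-arm example: allocation concentrates on the best arm over time.
    succ, fail = thompson_sampling([0.3, 0.5, 0.7])
    for a, (s, f) in enumerate(zip(succ, fail)):
        print(f"arm {a}: pulls={s + f}, empirical success rate={s / max(s + f, 1):.2f}")
```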