Nicolò Cesa-Bianchi: Bandits, auctions, and pricing

Multi-armed bandits are a mathematical framework for studying sequential decision-making problems with partial feedback. Application areas include recommendation systems, personalization, hyperparameter tuning, and clinical trials. In this talk, I will introduce some basic algorithms for solving bandit problems and show applications of these algorithms to digital markets.
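
One of the basic bandit algorithms the abstract alludes to is UCB1 (Auer, Cesa-Bianchi, and Fischer's upper confidence bound rule). The sketch below is illustrative only: the Bernoulli arm means, horizon, and seed are made-up assumptions, not anything from the talk.

```python
# Minimal UCB1 sketch for a stochastic multi-armed bandit.
# Arm means, horizon, and seed are illustrative assumptions.
import math
import random

def ucb1(means, horizon, seed=0):
    """Play `horizon` rounds of UCB1 on Bernoulli arms with the given
    success probabilities. Returns (total reward, pulls per arm)."""
    rng = random.Random(seed)
    k = len(means)
    counts = [0] * k       # number of pulls per arm
    sums = [0.0] * k       # cumulative reward per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1    # pull each arm once to initialize
        else:
            # choose the arm maximizing empirical mean + exploration bonus
            arm = max(range(k), key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2 * math.log(t) / counts[i]))
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total, counts

total, counts = ucb1([0.3, 0.5, 0.8], horizon=10000)
```

Over a long enough horizon, the exploration bonus shrinks on frequently pulled arms, so the arm with the highest mean ends up receiving most of the pulls while suboptimal arms are sampled only logarithmically often.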