Generalizing Bayesian Optimization with Likelihood-free Inference and Decision-theoretic Entropies
Lantao Yu
Computer Science Department, Stanford University

Black-box Global Optimization
Suppose we have a noisy "black-box" function f.
We can query f. Assume:
- Observations are noisy: y = f(x) + ε
- Each function query is costly, e.g. in money, time, labor, etc.
Goal of this task: estimate the location of the global optima of f within a budget of T queries.

Black-box Global Optimization
Suppose we have a noisy "black-box" function f.
Goal: estimate the location of the global optima of f.
A popular method is Bayesian optimization (BO), which leverages a probabilistic model of f to sequentially choose queries.
The model can:
- incorporate prior beliefs about f (e.g. smoothness)
- tell us where we are certain vs. uncertain about f
This gives an exploration-exploitation tradeoff, and hence sample-efficient optimization: the probabilistic model of f is used to choose the queries.

Bayesian Optimization
Information-based Bayesian Optimization
Start with an initial dataset of (x, y) pairs (function observations); it can be the empty set.
Over a sequence of T iterations:
1. Optimize an acquisition function to choose the next x to query.
   - The acquisition function aims to capture the value of querying f at a given x.
   - It is defined using our probabilistic model.
2. Query f at the chosen x and observe y.
3. Update the dataset with the new (x, y) pair.

Bayesian Optimization - Visualization
We have an unknown black-box function f and a dataset D_t = {(x, y)} of observation pairs.
Given this dataset, we can infer f using a probabilistic model.
(Figure: samples from the posterior distribution over functions.)
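The T-iteration loop above can be sketched in code. This is a minimal illustration, not the talk's method: it uses a hand-rolled Gaussian-process surrogate with an RBF kernel and an upper confidence bound (UCB) acquisition function; the objective sin(3x), the noise level, the length scale, the UCB coefficient, and the candidate grid are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.3):
    """Squared-exponential kernel k(a, b) = exp(-(a-b)^2 / (2 l^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and variance of a zero-mean GP at the test points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    K_inv = np.linalg.inv(K)
    mean = K_s.T @ K_inv @ y_train
    var = np.diag(K_ss - K_s.T @ K_inv @ K_s)
    return mean, np.maximum(var, 0.0)

def f(x):
    """Noisy black-box objective y = f(x) + eps (a made-up example)."""
    return np.sin(3 * x) + 0.01 * np.random.randn(*x.shape)

np.random.seed(0)
candidates = np.linspace(0.0, 2.0, 200)      # grid over which we optimize the acquisition
x_data = np.random.uniform(0.0, 2.0, size=2) # initial dataset (could also be empty)
y_data = f(x_data)

T = 10  # query budget
for t in range(T):
    mean, var = gp_posterior(x_data, y_data, candidates)
    ucb = mean + 2.0 * np.sqrt(var)          # acquisition: exploit (mean) + explore (uncertainty)
    x_next = candidates[np.argmax(ucb)]      # 1. optimize acquisition, choose next x
    y_next = f(np.array([x_next]))           # 2. query f at chosen x, observe y
    x_data = np.append(x_data, x_next)       # 3. update dataset with new (x, y) pair
    y_data = np.append(y_data, y_next)

x_best = x_data[np.argmax(y_data)]
print(f"estimated maximizer after {T} queries: x = {x_best:.3f}")
```

For this toy objective the true maximizer is x = π/6 ≈ 0.52, and the UCB loop typically concentrates its later queries there; any acquisition function (expected improvement, entropy search, etc.) can be swapped in at the `ucb` line.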
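The "samples from the posterior distribution over functions" in the visualization slide can also be sketched: condition a GP on the dataset D_t and draw random functions from the posterior. The kernel, length scale, and the three noise-free observations below are illustrative assumptions, not data from the talk.

```python
import numpy as np

def rbf(a, b, ls=0.4):
    """Squared-exponential kernel (illustrative length scale)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

np.random.seed(1)
x_data = np.array([0.2, 0.9, 1.5])        # queried inputs in D_t
y_data = np.sin(3 * x_data)               # observed values (noise omitted for clarity)
x_grid = np.linspace(0.0, 2.0, 100)       # where we evaluate the sampled functions

K = rbf(x_data, x_data) + 1e-6 * np.eye(len(x_data))
K_s = rbf(x_data, x_grid)
K_ss = rbf(x_grid, x_grid)
K_inv = np.linalg.inv(K)
mean = K_s.T @ K_inv @ y_data             # posterior mean function
cov = K_ss - K_s.T @ K_inv @ K_s          # posterior covariance
cov = (cov + cov.T) / 2                   # symmetrize for numerical stability

# Draw 5 sample functions from N(mean, cov) via a jittered Cholesky factor.
L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(x_grid)))
samples = mean[None, :] + np.random.randn(5, len(x_grid)) @ L.T
```

Plotting the rows of `samples` reproduces the picture on the slide: near the observed (x, y) pairs the sampled functions agree (low posterior uncertainty), while far from the data they spread apart, which is exactly the certain-vs-uncertain signal the acquisition function exploits.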