github.com/d4l3k/go-bayesopt
module, package
Version: 0.0.0-20191110222447-8506d3040732
Repository: https://github.com/d4l3k/go-bayesopt.git
Documentation: pkg.go.dev

# README

go-bayesopt

A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.

This project is under active development. If you find a bug or anything that needs correction, please let me know.

Simple Example

package main

import (
  "log"
  "math"

  "github.com/d4l3k/go-bayesopt"
)

func main() {
  X := bayesopt.UniformParam{
    Max: 10,
    Min: -10,
  }
  o := bayesopt.New(
    []bayesopt.Param{
      X,
    },
  )
  // minimize x^2+1
  x, y, err := o.Optimize(func(params map[bayesopt.Param]float64) float64 {
    return math.Pow(params[X], 2) + 1
  })
  if err != nil {
    log.Fatal(err)
  }
  log.Println(x, y)
}

How does it work?

From https://github.com/fmfn/BayesianOptimization:

Bayesian optimization works by constructing a posterior distribution of functions (gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not, as seen in the picture below.

[image: BayesianOptimization in action]

As you iterate over and over, the algorithm balances its needs of exploration and exploitation, taking into account what it knows about the target function. At each step a Gaussian Process is fitted to the known samples (points previously explored), and the posterior distribution, combined with an exploration strategy (such as UCB (Upper Confidence Bound) or EI (Expected Improvement)), is used to determine the next point that should be explored (see the gif below).

[animation: BayesianOptimization in action]

This process is designed to minimize the number of steps required to find a combination of parameters that is close to the optimal combination. To do so, this method uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is cheaper in the computational sense, and common tools can be employed. Therefore Bayesian Optimization is best suited to situations where sampling the function to be optimized is a very expensive endeavor. See the references for a proper discussion of this method.
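As a concrete illustration of the UCB exploration strategy mentioned above: the fitted Gaussian process supplies a predicted mean and standard deviation for each candidate point, and UCB scores a candidate as the mean plus roughly 1.96 standard deviations (a 95% confidence interval). The sketch below only demonstrates that idea; it is not the library's internal code, and the posterior function passed in is a stand-in for the fitted Gaussian process.

package main

import (
  "fmt"
  "math"
)

// ucb scores a candidate as the predicted mean plus kappa standard
// deviations. A larger kappa favors exploration over exploitation.
func ucb(mean, stddev, kappa float64) float64 {
  return mean + kappa*stddev
}

// pickNext returns the candidate with the highest UCB score. posterior is
// assumed to return the Gaussian process's predicted mean and standard
// deviation at x.
func pickNext(candidates []float64, posterior func(x float64) (mean, stddev float64)) float64 {
  const kappa = 1.96 // ~95% confidence interval
  best, bestScore := candidates[0], math.Inf(-1)
  for _, x := range candidates {
    mean, sd := posterior(x)
    if score := ucb(mean, sd, kappa); score > bestScore {
      best, bestScore = x, score
    }
  }
  return best
}

func main() {
  // Toy posterior: the "GP" predicts -x^2, with more uncertainty far from x = 1.
  posterior := func(x float64) (float64, float64) {
    return -x * x, math.Abs(x - 1)
  }
  fmt.Println(pickNext([]float64{-2, -1, 0, 1, 2}, posterior))
}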

License

go-bayesopt is licensed under the MIT license.

# Packages

gp is a library for computing Gaussian processes in Go/Golang.
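At its core, a Gaussian process regressor computes, for a test point x*, the posterior mean k*ᵀK⁻¹y and variance k(x*, x*) − k*ᵀK⁻¹k*, where K is the kernel matrix over the observed inputs. The sketch below implements that computation directly with gonum as a minimal reference; it does not use or mirror the gp package's own API.

package main

import (
  "fmt"
  "math"

  "gonum.org/v1/gonum/mat"
)

// rbf is a squared-exponential kernel with unit length scale.
func rbf(a, b float64) float64 {
  d := a - b
  return math.Exp(-0.5 * d * d)
}

// posterior returns the GP posterior mean and variance at xs given
// observations (x, y) and observation noise.
func posterior(x, y []float64, noise, xs float64) (mean, variance float64) {
  n := len(x)

  // Build the kernel matrix K over the observed inputs, with noise on the diagonal.
  K := mat.NewSymDense(n, nil)
  for i := 0; i < n; i++ {
    for j := i; j < n; j++ {
      v := rbf(x[i], x[j])
      if i == j {
        v += noise
      }
      K.SetSym(i, j, v)
    }
  }
  var chol mat.Cholesky
  if !chol.Factorize(K) {
    panic("kernel matrix is not positive definite")
  }

  // k* is the kernel vector between the test point and each observation.
  ks := mat.NewVecDense(n, nil)
  for i := 0; i < n; i++ {
    ks.SetVec(i, rbf(x[i], xs))
  }

  // alpha = K^{-1} y, v = K^{-1} k*
  var alpha, v mat.VecDense
  chol.SolveVecTo(&alpha, mat.NewVecDense(n, y))
  chol.SolveVecTo(&v, ks)

  mean = mat.Dot(ks, &alpha)
  variance = rbf(xs, xs) - mat.Dot(ks, &v)
  return mean, variance
}

func main() {
  x := []float64{-2, -1, 0, 1, 2}
  y := []float64{5, 2, 1, 2, 5} // samples of x^2+1
  m, v := posterior(x, y, 1e-6, 0.5)
  fmt.Printf("mean=%.3f stddev=%.3f\n", m, math.Sqrt(v))
}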

# Functions

BasicBarrier returns -Inf if an x value is outside the param range.
New creates a new optimizer with the specified optimizable parameters and options.
WithBarrierFunc sets the barrier function to use.
WithExploration sets the exploration function to use.
WithMinimize sets whether or not to minimize.
WithOutputName sets the output's name.
WithRandomRounds sets the number of random rounds to run.
WithRounds sets the total number of rounds to run.
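Putting the options together, a configured optimizer might look like the sketch below. The argument types passed to the With* options (ints, a bool, a string) are inferred from the descriptions above rather than copied from the package documentation, so treat them as assumptions.

package main

import (
  "log"
  "math"

  "github.com/d4l3k/go-bayesopt"
)

func main() {
  X := bayesopt.UniformParam{Min: -10, Max: 10}
  o := bayesopt.New(
    []bayesopt.Param{X},
    bayesopt.WithRounds(50),       // total rounds to run (assumed int)
    bayesopt.WithRandomRounds(10), // random rounds before GP-guided ones (assumed int)
    bayesopt.WithMinimize(false),  // maximize instead of the default minimize (assumed bool)
    bayesopt.WithOutputName("score"),
  )
  // maximize -(x^2) + 1, which peaks at x = 0
  x, y, err := o.Optimize(func(params map[bayesopt.Param]float64) float64 {
    return -math.Pow(params[X], 2) + 1
  })
  if err != nil {
    log.Fatal(err)
  }
  log.Println(x, y)
}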

# Constants

DefaultMinimize is the default value of minimize.
DefaultRandomRounds is the default number of random rounds to run.
DefaultRounds is the default number of rounds to run.

# Variables

DefaultBarrierFunc sets the default barrier function to use.
DefaultExploration uses UCB with a 95% confidence interval.
SampleTries is the number of attempts a sample function makes before truncating the samples to the boundaries.

# Structs

ExponentialParam is an exponentially distributed parameter in the range (0, +math.MaxFloat64] whose rate parameter (lambda) is Rate and whose mean is 1/lambda.
LogBarrier implements a logarithmic barrier function.
NormalParam is a normally distributed parameter with Mean and StdDev.
Optimizer is a blackbox gaussian process optimizer.
RejectionParam samples from Param and then uses F to decide whether or not to reject the sample.
UCB implements upper confidence bound exploration.
UniformParam is a uniformly distributed parameter between Max and Min.
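For illustration, parameters of the different distributions could be declared roughly as follows. The field names (Min/Max, Mean/StdDev, Rate) come from the struct descriptions above, but the full field sets are assumptions, not the package's documented definitions.

package main

import (
  "fmt"

  "github.com/d4l3k/go-bayesopt"
)

func main() {
  // Field names follow the struct descriptions above; other fields (such as
  // a parameter name) may exist but are omitted in this sketch.
  learningRate := bayesopt.UniformParam{Min: 1e-4, Max: 1e-1}
  momentum := bayesopt.NormalParam{Mean: 0.9, StdDev: 0.05}
  decay := bayesopt.ExponentialParam{Rate: 10} // mean is 1/Rate = 0.1

  params := []bayesopt.Param{learningRate, momentum, decay}
  fmt.Println(len(params), "parameters defined")
}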

# Interfaces

BarrierFunc returns a value that is added to the objective value in order to bound the optimization.
Exploration is the strategy to use for exploring the Gaussian process.
Param represents a parameter that can be optimized.

# Type aliases

LinearParam is a UniformParam.
OptimizerOption sets an option on the optimizer.