
Cross-Entropy method

By Ceri-Anne Laureyssens

The notebook gives an introduction to cross-entropy and its use in the cross-entropy method. Cross-entropy is closely related to the Kullback-Leibler (KL) distance between two probability distributions f and g. The cross-entropy method is a Monte Carlo method for importance sampling and optimization: it minimizes the KL distance between the target distribution f and a parameterized distribution g (with parameters θ), which is equivalent to choosing the θ that minimizes the cross-entropy.
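In its generic optimization form, the method alternates between sampling candidates from g(·; θ), keeping an elite subset of the best candidates, and refitting θ to that elite set. The sketch below illustrates this loop with a one-dimensional Normal sampling distribution; the names (`ce_maximize`, `score`, `elite_frac`) are illustrative and not part of this package's API.

```julia
using Distributions, Statistics

# Minimal sketch of the cross-entropy method for maximizing `score` over the reals,
# using a Normal distribution g(·; θ) with θ = (μ, σ).
function ce_maximize(score; iterations = 20, n_samples = 100, elite_frac = 0.1)
    μ, σ = 0.0, 1.0                                     # initial parameters θ
    n_elite = max(2, round(Int, elite_frac * n_samples))
    for _ in 1:iterations
        xs = rand(Normal(μ, σ), n_samples)              # sample from g(·; θ)
        elite = xs[sortperm(score.(xs), rev = true)[1:n_elite]]  # keep the best samples
        μ, σ = mean(elite), std(elite) + 1e-6           # refit θ to the elite set
    end
    return μ
end

# Example: the maximum of -(x - 3)^2 is at x ≈ 3
ce_maximize(x -> -(x - 3)^2)
```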

The notebook provides an implementation of the cross-entropy method for optimizing multivariate time series distributions. Suppose we have a time series X = {x₁, ..., xₙ} where each xᵢ is a vector of dimension m. The cross_entropy_method function can handle two different scenarios (see the sketch after this list):

  1. The time series is sampled IID from a single distribution p: xᵢ ~ p(x). In this case, the distribution is represented as a Dict{Symbol, Tuple{Sampleable, Int64}}. The dictionary contains m symbols, one for each variable in the series. The Sampleable object represents p and the integer is the length n of the time series.
  2. The time series is sampled from a different distribution at each timestep pᵢ: xᵢ ~ pᵢ(x). In this case, the distribution is again represented as a Dict{Symbol, Tuple{Sampleable, Int64}}.
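As a rough illustration of the representation in scenario 1, the snippet below builds such a dictionary for a two-variable series. The variable names are made up, and the commented-out call to `cross_entropy_method` is an assumption about usage rather than the package's documented signature.

```julia
using Distributions

n = 50   # length of the time series

# One entry per variable (m = 2 here); each maps to (a Sampleable representing p, the length n).
p = Dict{Symbol, Tuple{Sampleable, Int64}}(
    :temperature => (Normal(0.0, 1.0), n),
    :pressure    => (Gamma(2.0, 1.0), n),
)

# Hypothetical usage — the exact arguments of `cross_entropy_method` are not shown
# in this README, so this call is an assumption:
# result = cross_entropy_method(loss, p)   # `loss` scores a sampled time series
```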

To finish off, the notebook provides a small example that applies the CE method to importance sampling.
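For context, importance sampling reweights samples drawn from a proposal g by the likelihood ratio f(x)/g(x); the CE method can be used to find a good proposal g. The generic sketch below estimates a rare-event probability this way and is not the notebook's specific example.

```julia
using Distributions, Statistics

f = Normal(0.0, 1.0)          # nominal distribution
g = Normal(3.0, 1.0)          # proposal, e.g. found by the CE method
h(x) = x > 3.0                # rare event under f

xs = rand(g, 10_000)
w  = pdf.(f, xs) ./ pdf.(g, xs)       # importance weights f(x)/g(x)
estimate = mean(w .* h.(xs))          # ≈ P(X > 3) under f ≈ 0.00135
```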

