
Selection of Parameters

There are several tunable parameters in the AIS-BN algorithm. We base the choice of these parameters on the Central Limit Theorem (CLT). According to the CLT, if $Z_1, Z_2, \ldots, Z_n$ are independent and identically distributed random variables with $E(Z_i) = \mu_Z$ and $\mathrm{Var}(Z_i) = \sigma_Z^2$, $i = 1, \ldots, n$, then $\overline{Z} = (Z_1 + \cdots + Z_n)/n$ is approximately normally distributed when $n$ is sufficiently large. Thus,

$$\lim_{n\rightarrow \infty} P\left(\frac{\left|\overline{Z}-\mu_Z\right|}{\mu_Z} \geq \frac{\sigma_Z/\sqrt{n}}{\mu_Z} \cdot t\right) = \frac{2}{\sqrt{2\pi}}\int_{t}^{\infty} e^{-x^{2}/2}\,dx \; . \qquad (11)$$
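The right-hand side of Equation 11 is simply the two-sided tail probability of a standard normal variable, which equals $\mathrm{erfc}(t/\sqrt{2})$. As a quick numerical check (our own illustration, not part of the AIS-BN implementation), the following Python sketch evaluates it for a few values of $t$:

\begin{verbatim}
import math

def two_sided_tail(t):
    """P(|N(0,1)| >= t) = (2 / sqrt(2*pi)) * integral_t^inf exp(-x^2/2) dx."""
    return math.erfc(t / math.sqrt(2.0))

for t in (1.0, 1.645, 1.96, 2.576):
    print(f"t = {t:5.3f}  ->  two-sided tail = {two_sided_tail(t):.4f}")
\end{verbatim}

For example, $t = 1.96$ gives a tail probability of approximately 0.05.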

Although this approximation holds exactly only when $n$ approaches infinity, the CLT is known to be very robust and to lead to excellent approximations even for small $n$. The formula of Equation 11 yields an $(\varepsilon_r, \delta)$ relative approximation, which is an estimate $\overline{\mu}$ of $\mu$ that satisfies

$$P\left(\frac{\left|\overline{\mu}-\mu\right|}{\mu} \geq \varepsilon_r\right) \leq \delta \; .$$
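To make this criterion concrete, the following Python sketch (a hypothetical illustration of our own, not taken from the paper) repeatedly estimates the mean of a Bernoulli variable with a fixed sample size and records how often the relative error reaches a chosen $\varepsilon_r$; for an $(\varepsilon_r, \delta)$ relative approximation this frequency should not exceed $\delta$:

\begin{verbatim}
import random

def relative_error_frequency(p=0.2, n=2000, eps_r=0.1, runs=1000, seed=0):
    """Fraction of runs in which |p_hat - p| / p >= eps_r; for an
    (eps_r, delta) relative approximation it should be at most delta."""
    rng = random.Random(seed)
    exceeded = 0
    for _ in range(runs):
        p_hat = sum(rng.random() < p for _ in range(n)) / n
        if abs(p_hat - p) / p >= eps_r:
            exceeded += 1
    return exceeded / runs

print(relative_error_frequency())
\end{verbatim}

With these illustrative values ($\sigma_Z/\mu_Z = 2$, so $\sigma_Z/(\sqrt{n}\,\mu_Z) \approx 0.045$), the observed frequency should be close to the roughly 2.5% tail probability that Equation 11 predicts.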

If $\delta$ has been fixed,

$$\varepsilon_r = \frac{\sigma_Z/\sqrt{n}}{\mu_Z} \cdot \Phi_Z^{-1}\!\left(\frac{\delta}{2}\right) \; ,$$

where $\Phi_Z(z) = \frac{1}{\sqrt{2\pi}}\int_{z}^{\infty} e^{-x^{2}/2}\,dx$. Since $\mu_Z$ (corresponding to $\Pr(\mathbf{E})$ in Figure 2) is fixed in our sampling problem, setting $\varepsilon_r$ to a smaller value amounts to making $\sigma_Z/\sqrt{n}$ smaller. We can therefore adjust the parameters based on $\sigma_Z/\sqrt{n}$, which can be estimated using Equation 3. This is also the theoretical intuition behind our recommendation $w_k \propto 1/\widehat{\sigma}^{k}$ in Section 3.1. While we expect this approach to work well in most networks, no guarantees can be given: there always exist extreme cases in sampling algorithms in which no good estimate of the variance can be obtained.
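To show how these quantities interact in practice, here is a small Python sketch (hypothetical helper functions of our own, not the authors' code) that solves the formula above for the number of samples $n$ needed to reach a target $\varepsilon_r$ at confidence level $\delta$, given an estimate of the relative standard deviation $\sigma_Z/\mu_Z$, and that normalizes weights proportional to $1/\widehat{\sigma}^{k}$ as in the recommendation above. It assumes the standard deviation estimates are already available, e.g., obtained via Equation 3:

\begin{verbatim}
import math
from statistics import NormalDist

def required_samples(eps_r, delta, rel_sigma):
    """Smallest n such that (rel_sigma / sqrt(n)) * t <= eps_r, where
    t is the upper delta/2 quantile of the standard normal, i.e. the
    Phi_Z^{-1}(delta/2) of the text."""
    t = NormalDist().inv_cdf(1.0 - delta / 2.0)
    return math.ceil((rel_sigma * t / eps_r) ** 2)

def inverse_sigma_weights(sigma_hats):
    """Weights w_k proportional to 1 / sigma_hat_k, normalized to sum to 1."""
    raw = [1.0 / s for s in sigma_hats]
    total = sum(raw)
    return [w / total for w in raw]

# Example: 5% relative error with delta = 0.05 when sigma_Z / mu_Z = 10.
print(required_samples(eps_r=0.05, delta=0.05, rel_sigma=10.0))
print(inverse_sigma_weights([0.5, 1.0, 2.0]))
\end{verbatim}

The first call illustrates the quadratic dependence of the required $n$ on both $1/\varepsilon_r$ and the relative standard deviation.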

