
Checkpointing

High-dimensional Bayesian analysis is typically computationally heavy, so sampling and/or evidence estimation must often run for long periods of time.

One issue often encountered is that the computing facilities may go offline or time out during this period, discarding everything computed so far.

To avoid this issue, harmonic supports checkpointing, which allows the user to periodically save the progress of the computation and then resume from the saved point (the checkpoint).

This example illustrates how one may use harmonic with checkpointing.

Basic problem setup

As always, import the relevant Python modules.

[11]:
import numpy as np
import emcee
import sys
sys.path.insert(0, '../../..')
import harmonic as hm

Define a Bayesian posterior function (here we’ll use a simple Gaussian example).

[12]:
def ln_posterior(x, inv_cov):
    """Compute log_e of n dimensional multivariate gaussian.

    Args:

        x: Position at which to evaluate prior.

    Returns:

        double: Value of posterior at x.

    """
    return -np.dot(x,np.dot(inv_cov,x))/2.0

def init_cov(ndim):
    """Initialise random diagonal covariance matrix.

    Args:

        ndim: Dimension of Gaussian.

    Returns:

        cov: Covariance matrix of shape (ndim,ndim).

    """
    cov = np.zeros((ndim,ndim))
    diag_cov = np.ones(ndim)
    np.fill_diagonal(cov, diag_cov)
    return cov

Note that the final function init_cov is used to construct a diagonal covariance matrix; in this simple example it is just the identity matrix.

Then define parameters for emcee and the covariance of our Gaussian example.

[13]:
# Define parameters for emcee sampling
ndim = 10                   # number of dimensions
nchains = 200               # total number of chains to compute
samples_per_chain = 5000    # number of samples per chain
nburn = 2000                # number of samples to discard as burn in (not used in this example)

# initialise random seed
np.random.seed(1)

# Create covariance matrix
cov = init_cov(ndim)
inv_cov = np.linalg.inv(cov)

For absolute simplicity, let’s use the harmonic HyperSphere model; rather than learn the model from training samples, we will simply fix the radius of the model.

[14]:
# Instantiate hyper-sphere model
domains_sphere = [np.array([1E0,1E0])] # Not used since not training model.
model = hm.model.HyperSphere(ndim, domains_sphere)

# Fit model by hand by fixing radius
model.set_R(np.sqrt(ndim)) # A good rule of thumb for a Gaussian with a diagonal covariance
model.fitted = True
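
To see why this is a reasonable choice: for a zero-mean Gaussian with identity covariance in ndim dimensions, the expected squared radius of a sample is E[‖x‖²] = ndim, so the bulk of the posterior mass sits near radius √ndim. A quick numerical check (an illustrative aside; rng_check and x_check are not part of the original example):

rng_check = np.random.default_rng(0)  # separate generator, so the seeded global
                                      # random state used below is left untouched
x_check = rng_check.standard_normal((100000, ndim))
print(np.mean(np.sum(x_check**2, axis=1)))  # close to ndim = 10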

Checkpointing

Now we need to run the sampler to collect samples, but we wish to checkpoint periodically to protect against system crashes or timeouts. One simple way to do this is to execute the following loop.

[15]:
# Set initial random position and state
pos = np.random.rand(ndim * nchains).reshape((nchains, ndim)) * 0.1
rstate = np.random.get_state()

# Define how often to checkpoint the evidence class
chain_iterations = 10

# Instantiate the evidence class
cal_ev = hm.Evidence(nchains, model)

for chain_i in range(chain_iterations):
    # Run the emcee sampler from previous endpoint
    sampler = emcee.EnsembleSampler(nchains, ndim, ln_posterior, args=[inv_cov])
    (pos, prob, rstate) = sampler.run_mcmc(pos, samples_per_chain // chain_iterations, rstate0=rstate)

    # Collect and format samples
    samples = np.ascontiguousarray(sampler.chain[:,:,:])
    lnprob = np.ascontiguousarray(sampler.lnprobability[:,:])
    chains = hm.Chains(ndim)
    chains.add_chains_3d(samples, lnprob)

    # 1) Deserialize the Evidence class from the previous checkpoint
    #    (skipped on the first iteration, when no checkpoint exists yet)
    if chain_i > 0:
        cal_ev = hm.Evidence.deserialize(".temp.gaussian_example_{}.dat".format(ndim))

    # 2) Add these new chains to Evidence class
    cal_ev.add_chains(chains)

    # 3) Serialize the Evidence Class
    cal_ev.serialize(".temp.gaussian_example_{}.dat".format(ndim))

    # Clear memory
    del chains, samples, lnprob, sampler, prob

Of course, it is not typically necessary to deserialize the Evidence class after every checkpoint, but only when resuming after execution has halted.
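
If execution does halt, resuming is then just a matter of deserializing the last checkpoint before continuing. A minimal sketch, assuming the checkpoint file written by the loop above is still present:

# Recover the Evidence class from the last successful checkpoint
cal_ev = hm.Evidence.deserialize(".temp.gaussian_example_{}.dat".format(ndim))

# ...then re-enter the loop for the remaining iterations, adding new chains
# with cal_ev.add_chains(chains) and serializing after each one as before.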

Compute the Bayesian evidence

Finally, we simply compute the learnt harmonic mean estimator. Here, we elect to compute the evidence in log space.

[16]:
ln_evidence, ln_evidence_std = cal_ev.compute_ln_evidence()

Results

For the simple Gaussian example considered, it’s also straightforward to compute the evidence analytically.

[17]:
def ln_analytic_evidence(ndim, cov):
    """Compute analytic ln_e evidence.

    Args:

        ndim: Dimensionality of the multivariate Gaussian posterior

        cov: Covariance matrix dimension nxn.

    Returns:

        double: Value of posterior at x.

    """
    ln_norm_lik = -0.5*ndim*np.log(2*np.pi)-0.5*np.log(np.linalg.det(cov))
    return -ln_norm_lik
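
To see where this expression comes from, note that the evidence is the normalising constant of the unnormalised Gaussian posterior defined earlier; writing n for ndim,

$$ z = \int e^{-\frac{1}{2} x^\top \Sigma^{-1} x}\, \mathrm{d}x = (2\pi)^{n/2} |\Sigma|^{1/2}, \qquad \ln z = \frac{n}{2}\ln(2\pi) + \frac{1}{2}\ln|\Sigma|, $$

which is exactly the negative of ln_norm_lik computed above.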

Let’s also compute the evidence analytically so that we can compare it to the value computed by harmonic.

[18]:
ln_evidence_analytic = ln_analytic_evidence(ndim, cov)

Let’s compare the value computed by harmonic with that computed analytically. First we convert the standard deviation into an error bar on the log-evidence.

[19]:
# Convert the log-space standard deviation into an error on the log-evidence
ln_evidence_err = np.log(np.exp(ln_evidence) + np.exp(ln_evidence_std)) - ln_evidence
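
This conversion follows directly from the definitions: if σ = exp(ln_evidence_std) is the standard deviation of the evidence estimate ẑ, then the (upper) error on the log-evidence is

$$ \ln(\hat{z} + \sigma) - \ln\hat{z} = \ln\left(1 + \frac{\sigma}{\hat{z}}\right). $$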
[20]:
print('ln_evidence (harmonic) = {} +/- {}'.format(ln_evidence, ln_evidence_err))
print('ln_evidence (analytic) = {}'.format(ln_evidence_analytic))
print('nsigma = {}'.format(np.abs(ln_evidence - ln_evidence_analytic) / ln_evidence_err))
ln_evidence (harmonic) = 9.191328725973255 +/- 0.0032251202241564414
ln_evidence (analytic) = 9.189385332046726
nsigma = 0.6025803044405069

As expected, the evidence computed by harmonic is close to that computed analytically.