nimbleHMC version 0.2.0 released, providing improved HMC performance
nimbleHMC provides Hamiltonian Monte Carlo samplers for use with NIMBLE, in particular NUTS samplers. NIMBLE’s HMC samplers can be flexibly assigned to a subset of model parameters, allowing users to consider various sampling configurations.
We’ve released version 0.2.0 of nimbleHMC, which includes a new default NUTS sampler inspired by Stan’s implementation of NUTS. It also provides an updated version of our previous NUTS sampler (based on the original Hoffman and Gelman algorithm, and now called the ‘NUTS_classic’ sampler in NIMBLE), which fixes performance issues present in version 0.1.1.
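As a minimal sketch of assigning NUTS to a subset of parameters (the toy model, node names, and iteration count below are our own illustrative assumptions, using the `addHMC` helper from nimbleHMC):

library(nimble)
library(nimbleHMC)

## A hypothetical toy regression model:
code <- nimbleCode({
    beta0 ~ dnorm(0, sd = 10)
    beta1 ~ dnorm(0, sd = 10)
    sigma ~ dunif(0, 10)
    for(i in 1:N) {
        y[i] ~ dnorm(beta0 + beta1 * x[i], sd = sigma)
    }
})

set.seed(1)
N <- 20
x <- rnorm(N)
model <- nimbleModel(code, constants = list(N = N, x = x),
                     data = list(y = rnorm(N)),
                     inits = list(beta0 = 0, beta1 = 0, sigma = 1),
                     buildDerivs = TRUE)    ## HMC requires derivative support

conf <- configureMCMC(model)
conf$removeSamplers(c('beta0', 'beta1'))
addHMC(conf, target = c('beta0', 'beta1'))  ## NUTS on these two; sigma keeps its default sampler

mcmc <- buildMCMC(conf)
cmodel <- compileNimble(model)
cmcmc <- compileNimble(mcmc, project = model)
samples <- runMCMC(cmcmc, niter = 2000)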
Version 1.0.1 of NIMBLE released, fixing a bug in version 1.0.0 affecting certain models
Version 1.0.0 of NIMBLE released, providing automatic differentiation, Laplace approximation, and HMC sampling
We’ve released the newest version of NIMBLE on CRAN and on our website. NIMBLE is a system for building and sharing analysis methods for statistical models, especially for hierarchical models and computationally-intensive methods (such as MCMC and SMC).
Version 1.0.0 provides substantial new functionality. This includes:
- A Laplace approximation algorithm that allows one to find the MLE for model parameters based on approximating the marginal likelihood in models with continuous random effects/latent process values (see the sketch after this list).
- A Hamiltonian Monte Carlo (HMC) MCMC sampler implementing the NUTS algorithm (available in the newly-released nimbleHMC package).
- Support in NIMBLE’s algorithm programming system for obtaining derivatives of functions and of arbitrary calculations within models.
- A parameter transformation system allowing algorithms to work in unconstrained parameter spaces when model parameters have constrained domains.
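For instance, a minimal sketch of the Laplace approximation workflow might look like the following (the model and node names are our own illustrative assumptions, not taken from the release):

library(nimble)

## A hypothetical Poisson model with continuous random effects u[i]:
code <- nimbleCode({
    beta ~ dnorm(0, sd = 10)
    sigma_u ~ dunif(0, 10)
    for(i in 1:N) {
        u[i] ~ dnorm(0, sd = sigma_u)
        y[i] ~ dpois(exp(beta + u[i]))
    }
})

set.seed(1)
N <- 10
model <- nimbleModel(code, constants = list(N = N),
                     data = list(y = rpois(N, 2)),
                     inits = list(beta = 0, sigma_u = 1, u = rep(0, N)),
                     buildDerivs = TRUE)    ## derivatives are needed for Laplace

laplace <- buildLaplace(model, paramNodes = c('beta', 'sigma_u'),
                        randomEffectsNodes = 'u')
cmodel <- compileNimble(model)
cLaplace <- compileNimble(laplace, project = model)

mle <- cLaplace$findMLE()        ## maximize the Laplace-approximated marginal likelihood
summaryLaplace(cLaplace, mle)    ## parameter estimates, standard errors, random effects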
These features are documented via the R help system and in a new section at the end of our User Manual. We’re excited for users to try out the new features and let us know about their experiences. In particular, given these major additions to the NIMBLE system, we anticipate the possibility of minor glitches. The best place to reach out for support is still the nimble-users list.
In addition to the new functionality above, other enhancements and bug fixes include:
- Fixing a bug (previously reported in a nimble-users message) that gave incorrect results in NIMBLE’s cross-validation function (`runCrossValidate`) for all loss functions except the ‘predictive’ loss, affecting NIMBLE versions 0.10.0 – 0.13.2.
- Fixing a bug in conjugacy checking causing incorrect identification of conjugate relationships in models with unusual uses of subsets, supersets, and slices of multivariate normal nodes.
- Improving control of the `addSampler` method for MCMC.
- Improving the WAIC system in a few small ways.
- Enhancing error trapping and warning messages.
Please see the NEWS file in the package source for more details.
Version 0.13.1 of NIMBLE released
We’ve released the newest version of NIMBLE on CRAN and on our website. This is purely a bug-fix release, addressing a bug introduced by our new handling of predictive nodes in version 0.13.0 (released in November). If you installed version 0.13.0, please upgrade to 0.13.1.
Bug in newly-released version 0.13.0 affecting MCMC for models with predictive nodes
We recently released version 0.13.0, which has some improvements in how we handle predictive nodes in NIMBLE’s MCMC engine.
Unfortunately, we realized (thanks to a user post from a couple of days ago) that there is a bug in this new approach to predictive nodes.
If you haven’t upgraded to version 0.13.0, simply wait to upgrade until we release a bug fix in version 0.13.1 in the next couple of weeks.
If you have upgraded to version 0.13.0 and if you have run an MCMC on a model that both (1) has predictive nodes and (2) has multivariate nodes, then the bug might affect your results. Please set:
nimbleOptions(MCMCusePredictiveDependenciesInCalculations = TRUE)
and then reconfigure/rebuild and rerun your MCMC. The option above will ensure that the MCMC behaves as it would in previous versions of NIMBLE.
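For concreteness, the workaround sequence might look like the following minimal sketch (assuming `model` is your existing, uncompiled model object; the iteration count is illustrative):

library(nimble)
nimbleOptions(MCMCusePredictiveDependenciesInCalculations = TRUE)

conf <- configureMCMC(model)    ## reconfigure, starting from your existing nimbleModel
mcmc <- buildMCMC(conf)         ## rebuild the MCMC
cmcmc <- compileNimble(mcmc, project = model, resetFunctions = TRUE)
samples <- runMCMC(cmcmc, niter = 10000)   ## rerun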
Version 0.13.0 of NIMBLE released
Changes in this release include:
- Thoroughly revamping the handling of posterior predictive nodes in the MCMC system; in particular, MCMC samplers will now, by default, exclude predictive dependencies from internal sampler calculations. This should improve MCMC mixing for models with predictive nodes. Posterior predictive nodes are now sampled, conditional on all other model nodes, at the end of each MCMC iteration.
- Adding functionality to the MCMC configuration system, including a new `replaceSamplers` method (see the sketch after this list) and updates to the arguments of the `addSampler` method.
- Adding an option to the WAIC system to allow additional burnin (in addition to standard MCMC burnin) before calculating online WAIC, thereby allowing inspection of initial samples without forcing them to be used for WAIC.
- Warning users of unused constants during model building.
- Fixing bugs that prevented use of variables starting with ‘logProb’ or named ‘i’ in model code.
- Fixing a bug to prevent infinite recursion in particular cases in conjugacy checking.
- Fixing a bug in simulating from `dcar_normal` nodes when multiple nodes are passed to `simulate`.
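As a minimal sketch of the new `replaceSamplers` method (the target node `beta` and the choice of a slice sampler are hypothetical, and we assume the method accepts the same target/type arguments as `addSampler`):

library(nimble)
conf <- configureMCMC(model)    ## 'model' is an existing nimbleModel
## Swap whatever sampler is currently assigned to 'beta' in a single step,
## rather than calling removeSamplers() followed by addSampler():
conf$replaceSamplers(target = 'beta', type = 'slice')
conf$printSamplers()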
We’re looking for a programmer
The NIMBLE project anticipates having some funding for a part-time programmer to implement statistical algorithms and make improvements to nimble’s core code. Examples may include building adaptive Gaussian quadrature in nimble’s programming system and expanding nimble’s hierarchical model system. Remote work is possible. This is not a formal job solicitation, but if you are interested, please send a CV/resume to nimble.stats@gmail.com so we can have you on our list. Key skills include experience with hierarchical statistical modeling algorithms, R programming, and nimble itself; experience with C++ will be helpful but is not required.
NIMBLE virtual short course, January 4-6, 2023
We’ll be holding a virtual training workshop on NIMBLE, January 4-6, 2023 from 8 am to 1 pm US Pacific (California) time each day. NIMBLE is a system for building and sharing analysis methods for statistical models, especially for hierarchical models and computationally-intensive methods (such as MCMC and SMC).
Recently we added support for automatic differentiation (AD) to NIMBLE in a beta release, and the workshop will cover NIMBLE’s AD capabilities in detail.
The workshop will cover the following material:
- the basic concepts and workflows for using NIMBLE and converting BUGS or JAGS models to work in NIMBLE.
- overview of different MCMC sampling strategies and how to use them in NIMBLE, including Hamiltonian Monte Carlo (HMC).
- writing new distributions and functions for more flexible modeling and more efficient computation.
- tips and tricks for improving computational efficiency.
- using advanced model components, including Bayesian non-parametric distributions (based on Dirichlet process priors), conditional auto-regressive (CAR) models for spatially correlated random fields, Laplace approximation, and reversible jump samplers for variable selection.
- an introduction to programming new algorithms in NIMBLE.
- use of automatic differentiation (AD) in algorithms.
- calling R and compiled C++ code from compiled NIMBLE models or functions.
If you are interested in attending, please pre-register. Registration fees will be $125 (regular) or $50 (student). We are also offering a process (see the pre-registration form) for students to request a fee waiver.
The workshop will assume attendees have a basic understanding of hierarchical/Bayesian models and MCMC, the BUGS (or JAGS) model language, and some familiarity with R.
Beta version of NIMBLE with automatic differentiation, including HMC sampling and Laplace approximation
We’re excited to announce that NIMBLE now supports automatic differentiation (AD), also known as algorithmic differentiation, in a beta version available on our website. In this beta version, NIMBLE now provides:
- Hamiltonian Monte Carlo (HMC) sampling for an entire parameter vector or arbitrary subsets of the parameter vector (i.e., combined with other samplers for the remaining parameters).
- Laplace approximation for approximate integration over latent states in a model, allowing maximum likelihood estimation and MCMC based on the marginal likelihood (via the RW_llFunction samplers).
- The ability for users and algorithm developers to write nimbleFunctions that calculate derivatives of functions, supporting many, but not all, of the mathematical operations available in the NIMBLE language.
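As a minimal sketch of what writing derivative-enabled nimbleFunctions can look like (the functions below are illustrative toys of our own, not examples from the release):

library(nimble)

## A function whose derivatives we want; buildDerivs enables AD for its run code.
ssq <- nimbleFunction(
    run = function(x = double(1)) {
        return(sum(x^2))
        returnType(double(0))
    },
    buildDerivs = 'run'
)

## Derivatives are requested from another nimbleFunction via nimDerivs().
gradSsq <- nimbleFunction(
    run = function(x = double(1)) {
        d <- nimDerivs(ssq(x), wrt = 'x', order = 1)
        return(d$jacobian[1, ])   ## the gradient: row 1 of the 1 x length(x) Jacobian
        returnType(double(1))
    }
)

gradSsq(c(1, 2, 3))                  ## uncompiled execution, useful for checking
cGradSsq <- compileNimble(gradSsq)   ## compiled version uses C++-based AD
cGradSsq(c(1, 2, 3))                 ## returns 2, 4, 6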
We’re making this beta release available to allow our users to test and evaluate the AD functionality and the new algorithms, but it is not recommended for production use at this stage. So please give it a try, and let us know of any problems or suggestions you have, either via the nimble-users list, bug reports to our GitHub repository, or email to nimble.stats@gmail.com.
You can download the beta version and view an extensive draft manual for the AD functionality.
We plan to release this functionality in the next NIMBLE release on CRAN in the coming months.
Version 0.12.2 of NIMBLE released, including an important bug fix for some models using Bayesian nonparametrics with the dCRP distribution
We’ve released the newest version of NIMBLE on CRAN and on our website. NIMBLE is a system for building and sharing analysis methods for statistical models, especially for hierarchical models and computationally-intensive methods (such as MCMC and SMC).
Version 0.12.2 is a bug fix release. In particular, it fixes a bug in our Bayesian nonparametrics (BNP) functionality that gives incorrect MCMC results for some models: specifically, models using the dCRP distribution in which the parameters of the mixture components (i.e., the clusters) have hyperparameters (i.e., base measure parameters) that are unknown and sampled during the MCMC. Here is a basic example of a model structure affected by the bug:
k[1:n] ~ dCRP(alpha, n)
for(i in 1:n) {
    y[i] ~ dnorm(mu[k[i]], 1)
    mu[i] ~ dnorm(mu0, 1)    ## mixture component parameters with hyperparameter
}
mu0 ~ dnorm(0, 1)            ## unknown cluster hyperparameter
(There is no problem without the hyperparameter layer – i.e., if mu0 is a fixed value – which is the situation in many models.)
We strongly encourage users using models with this type of structure to rerun their analyses, and we apologize for this issue.
Other changes in this release include:
- Fixing an issue with reversible jump variable selection in a situation similar to the BNP issue discussed above (specifically, where the regression coefficients under consideration have unknown hyperparameters, which is likely an unusual use case).
- Fixing a bug preventing setup of conjugate samplers for dwishart or dinvwishart nodes when using dynamic indexing.
- Fixing a bug preventing use of truncation bounds specified via `data` or `constants`.
- Fixing a bug preventing MCMC sampling with the LKJ prior for 2×2 matrices.
- Fixing a bug in `runCrossValidate` affecting extraction of multivariate nodes.
- Fixing a bug producing incorrect subset assignment into logical vectors in nimbleFunction code.
- Fixing a bug preventing use of `nimbleExternalCall` with a constant expression.
- Fixing a bug preventing use of recursion in nimbleFunctions without setup code.
Please see the release notes on our website for more details.