- BNT supports many types of conditional probability distributions (node types),
and it is easy to add more (a small construction sketch follows this list).
- Tabular (multinomial)
- Gaussian
- Softmax (logistic/sigmoid)
- Multi-layer perceptron (neural network)
- Noisy-or
- Deterministic
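
For concreteness, here is a minimal sketch (not taken verbatim from the BNT documentation) of building a small all-tabular network, the classic water-sprinkler example; the node names and CPT values are illustrative, and the other CPD types (gaussian_CPD, softmax_CPD, noisyor_CPD, ...) are attached to nodes in the same way.

```matlab
% Water-sprinkler network: Cloudy -> {Sprinkler, Rain} -> WetGrass.
N = 4;
C = 1; S = 2; R = 3; W = 4;
dag = zeros(N, N);
dag(C, [S R]) = 1;
dag(S, W) = 1;
dag(R, W) = 1;

node_sizes = 2 * ones(1, N);                 % all nodes are binary
bnet = mk_bnet(dag, node_sizes, 'discrete', 1:N);

% Tabular (multinomial) CPDs; gaussian_CPD, softmax_CPD, noisyor_CPD,
% etc. follow the same constructor pattern.
bnet.CPD{C} = tabular_CPD(bnet, C, 'CPT', [0.5 0.5]);
bnet.CPD{S} = tabular_CPD(bnet, S, 'CPT', [0.5 0.9 0.5 0.1]);
bnet.CPD{R} = tabular_CPD(bnet, R, 'CPT', [0.8 0.2 0.2 0.8]);
bnet.CPD{W} = tabular_CPD(bnet, W, 'CPT', [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
```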
- BNT supports decision and utility nodes, as well as chance nodes,
i.e., influence diagrams as well as Bayes nets.
- BNT supports static and dynamic BNs (useful for modelling dynamical systems
and sequence data).
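
As an illustration, here is a hedged sketch of an HMM written as a two-slice DBN (one hidden node Q and one observed node Y per slice); the constructor options below are the standard mk_dbn arguments, but treat the details as assumptions to check against the distribution's DBN docs.

```matlab
% HMM as a DBN: Q(t) -> Y(t) within a slice, Q(t) -> Q(t+1) across slices.
intra = zeros(2, 2); intra(1, 2) = 1;        % Q -> Y
inter = zeros(2, 2); inter(1, 1) = 1;        % Q(t) -> Q(t+1)
ns = [2 3];                                  % 2 hidden states, 3 output symbols
dbn = mk_dbn(intra, inter, ns, 'discrete', 1:2, 'observed', 2);

% Three CPDs: prior over Q(1), emission Y|Q, and transition Q(2)|Q(1).
% Omitting 'CPT' leaves the default initialisation; pass explicit tables
% to fix the parameters.
dbn.CPD{1} = tabular_CPD(dbn, 1);
dbn.CPD{2} = tabular_CPD(dbn, 2);
dbn.CPD{3} = tabular_CPD(dbn, 3);
```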
- BNT supports many different inference algorithms,
and it is easy to add more (brief usage sketches follow the groups below).
- Exact inference for static BNs:
- junction tree
- variable elimination
- brute force enumeration (for discrete nets)
- linear algebra (for Gaussian nets)
- Pearl's algorithm (for polytrees)
- quickscore (for QMR)
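
Continuing the sprinkler sketch above, exact inference looks roughly like the following; the other exact engines (e.g. var_elim_inf_engine, enumerative_inf_engine) expose the same interface, so they are intended as drop-in replacements for the junction-tree engine.

```matlab
% Junction-tree inference: Pr(Sprinkler | WetGrass = true).
engine = jtree_inf_engine(bnet);
evidence = cell(1, N);
evidence{W} = 2;                             % state 2 = 'true'
[engine, loglik] = enter_evidence(engine, evidence);
m = marginal_nodes(engine, S);
p_sprinkler = m.T                            % posterior over Sprinkler
```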
- Approximate inference for static BNs:
- likelihood weighting
- Gibbs sampling
- loopy belief propagation
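
The approximate engines use the same enter_evidence/marginal_nodes interface, so swapping one in is a one-line change; the constructor names below come from the BNT distribution, but their options vary, so treat this as a sketch rather than a recipe.

```matlab
% Swap in an approximate engine for the same query as above.
engine = likelihood_weighting_inf_engine(bnet);
% engine = gibbs_sampling_inf_engine(bnet);   % MCMC alternative (see its help)
% engine = belprop_inf_engine(bnet);          % loopy belief propagation
engine = enter_evidence(engine, evidence);
m = marginal_nodes(engine, S);
```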
- Exact inference for DBNs:
- junction tree
- frontier algorithm
- forwards-backwards (for HMMs)
- Kalman-RTS (for LDSs)
- Approximate inference for DBNs:
- Boyen-Koller
- factored-frontier/loopy belief propagation
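
Continuing the DBN sketch above, offline smoothing can be run roughly as follows; bk_inf_engine (Boyen-Koller) is an approximate counterpart with the same interface. The exact argument conventions are assumptions to verify against the DBN usage docs.

```matlab
% Smoothing in the HMM-style DBN defined earlier.
T = 10;                                      % sequence length
engine = smoother_engine(jtree_2TBN_inf_engine(dbn));
% engine = bk_inf_engine(dbn, 'clusters', 'exact');  % Boyen-Koller (see its help)

ev = sample_dbn(dbn, T);                     % draw a length-T sequence
evidence = cell(2, T);
evidence(2, :) = ev(2, :);                   % keep only the observed row (Y)
engine = enter_evidence(engine, evidence);
m = marginal_nodes(engine, 1, T);            % Pr(Q_T | y_{1:T})
```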
- BNT supports several methods for parameter learning,
and it is easy to add more (a short EM sketch follows this list).
- Batch MLE/MAP parameter learning using EM.
(Each node type has its own M method, e.g. softmax nodes use IRLS,
and each inference engine has its own E method, so the code is fully modular.)
- Sequential/batch Bayesian parameter learning (for fully observed tabular nodes only).
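
As a sketch, EM on the sprinkler network with one node hidden might look like the following; names from the earlier sketch are reused, and the initialisation details and helper names in the closing comment are assumptions to check against the learning docs (e.g. `help tabular_CPD`).

```matlab
% Generate training cases from the true network, then hide Cloudy.
nsamples = 100;
samples = cell(N, nsamples);
for l = 1:nsamples
  samples(:, l) = sample_bnet(bnet);
end
samples(C, :) = {[]};                        % Cloudy is unobserved in every case

% Fresh network with the same structure but unknown parameters.
bnet0 = mk_bnet(dag, node_sizes, 'discrete', 1:N);
for i = 1:N
  bnet0.CPD{i} = tabular_CPD(bnet0, i);      % default initial CPTs
end

% EM: the engine supplies the E step, each CPD type its own M step.
engine = jtree_inf_engine(bnet0);
max_iter = 20;
[bnet_hat, LLtrace] = learn_params_em(engine, samples, max_iter);

% With fully observed data, learn_params does MLE/MAP directly, and
% bayes_update_params performs (sequential) Bayesian updating.
```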
- BNT supports several methods for regularization,
and it is easy to add more (a short configuration sketch follows this list).
- Any node can have its parameters clamped (made non-adjustable).
- Any set of compatible nodes can have their parameters tied (cf.
weight sharing in a neural net).
- Some node types (e.g., tabular) support priors for MAP estimation.
- Gaussian covariance matrices can be declared full or diagonal, and can
be tied across states of their discrete parents (if any).
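
For example, tying and priors are set when the network and CPDs are constructed; the option names below ('equiv_class', 'prior_type', 'cov_type', 'tied_cov') are taken from the standard BNT constructors, but check the local help text, since this is only a sketch.

```matlab
% Two analogous children of one parent, tied via an equivalence class.
dag2 = zeros(3, 3); dag2(1, [2 3]) = 1;
ns2 = [2 2 2];
eclass = [1 2 2];                            % nodes 2 and 3 share a CPD
bnet2 = mk_bnet(dag2, ns2, 'discrete', 1:3, 'equiv_class', eclass);

% Dirichlet prior on node 1, so counting/EM gives MAP rather than ML.
bnet2.CPD{1} = tabular_CPD(bnet2, 1, 'prior_type', 'dirichlet');
% One CPD object serves both tied nodes (the CPD cell is indexed by class).
bnet2.CPD{2} = tabular_CPD(bnet2, 2);

% For Gaussian nodes, covariance structure is chosen the same way, e.g.
% gaussian_CPD(bnet, i, 'cov_type', 'diag', 'tied_cov', 1).
```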
- BNT supports several methods for structure learning,
and it is easy to add more (a short sketch follows this list).
- Bayesian structure learning,
using MCMC or local search (for fully observed tabular nodes only).
- Constraint-based structure learning (IC/PC and IC*/FCI).
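
Continuing the sprinkler sketch, here is a hedged example of score-based structure learning from fully observed data; learn_struct_K2 and learn_struct_mcmc come from BNT's structure-learning code, though the exact options may differ across versions.

```matlab
% Fully observed discrete data: data(i,m) = value of node i in case m.
ncases = 200;
data = zeros(N, ncases);
for l = 1:ncases
  data(:, l) = cell2num(sample_bnet(bnet));
end

order = [C S R W];                           % K2 requires a node ordering
dag_hat = learn_struct_K2(data, node_sizes, order);

% Bayesian structure learning over DAGs by MCMC:
% [dags, accept_rate] = learn_struct_mcmc(data, node_sizes, 'nsamples', 500);
```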
- The source code is extensively documented, object-oriented, and free, making it
an excellent tool for teaching, research and rapid prototyping.