Modeling is done with three classes: SimpleGaussianModel, GaussianMixtureModel, and Hmm. All three are derived from GaussianModelBase.
In addition, there are several other classes supporting operations on collections of models: GmmMgr, HmmMgr, UttLattice, AdaptingGmmClassifier, and AdaptingGmmClassProcessor.
Say something about scoring here.
Adaptation interfaces are available at many levels in the AM package. There are also several styles of interface, reflecting both the nature of the underlying algorithms and user demand; the table below summarizes which interfaces each class provides.
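For a rough feel for the difference between the one-shot and begin/end styles listed in the table below, here is a minimal, self-contained Python sketch using a toy mean-only model. The class and the begin_adapt()/accumulate()/end_adapt() names are invented for this illustration and are not the AM package's API; only adapt() is a name that appears in the notes below.

```python
import numpy as np

class ToyMeanModel:
    """Toy single-mean 'model' used only to illustrate the two calling styles."""

    def __init__(self, dim):
        self.mean = np.zeros(dim)
        self._sum = np.zeros(dim)
        self._count = 0

    # One-shot style: all adaptation data is supplied in a single call.
    def adapt(self, frames):
        self.mean = np.asarray(frames).mean(axis=0)

    # Begin/end style: open an accumulation phase, feed data incrementally,
    # then apply the accumulated statistics in one final step.
    def begin_adapt(self):
        self._sum[:] = 0.0
        self._count = 0

    def accumulate(self, frames):
        frames = np.asarray(frames)
        self._sum += frames.sum(axis=0)
        self._count += len(frames)

    def end_adapt(self):
        if self._count:
            self.mean = self._sum / self._count

# One-shot usage: a single call with all the data.
one_shot = ToyMeanModel(dim=2)
one_shot.adapt(np.random.randn(100, 2))

# Begin/end usage: accumulate over several utterances, then finish.
incremental = ToyMeanModel(dim=2)
incremental.begin_adapt()
for _ in range(3):
    incremental.accumulate(np.random.randn(50, 2))
incremental.end_adapt()
```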
Class | One-shot? | Begin/End? | Relevance? | Algorithm | Notes |
---|---|---|---|---|---|
SimpleGaussianModel | Yes | Yes | Yes | Closed form | The simplest case |
GaussianMixtureModel | Yes | No | Yes | EM | |
Hmm | Yes | Yes | No | Baum-Welch | See also UttLattice |
GmmMgr | No | Yes | No | EM | |
HmmMgr | No | Partial | No | ?? | |
UttLattice | Yes | ?? | No | Baum-Welch | |
AdaptingGmmClassifier | Yes | No | Yes | EM | |
AdaptingGmmClassProcessor | Yes | No | Yes | EM | |
Notes:
1. The GaussianMixtureModel adapt() function can do multiple EM iterations over a single collection of datapoints. Each iteration re-estimates the weights, means, and variances used in the next iteration (see the first sketch after these notes).
2. The GmmMgr class supports accumulation and the application of accumulators. This is used by the Hmm and UttLattice classes in their implementation of Baum-Welch training of Hmms (see the second sketch after these notes). It could also be used by an external EM implementation, but currently there isn’t one.
3. Note about HmmMgr
4. Note about UttLattice
5. The AdaptingGmmClassifier class uses the one-shot adapt() function in GaussianMixtureModel to do adaptation.
6. The AdaptingGmmClassProcessor class uses the one-shot interface of the AdaptingGmmClassifier to do adaptation. An input event consists of a single label and a sequence of training frames; these frames are then used together in several EM iterations (see note 1).
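To make note 1 concrete, here is a minimal, self-contained sketch of several EM iterations over a single collection of datapoints for a diagonal-covariance mixture, where each iteration re-estimates the weights, means, and variances used by the next. It is a generic illustration of the algorithm, not the GaussianMixtureModel implementation; the function name and arguments are invented for the example.

```python
import numpy as np

def em_adapt(data, weights, means, variances, num_iterations=3):
    """Run several EM iterations over one collection of datapoints,
    re-estimating the weights, means, and (diagonal) variances each time."""
    data = np.asarray(data)                                   # (N, dim)
    for _ in range(num_iterations):
        # E-step: per-component log densities and responsibilities.
        log_prob = (-0.5 * (np.log(2 * np.pi * variances)
                            + (data[:, None, :] - means) ** 2 / variances).sum(axis=2)
                    + np.log(weights))                        # (N, K)
        log_norm = np.logaddexp.reduce(log_prob, axis=1, keepdims=True)
        resp = np.exp(log_prob - log_norm)                    # responsibilities, (N, K)

        # M-step: re-estimate the parameters from the responsibilities.
        counts = resp.sum(axis=0)                             # effective count per component
        weights = counts / counts.sum()
        means = (resp.T @ data) / counts[:, None]
        variances = (resp.T @ data ** 2) / counts[:, None] - means ** 2
        variances = np.maximum(variances, 1e-6)               # simple variance floor
    return weights, means, variances

# Example: adapt a 4-component mixture to 200 two-dimensional frames.
rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 2))
K, dim = 4, 2
w, mu, var = em_adapt(frames,
                      weights=np.full(K, 1.0 / K),
                      means=rng.normal(size=(K, dim)),
                      variances=np.ones((K, dim)))
```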
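Note 2's accumulate-then-apply pattern can be sketched in the same generic style: statistics are gathered from one or more sources (for example, per-utterance posteriors from a Baum-Welch pass) into an accumulator, and parameters are only re-estimated when the accumulator is applied. The class and method names below are invented for the illustration and are not the GmmMgr API.

```python
import numpy as np

class GaussianAccumulator:
    """Sufficient-statistics accumulator for one diagonal Gaussian,
    illustrating the accumulate/apply split described in note 2."""

    def __init__(self, dim):
        self.count = 0.0
        self.sum = np.zeros(dim)
        self.sum_sq = np.zeros(dim)

    def accumulate(self, frames, gammas):
        """Add posterior-weighted statistics from one utterance.
        In Baum-Welch training, gammas would come from a forward-backward pass."""
        frames = np.asarray(frames)
        gammas = np.asarray(gammas)[:, None]
        self.count += gammas.sum()
        self.sum += (gammas * frames).sum(axis=0)
        self.sum_sq += (gammas * frames ** 2).sum(axis=0)

    def apply(self):
        """Turn the accumulated statistics into re-estimated parameters."""
        mean = self.sum / self.count
        variance = np.maximum(self.sum_sq / self.count - mean ** 2, 1e-6)
        return mean, variance

# Accumulate over several utterances, then apply once.
acc = GaussianAccumulator(dim=2)
for frames, gammas in [(np.random.randn(30, 2), np.random.rand(30)),
                       (np.random.randn(40, 2), np.random.rand(40))]:
    acc.accumulate(frames, gammas)
mean, variance = acc.apply()
```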