Hello all!
I'm happy to present Yet Another Hidden Markov Model (yahmm), a new Python package written by a friend and me. It aims to make HMMs easy to use while remaining comprehensive in scope, and it is written in Cython for speed.
Features:
- Build your graph node by node and edge by edge, instead of using matrix format (but you can still use matrix format if you'd like!)
- States are not limited to a single distribution type. Not only can different states in the same model have different distributions, but a single state can be an arbitrary weighted mixture of any distributions you'd like, or just be silent.
- Normal, Exponential, Uniform, Gamma, Inverse-Gamma, Discrete, and Lambda distributions implemented, as well as Gaussian, Uniform, and Triangular kernel densities (and of course, mixtures), plus a simple way to define your own distributions.
- Implements the forward, backward, forward-backward, and Viterbi algorithms, all in O( states * edges ) time instead of doing full-graph computations, for significant speedups on sparse models.
- Both Baum-Welch and Viterbi training implemented, with support for tied states.
- Auto-normalization of edge weights to sum to 1
- Options to simplify the graph structure by merging silent states that have a single probability-1 out-edge
- Models can be written to and read from human-readable files, so time-intensive training can be done once and the results stored for future use
- Sampling from the model
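For anyone unfamiliar with HMMs, here's a rough idea of what the forward algorithm computes: the total probability of an observed sequence, summed over all hidden state paths. This is just a toy pure-Python sketch of the classic recursion over an explicit edge list (which is the intuition behind the O( states * edges ) claim: only edges that exist contribute); it is not yahmm's Cython implementation, and the model/API here is made up for illustration:

```python
# Toy discrete HMM, illustrative only (not yahmm's API or internals).
states = ["rainy", "sunny"]
start = {"rainy": 0.6, "sunny": 0.4}
# Sparse edge list: (from_state, to_state, transition probability).
edges = [
    ("rainy", "rainy", 0.7), ("rainy", "sunny", 0.3),
    ("sunny", "rainy", 0.4), ("sunny", "sunny", 0.6),
]
emit = {
    "rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def forward(sequence):
    """Return P(sequence) under the model, summed over all state paths."""
    # Initialize with start probabilities times the first emission.
    alpha = {s: start[s] * emit[s][sequence[0]] for s in states}
    for symbol in sequence[1:]:
        new = {s: 0.0 for s in states}
        # One pass over the edge list per observation: each edge
        # propagates probability mass from its source to its target.
        for src, dst, p in edges:
            new[dst] += alpha[src] * p
        alpha = {s: new[s] * emit[s][symbol] for s in states}
    return sum(alpha.values())

print(forward(["walk", "shop", "clean"]))
```

The backward and Viterbi recursions walk the same edge list, with Viterbi taking a max over incoming edges instead of a sum.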
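Sampling is conceptually just a random walk over that same structure: pick a start state, then alternately emit a symbol and follow a transition, each by its probability. Again a toy sketch with made-up names, not yahmm's actual sampler:

```python
import random

# Toy discrete-HMM sampler, illustrative only (not yahmm's API).
start = {"rainy": 0.6, "sunny": 0.4}
trans = {
    "rainy": {"rainy": 0.7, "sunny": 0.3},
    "sunny": {"rainy": 0.4, "sunny": 0.6},
}
emit = {
    "rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def weighted_choice(dist, rng):
    """Draw a key from {item: probability} by inverse-CDF sampling."""
    r = rng.random()
    for item, p in dist.items():
        r -= p
        if r <= 0:
            return item
    return item  # guard against floating-point rounding

def sample(length, seed=0):
    """Generate a sequence of `length` observations from the model."""
    rng = random.Random(seed)
    state = weighted_choice(start, rng)
    observations = []
    for _ in range(length):
        observations.append(weighted_choice(emit[state], rng))
        state = weighted_choice(trans[state], rng)
    return observations

print(sample(5))
```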
Check out the github repo here: https://github.com/jmschrei/yahmm
Get it quickly with pip: just use "pip install yahmm".
Please check it out, and let us know what you think! Comments always appreciated.
Requirements:
- numpy >= 1.8.0
- scipy >= 0.13.3
- networkx >= 1.8.1
- matplotlib >= 1.3.1