forwardBackward {RHmm} | R Documentation
Description

The forward-backward procedure computes the quantities needed by the Baum-Welch algorithm.
Usage

forwardBackward(HMM, obs)
Arguments

HMM: a HMMClass or a HMMFitClass object
obs: a vector (or matrix) of observations, or a list of vectors (or matrices) if there is more than one sample
Value

If obs is a single sample, a list with the following elements; if obs is a list of samples, a list of such lists, one per sample. See Note for the mathematical definitions.
Alpha: the matrix of 'forward' probabilities (size: number of obs. times number of hidden states)
Beta: the matrix of 'backward' probabilities (size: number of obs. times number of hidden states)
Gamma: the matrix of probabilities of being in state i at time t (size: number of obs. times number of hidden states)
Xsi: the probabilities of being in state i at time t and in state j at time t + 1 (one matrix of size number of hidden states times number of hidden states per time step)
Rho: the vector of probabilities of seeing the partial sequence obs[1], ..., obs[t] (length: number of obs.)
LLH: the log-likelihood
Note

Let obs = (obs[1], ..., obs[T]) be the vector of observations, and O = (O[t], t=1, ..., T) the corresponding random variables. Let (Q[t], t=1, ..., T) be the hidden Markov chain, whose values are in {1, ..., nStates}. We have the following definitions:
Alpha[i][t] = P(O[1]=obs[1], ..., O[t]=obs[t], Q[t]=i | HMM), which is the probability of seeing the partial sequence obs[1], ..., obs[t] and ending up in state i at time t.
Beta[i][t] = P(O[t+1]=obs[t+1], ..., O[T]=obs[T] | Q[t]=i, HMM), which is the probability of the ending partial sequence obs[t+1], ..., obs[T] given that the chain is in state i at time t.
Gamma[i][t] = P(Q[t]=i | O=obs, HMM), which is the probability of being in state i at time t given the observation sequence O=obs.
Xsi[i][j][t] = P(Q[t]=i, Q[t+1]=j | O=obs, HMM), which is the probability of being in state i at time t and in state j at time t + 1, given the observation sequence O=obs.
Rho[t] = P(O[1]=obs[1], ..., O[t]=obs[t] | HMM), which is the probability of seeing the partial sequence obs[1], ..., obs[t].
LLH=ln(Rho[T])
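The recursions behind these quantities can be sketched in base R for a small discrete-emission HMM. This is an illustrative sketch only: the toy model, variable names, and the discrete emission matrix below are assumptions for the example, not RHmm internals (RHmm also supports continuous emission distributions).

```r
# Toy 2-state HMM with 2 observation symbols (hypothetical example model)
initProb <- c(0.6, 0.4)                         # P(Q[1]=i)
transMat <- matrix(c(0.7, 0.3,
                     0.4, 0.6), 2, byrow=TRUE)  # P(Q[t+1]=j | Q[t]=i)
emissMat <- matrix(c(0.9, 0.1,
                     0.2, 0.8), 2, byrow=TRUE)  # P(O[t]=k | Q[t]=i)
obs <- c(1, 2, 1)                               # observed symbols
T_ <- length(obs); nStates <- 2

# Forward pass: Alpha[t, i] = P(obs[1..t], Q[t]=i)
Alpha <- matrix(0, T_, nStates)
Alpha[1, ] <- initProb * emissMat[, obs[1]]
for (t in 2:T_)
  Alpha[t, ] <- (Alpha[t-1, ] %*% transMat) * emissMat[, obs[t]]

# Backward pass: Beta[t, i] = P(obs[t+1..T] | Q[t]=i)
Beta <- matrix(0, T_, nStates)
Beta[T_, ] <- 1
for (t in (T_-1):1)
  Beta[t, ] <- transMat %*% (emissMat[, obs[t+1]] * Beta[t+1, ])

Rho   <- rowSums(Alpha)           # Rho[t] = P(obs[1..t])
LLH   <- log(Rho[T_])             # log-likelihood of the whole sequence
Gamma <- Alpha * Beta / Rho[T_]   # Gamma[t, i] = P(Q[t]=i | obs)

# Xsi[i, j, t] = P(Q[t]=i, Q[t+1]=j | obs)
Xsi <- array(0, c(nStates, nStates, T_ - 1))
for (t in 1:(T_ - 1))
  Xsi[, , t] <- (Alpha[t, ] %o% (emissMat[, obs[t+1]] * Beta[t+1, ])) *
                transMat / Rho[T_]
```

As a sanity check, each row of Gamma sums to 1, each time slice of Xsi sums to 1, and summing Xsi over j recovers Gamma for t < T.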
References

Bilmes, Jeff A. (1997). A Gentle Tutorial of the EM Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models. http://ssli.ee.washington.edu/people/bilmes/mypapers/em.ps.gz
Examples

data(n1d_3s)
# Fit a 3-state Gaussian model
Res_n1d_3s <- HMMFit(obs_n1d_3s, nStates=3)
# Run the forward-backward procedure
fb <- forwardBackward(Res_n1d_3s, obs_n1d_3s)