I am interested in modeling a continuous time series using a discrete-time hidden Markov model that can account for, or rather condition on, a fixed-size subset of h latent states (X). In other words, the output variable Y at time t depends on an h-dimensional vector of X's, where each dimension takes on a specific value (e.g., X = ABCD where h = 4). More concretely, I would like to embed depth into the time-series prediction using latent, exogenous, discrete-valued variables, as opposed to the traditional autoregressive approach that only looks at past observations.

The canonical HMM is not fit for the task, since there is only one hidden variable per output observation Y. Would a Factorial HMM suffice, where we can have h = 4 hidden variables per output observation Y? I'm not too clear on the practical applications of Factorial HMMs, and as an alternative I have looked at building a simpler directed Bayesian network to account for the 2^4 different conditional probability distributions present in this problem (a rough sketch of the generative process I have in mind is below).

I would prefer to use an HMM, or any other graphical model, that can take non-stationarity into account (i.e., something suited for discrete time series prediction). Thanks in advance for any help with this.
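To make the setup concrete, here is a minimal sketch of the generative process I have in mind, written in plain NumPy. The chain count, transition matrices, and Gaussian emission are placeholder assumptions for illustration, not a fixed design:

```python
import numpy as np

rng = np.random.default_rng(0)

h = 4        # number of latent chains (X1..X4)
K = 2        # states per chain, so 2**h = 16 joint configurations
T = 200      # length of the observed series

# One KxK transition matrix per chain (placeholder values).
A = np.array([[[0.95, 0.05],
               [0.10, 0.90]] for _ in range(h)])

# Emission: Y_t ~ Normal(mu[config], sigma), one mean per joint configuration of the h chains.
mu = rng.normal(0.0, 3.0, size=K ** h)
sigma = 0.5

def config_index(x_t):
    """Map a length-h vector of chain states to a single index in [0, 2**h)."""
    return int(np.dot(x_t, K ** np.arange(h)))

x = np.zeros((T, h), dtype=int)
y = np.zeros(T)
for t in range(T):
    for c in range(h):
        if t == 0:
            x[t, c] = rng.integers(K)                       # uniform initial state
        else:
            x[t, c] = rng.choice(K, p=A[c, x[t - 1, c]])    # each chain evolves independently
    y[t] = rng.normal(mu[config_index(x[t])], sigma)        # Y depends on the joint configuration
```

For fitting, I understand the same model can be flattened into a single HMM over the 2^h joint configurations (16 states here) and trained with a standard HMM library, at the cost of losing the factored transition structure unless the large transition matrix is constrained. Is that the practical way a Factorial HMM would be used here, or is there a better-suited model?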