Abstract

We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden state transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.
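As an illustrative sketch of the decoupled approach described in the abstract (not the paper's exact estimator), the code below fits a Gaussian mixture to the pooled outputs and then recovers a row-stochastic transition matrix by solving a small convex quadratic program for each row. The function name `estimate_transition_matrix`, the plug-in pairwise statistic `C`, and the per-row least-squares formulation are assumptions made for illustration only.

```python
# Illustrative sketch only; names and the specific QP objective are assumptions,
# not the paper's exact estimator.
import numpy as np
from scipy.optimize import minimize
from sklearn.mixture import GaussianMixture

def estimate_transition_matrix(obs, n_states, seed=0):
    # Step 1: fit a mixture model to the stationary output distribution.
    gmm = GaussianMixture(n_components=n_states, random_state=seed).fit(obs)
    gamma = gmm.predict_proba(obs)      # soft state responsibilities, shape (T, k)
    pi = gamma.mean(axis=0)             # plug-in estimate of the stationary distribution

    # Plug-in estimate of joint probabilities of consecutive hidden states
    # (components are identified only up to a label permutation).
    C = gamma[:-1].T @ gamma[1:] / (len(obs) - 1)   # shape (k, k)

    # Step 2: for each row i, solve a convex QP:
    #   minimize ||pi_i * a - C_i||^2   subject to   a >= 0, sum(a) = 1.
    A = np.zeros((n_states, n_states))
    for i in range(n_states):
        obj = lambda a, i=i: np.sum((pi[i] * a - C[i]) ** 2)
        cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
        res = minimize(obj, x0=np.full(n_states, 1.0 / n_states),
                       bounds=[(0.0, 1.0)] * n_states, constraints=cons,
                       method="SLSQP")
        A[i] = res.x
    return A, gmm

# Example usage on synthetic 1-D data:
# A_hat, gmm = estimate_transition_matrix(x.reshape(-1, 1), n_states=3)
```

Splitting the estimation row by row keeps each subproblem a k-variable convex QP over the probability simplex, which is why the second step remains easy to solve once the mixture parameters are fixed.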

Original language: American English
Pages: 1739-1747
Number of pages: 9
State: Published - 1 Jan 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States
Duration: 16 Jun 2013 - 21 Jun 2013



