Conference

Two dissimilarity measures for HMMs and their application in phoneme model clustering

Abstract:

This paper introduces two approximations of the Kullback-Leibler divergence for hidden Markov models (HMMs). The first is a generalization of an approximation originally presented for HMMs with discrete observation densities; in that case, the HMMs are assumed to be ergodic and their topologies similar. The second is a modification of the first. The topologies of the HMMs are assumed to be left-to-right with no skips, but the models can have a different number of states, unlike in the first...
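
As background for the abstract, the sketch below illustrates the widely used Monte Carlo way of approximating the Kullback-Leibler divergence rate between two HMMs: draw a long observation sequence from one model and compare the per-frame log-likelihoods under both. It is not a reproduction of the paper's two approximations; the hmmlearn library, the Gaussian emission densities, and all parameter values are illustrative assumptions introduced here.

# A minimal sketch, NOT the paper's approximations: Monte Carlo estimate of the
# Kullback-Leibler divergence rate between two Gaussian-emission HMMs.
# The hmmlearn library and all parameter values below are illustrative assumptions.
import numpy as np
from hmmlearn import hmm


def make_hmm(startprob, transmat, means, covars):
    """Build a GaussianHMM with manually fixed (not fitted) parameters."""
    model = hmm.GaussianHMM(n_components=len(startprob), covariance_type="diag")
    model.startprob_ = np.asarray(startprob)
    model.transmat_ = np.asarray(transmat)
    model.means_ = np.asarray(means)
    model.covars_ = np.asarray(covars)
    return model


def mc_kl_rate(model_p, model_q, n_samples=5000, seed=0):
    """Estimate (1/T) * [log p(O | P) - log p(O | Q)] with O drawn from P."""
    obs, _ = model_p.sample(n_samples, random_state=seed)
    return (model_p.score(obs) - model_q.score(obs)) / n_samples


if __name__ == "__main__":
    # Two ergodic 3-state HMMs with 1-D Gaussian emissions (illustrative values).
    p = make_hmm(
        startprob=[0.6, 0.3, 0.1],
        transmat=[[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.2, 0.2, 0.6]],
        means=[[0.0], [2.0], [4.0]],
        covars=[[1.0], [1.0], [1.0]],
    )
    q = make_hmm(
        startprob=[0.5, 0.3, 0.2],
        transmat=[[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.1, 0.3, 0.6]],
        means=[[0.5], [2.5], [4.5]],
        covars=[[1.0], [1.0], [1.0]],
    )
    print("Estimated KL divergence rate D(P || Q):", mc_kl_rate(p, q))

A symmetrized version of such an estimate, for instance the average of D(P || Q) and D(Q || P), is the kind of dissimilarity measure typically fed to a clustering procedure such as the phoneme model clustering mentioned in the abstract.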



Authors


Institution: University of Oxford
Department: Oxford, MPLS, Statistics
Role: Author
Volume: 1
Publication date: 2002-01-01
ISSN: 1520-6149
URN: uuid:8604128b-005f-47d6-9cc4-6e5363883427
Source identifiers: 487894
Local pid: pubs:487894
