Difference of Convex Optimization
Recently Published Documents


TOTAL DOCUMENTS: 6 (FIVE YEARS: 3)
H-INDEX: 2 (FIVE YEARS: 1)

2021, pp. 1-36
Author(s): Ananda Theertha Suresh, Brian Roark, Michael Riley, Vlad Schogol

Abstract: Weighted finite automata (WFA) are often used to represent probabilistic models, such as n-gram language models, since, among other advantages, they are time- and space-efficient for recognition tasks. The probabilistic source to be represented as a WFA, however, may come in many forms. Given a generic probabilistic model over sequences, we propose an algorithm to approximate it as a weighted finite automaton such that the Kullback-Leibler divergence between the source model and the WFA target model is minimized. The proposed algorithm involves a counting step and a difference-of-convex optimization step, both of which can be performed efficiently. We demonstrate the usefulness of our approach on various tasks, including distilling n-gram models from neural models, building compact language models, and building open-vocabulary character models. The algorithms used for these experiments are available in an open-source software library.
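
The abstract names a difference-of-convex (DC) optimization step but does not spell it out. As a rough illustration of the general technique, the sketch below runs the classic DCA iteration on a toy one-dimensional objective f(x) = x^4 - 8x^2, split as g(x) = x^4 minus h(x) = 8x^2, both convex. The objective, the split, and the function name dca_1d are illustrative assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np

def dca_1d(x0, iters=50, tol=1e-10):
    """Minimize f(x) = x**4 - 8*x**2 via the DCA iteration.

    Write f = g - h with g(x) = x**4 and h(x) = 8*x**2 (both convex).
    Each step linearizes h at the current iterate x_k and solves the
    convex subproblem  min_x g(x) - h'(x_k) * x,  whose stationarity
    condition 4*x**3 = h'(x_k) has the closed form x = cbrt(h'(x_k)/4).
    """
    x = float(x0)
    for _ in range(iters):
        grad_h = 16.0 * x               # h'(x_k) = 16 * x_k
        x_new = np.cbrt(grad_h / 4.0)   # minimizer of the convex subproblem
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

print(dca_1d(1.0))  # converges toward the local minimizer x = 2
```

Starting from x0 = 1.0, the iterates increase monotonically toward the local minimizer x = 2. As with DC programming generally, DCA guarantees convergence only to a critical point of f, not to a global minimum.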


2017, Vol. 169 (1), pp. 119-140
Author(s): Emilio Carrizosa, Vanesa Guerrero, Dolores Romero Morales

2013, Vol. 140 (1), pp. 31-43
Author(s): Mirjam Dür, Jean-Baptiste Hiriart-Urruty
