Generalized Kullback-Leibler Divergence (CROSBI ID 658928)
Conference contribution in proceedings | conference abstract | international peer review
Authorship information
Pečarić, Josip ; Pokaz, Dora
English
Generalized Kullback-Leibler Divergence
The Kullback-Leibler divergence is a measure of how one probability distribution diverges from a second, reference probability distribution. It is also called relative entropy and the K-L distance. However, the word "distance" is a misnomer, since the K-L divergence does not possess all metric properties: it is asymmetric and does not satisfy the triangle inequality. The Kullback-Leibler divergence is the only divergence that belongs to both of two classes, the class of f-divergences and the class of Bregman divergences. We emphasize its membership in the class of f-divergences, which was introduced independently by Csiszár, Morimoto, and Ali and Silvey. Here, the key tool is the Jensen inequality, which relies on convex functions. We place the K-L divergence in the context of L-Lipschitzian functions and adapt the Jensen inequality to that case. In addition, we generalize the K-L divergence by adding a weight factor and prove a Jensen-type inequality for this generalized K-L divergence.
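The properties stated in the abstract can be illustrated with a minimal numerical sketch. This is not the authors' construction: the exact weighted form studied in the talk is not given here, so the per-term weighting in `weighted_kl_divergence` is only one plausible illustrative reading.

```python
import math

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * log(p_i / q_i), with 0 * log(0/q) taken as 0.

    Asymmetric: in general kl_divergence(p, q) != kl_divergence(q, p),
    which is why calling it a "distance" is a misnomer.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def weighted_kl_divergence(p, q, w):
    """Hypothetical weighted variant: each term is scaled by a weight w_i.

    With all weights equal to 1 this recovers the ordinary K-L divergence.
    """
    return sum(wi * pi * math.log(pi / qi)
               for wi, pi, qi in zip(w, p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))   # nonnegative, and zero iff p == q
print(kl_divergence(q, p))   # differs from the above: asymmetry
```

The nonnegativity printed above is exactly the content of the Jensen inequality applied to the convex function f(t) = t log t, which is the f-divergence representation of the K-L divergence mentioned in the abstract.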
f-divergence, Jensen inequality, K-L divergence, Lipschitzian function, relative entropy
not recorded
not recorded
not recorded
not recorded
not recorded
not recorded
Contribution details
17IT120221
2017.
published
Parent publication details
Conference details
ICMA 2017: 19th International Conference on Mathematics and Analysis, Roma, Italy
invited lecture
11.12.2017-12.12.2017
Rome, Italy