New Lower and Upper Bounds for f-Divergences With Applications in Information Theory (CROSBI ID 673212)
Conference paper in proceedings | extended abstract of a conference talk | international peer review
Authorship data
Ivelić Bradanović, Slavica ; Pečarić, Đilda ; Pečarić, Josip
English
New Lower and Upper Bounds for f-Divergences With Applications in Information Theory
Csiszár introduced the concept of the f-divergence functional as a generalized measure of information on the set of probability distributions. We establish new lower and upper bounds for the f-divergence functional using some basic convexity facts. As special cases and corollaries of our bounds, we obtain lower and upper bounds for some well-known entropies, including the Shannon entropy and the relative entropy, also known as the Kullback-Leibler divergence. As applications, we also use the Zipf-Mandelbrot law to introduce a new entropy and to derive some new related results.
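The objects named in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's method: it computes the Csiszár f-divergence D_f(P||Q) = Σ_i q_i f(p_i/q_i), recovers the Kullback-Leibler divergence via the generator f(t) = t log t, and builds a Zipf-Mandelbrot distribution. All function names are illustrative; the new bounds and the new entropy derived in the paper are not reproduced here.

```python
import math


def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i).

    p, q are finite probability distributions with q_i > 0.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))


def kl_divergence(p, q):
    """Kullback-Leibler divergence: the f-divergence with f(t) = t*log(t)."""
    return f_divergence(p, q, lambda t: t * math.log(t) if t > 0 else 0.0)


def zipf_mandelbrot(n, q, s):
    """Zipf-Mandelbrot law on {1, ..., n}: p_i proportional to (i + q)^(-s)."""
    weights = [(i + q) ** (-s) for i in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]
```

For example, comparing a Zipf-Mandelbrot distribution against the uniform distribution on the same support gives a strictly positive KL divergence, while any distribution compared with itself gives zero, consistent with the convexity of the generator.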
Sherman's inequality ; convex function ; Csiszár f-divergence ; entropy ; Zipf-Mandelbrot law ; Zipf law
Contribution data
8-8.
2018.
published
Parent publication data
Mathematical Inequalities and Applications 2018, Book of Abstracts.
Zagreb: Element
Conference data
Mathematical Inequalities and Applications, 2018.
lecture
04.07.2018-08.07.2018
Zagreb, Croatia