Please use this identifier to cite or link to this item:
http://hdl.handle.net/10603/380968
Title: | Generalized information and divergence measures with their applications |
Researcher: | Aakansha, Singhal |
Guide(s): | Sharma, Dilip Kumar |
Keywords: | Mathematics; Physical Sciences |
University: | Jaypee University of Engineering and Technology, Guna |
Completed Date: | 2022 |
Abstract: | The research work presented in the thesis aims at widening the concept of generalized entropies and divergence measures for random variables with an associated utility distribution along with the probability distribution. The thesis emphasises the concept of the quasi-linear mean and extends it to probability distributions with associated weights or utility distributions. The concept of the Weighted Quasi-linear Mean defined in the thesis has been explored to derive both existing and new generalized useful entropies. This concept can further be exploited to derive the generalized useful entropy corresponding to any generalized entropy that can be viewed as a quasi-linear mean of information. The thesis also shows how Supra-extensive entropy can be viewed as the quasi-linear mean of generalized information.
The research work also focuses on generalizing divergence measures for probability distributions with associated utilities. The new generalized divergence measures introduced are: useful Taneja divergence, useful Rényi divergence, useful Csiszár f-divergence, useful Rényi h-divergence, useful (h, ψ) Jeffery-Rényi divergence and useful (h, ψ) Jensen-Rényi divergence.
The research work emphasises the following applications based on generalized entropies and divergence measures:
- Rainfall data analysis using useful Rényi entropy and useful Tsallis entropy: the objective is to develop a methodology that could assist farmers in aligning their agricultural activities with the rainfall forecast, with the aim of maximizing their productivity.
- Keyword extraction using Rényi entropy: the research work extracts keywords using a word-ranking parameter based on Rényi entropy. Being a statistical method, the proposed methodology is both language and domain independent, and proved viable in extracting keywords from the given text.
- Keyword extraction using Tsallis relative entropy: the objective is to examine the performance of a keyword-extraction methodology based on Tsallis relative entropy. Reliable performance is achieved with an entropic index of q = 3. This method is also language and domain independent and could therefore assist in applications such as incremental clustering and building dynamic text collections.
- COVID-19 data analysis using Shannon's entropy: the research work analysed, using Shannon's entropy, the impact of lockdown on confirmed cases in the three countries with the highest number of COVID-19 cases: Brazil, India and the USA. |
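The entropies named in the abstract follow standard textbook definitions, which can be illustrated with a minimal Python sketch (illustrative only; the function names and the example distribution are not from the thesis):

```python
import math

def shannon(p):
    """Shannon entropy H(P) = -sum p_i * ln(p_i), skipping zero probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Rényi entropy H_alpha(P) = ln(sum p_i^alpha) / (1 - alpha), alpha != 1.
    Recovers Shannon entropy in the limit alpha -> 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy S_q(P) = (1 - sum p_i^q) / (q - 1), q != 1.
    q = 3 is the entropic index reported for keyword extraction in the abstract."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Example distribution (hypothetical, for demonstration)
p = [0.5, 0.25, 0.25]
print(shannon(p), renyi(p, 2), tsallis(p, 3))
```

Both generalized entropies depend on a single tunable index (alpha or q), which is what allows application-specific choices such as q = 3 in the keyword-extraction study.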
Pagination: | xviii; 145p. |
URI: | http://hdl.handle.net/10603/380968 |
Appears in Departments: | Department of Mathematics |
Items in Shodhganga are licensed under Creative Commons Licence Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).