Determining variance bounds for unimodal distributions using the entropy power
Variance and entropy are two distinct measures commonly used to quantify the uncertainty of a random variable. Variance measures how far a random variable spreads around its expected value, whereas entropy measures uncertainty from an information-theoretic standpoint, i.e., the average amount of information carried by the random variable. For both the uniform and the normal distribution, the variance is a constant multiple of the entropy power. Establishing such a monotone relationship between variance and entropy for a wider class of distributions is important and useful in signal processing, machine learning, information theory, probability, and statistics; for example, it can be used to bound estimator error and to choose a search strategy that yields, on average, the greatest (or nearly the greatest) reduction in the entropy of the distribution of a target's location, an approach whose effectiveness has been tested in simulations with mining assay models. In this article, an upper bound on the variance of unimodal distributions whose tails are heavier than those of the exponential distribution is derived with the help of the entropy power.
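As a point of reference for the relationship the abstract invokes (these are the standard information-theoretic definitions, not notation taken from the article itself): for a continuous random variable $X$ with density $f$ and differential entropy
\[
h(X) = -\int f(x) \ln f(x)\, dx,
\]
the entropy power is defined as
\[
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}.
\]
For $X \sim \mathcal{N}(\mu, \sigma^2)$ one has $h(X) = \tfrac{1}{2}\ln(2\pi e \sigma^2)$, hence $N(X) = \sigma^2$: variance equals entropy power exactly. For $X$ uniform on an interval of length $a$, $h(X) = \ln a$ and $\operatorname{Var}(X) = a^2/12$, so
\[
\operatorname{Var}(X) = \frac{2\pi e}{12}\, N(X) \approx 1.42\, N(X),
\]
again a constant multiple of the entropy power, but with a different constant.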
-
Estimating Kendall's τ when both event times are subject to interval censoring
Fatemeh Ghanbari*
Journal of Statistical Modelling: Theory and Applications, Summer and Autumn 2023 -
Goodness-of-fit test for randomly censored data
Seyed Mahdi Amir Jahanshahi*, Mohammadhosein Dehqan, Yasser Hashemzehi
Andishe-ye Amari, -
Investigating the effectiveness of flipped learning instruction on the academic hope and academic vitality of male junior secondary school students in experimental sciences
Manije Saneitabass*, Khadije Saneitabass, Zahra Gavahi
Journal of Research on Issues of Education,