

Eduncle Best Answer
Information entropy is defined as the average amount of information produced by a stochastic source of data. The measure of information associated with each possible data value is the negative logarithm of that value's probability mass function. Thus, when the data source produces a low-probability value (i.e., when a low-probability event occurs), the event carries more "information" ("surprisal") than when the source produces a high-probability value. Entropy is therefore a measure of the unpredictability of the state, or equivalently, of its average information content.

To get an intuitive understanding of these terms, consider the example of a political poll. Such polls are conducted because the outcome is not already known; in other words, the outcome is relatively unpredictable, so actually performing the poll and learning the result conveys new information.

Entropy is also one of several ways to measure diversity. Specifically, Shannon entropy is the logarithm of ¹D, the true diversity index with parameter equal to 1.
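As a minimal symbolic sketch of the definition above (assuming a discrete random variable X with probability mass function p(x), and taking logarithms base 2 so that entropy is measured in bits), the surprisal of an outcome and the Shannon entropy are:

\[
I(x) = -\log_2 p(x), \qquad H(X) = \mathbb{E}\left[I(X)\right] = -\sum_{x} p(x)\,\log_2 p(x)
\]

For example, a fair coin toss has H = -(1/2 log₂ 1/2 + 1/2 log₂ 1/2) = 1 bit, while a heavily biased coin has lower entropy, since its outcome is more predictable. Using the natural logarithm instead measures entropy in nats rather than bits.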