Eduncle posted an MCQ
October 18, 2019, 15:07
  • UGC NET
  • Library and Information Science

Entropy in the context of Information Science is:

(a) A measure which cannot be normalised by dividing it by information length.
(b) A measure of unpredictability of the state or, equivalently, of its average information content.
(c) The average amount of information produced by a probabilistic stochastic source of data.
(d) A way to measure diversity.

Code:

Eduncle Best Answer

Information entropy is defined as the average amount of information produced by a stochastic source of data. The measure of information associated with each possible data value is the negative logarithm of its probability under the probability mass function. Thus, when a low-probability event occurs, it carries more "information" ("surprisal") than when a high-probability event occurs. Entropy is therefore a measure of the unpredictability of the state or, equivalently, of its average information content.

For an intuitive sense of these terms, consider a political poll. Such polls are conducted precisely because the outcome is not already known: the more uncertain the outcome, the more information the result conveys, and the higher the entropy of the distribution over possible outcomes.

Entropy is also one of several ways to measure diversity. Specifically, Shannon entropy is the logarithm of ¹D, the true diversity index with parameter equal to 1. Hence statements (b), (c) and (d) all correctly describe entropy, while (a) does not, since entropy can in fact be normalised by dividing it by the information length.
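As a rough illustration of the definition above (a sketch added here, not part of the original answer), the Shannon entropy of a discrete distribution can be computed directly from its probability mass function; the function name and example distributions are purely illustrative.

```python
import math

def shannon_entropy(probabilities, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.

    `probabilities` is a sequence of probabilities that should sum to 1;
    zero-probability outcomes contribute nothing to the sum.
    """
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The examples mirror the "unpredictability" reading of entropy: the more evenly probability is spread across outcomes, the higher the average surprisal of each observation.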
