IEVref: 171-07-15
ID:
Language: en
Status: Standard
Term: entropy
Synonym1: average information content [Preferred]
Synonym2: negentropy [Deprecated]
Synonym3:
Symbol: H(X)
Definition: mean value of the information content of the events in a finite set of mutually exclusive and jointly exhaustive events

$H = \sum_{i=1}^{n} p(x_i) \cdot I(x_i) = \sum_{i=1}^{n} p(x_i) \cdot \log\left(\frac{1}{p(x_i)}\right)$

where $X = \{x_1, \dots, x_n\}$ is the set of events $x_i$ $(i = 1, \dots, n)$, $I(x_i)$ are their information contents and $p(x_i)$ the probabilities of their occurrences, subject to $\sum_{i=1}^{n} p(x_i) = 1$

EXAMPLE Let $\{a, b, c\}$ be a set of three events and let $p(a) = 0{,}5$, $p(b) = 0{,}25$ and $p(c) = 0{,}25$ be the probabilities of their occurrences. The entropy of this set is $H(X) = p(a) \cdot I(a) + p(b) \cdot I(b) + p(c) \cdot I(c) = 1{,}5\ \mathrm{Sh}$.

Publication date: 2019-03-29
Source: IEC 80000-13:2008, 13-25, modified – Addition of information useful for the context of the IEV, and adaptation to the IEV rules
Replaces:
Internal notes:
CO remarks:
TC/SC remarks:
VT remarks:
Domain1:
Domain2:
Domain3:
Domain4:
Domain5:
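The worked example in the entry can be checked numerically. A minimal sketch in Python, assuming the binary logarithm (which yields the entropy in shannons, Sh); the function name `entropy_sh` is illustrative and not part of the IEV entry:

```python
import math

def entropy_sh(probs):
    """Entropy in shannons: H = sum of p(x_i) * log2(1 / p(x_i)).

    Assumes probs are the probabilities of mutually exclusive,
    jointly exhaustive events, so they must sum to 1.
    """
    return sum(p * math.log2(1.0 / p) for p in probs)

# Worked example from the entry: p(a) = 0,5; p(b) = p(c) = 0,25
print(entropy_sh([0.5, 0.25, 0.25]))  # → 1.5
```

With these probabilities each term is exact in binary floating point, so the result matches the entry's value of 1,5 Sh.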