
IEVref: | 171-07-27 | ID: | |

Language: | en | Status: Standard | |

Term: | mean transinformation content | ||

Synonym1: | average transinformation content [Preferred] | ||

Synonym2: | |||

Synonym3: | |||

Symbol: | T(X,Y) | ||

Definition: | mean value of the transinformation content *T*(*x*_{i}, *y*_{j}) of two events *x*_{i} and *y*_{j}, each in one of two finite sets of mutually exclusive and jointly exhaustive events: | ||

Publication date: | 2019-03-29 | ||

Source: | IEC 80000-13:2008, 13-36, modified – Addition of information useful for the context of the IEV, and adaptation to the IEV rules | ||

Replaces: | |||

Internal notes: | |||

CO remarks: | |||

TC/SC remarks: | |||

VT remarks: | |||

Domain1: | |||

Domain2: | |||

Domain3: | |||

Domain4: | |||

Domain5: |

$T\left(X,Y\right)={\displaystyle \sum _{i=1}^{n}{\displaystyle \sum _{j=1}^{m}p({x}_{i},{y}_{j})\cdot T({x}_{i},{y}_{j})}}$

where

*X* = {*x*_{1}, …, *x*_{n}} is the set of events *x*_{i} (*i* = 1, …, *n*), *Y* = {*y*_{1}, …, *y*_{m}} is the set of events *y*_{j} (*j* = 1, …, *m*), and *p*(*x*_{i}, *y*_{j}) is the joint probability that both events occur
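As an illustrative sketch (not part of the IEV entry), the double sum above can be evaluated directly for a small joint probability table; the function names below are hypothetical:

```python
import math

def transinformation_content(p_xy, p_x, p_y):
    """T(x_i, y_j) = lb( p(x_i, y_j) / (p(x_i) * p(y_j)) ), in shannons (base 2)."""
    return math.log2(p_xy / (p_x * p_y))

def mean_transinformation(joint):
    """T(X, Y): double sum of p(x_i, y_j) * T(x_i, y_j) over a joint probability matrix."""
    p_x = [sum(row) for row in joint]        # marginal probabilities p(x_i)
    p_y = [sum(col) for col in zip(*joint)]  # marginal probabilities p(y_j)
    total = 0.0
    for i, row in enumerate(joint):
        for j, p_xy in enumerate(row):
            if p_xy > 0.0:                   # terms with p(x_i, y_j) = 0 contribute nothing
                total += p_xy * transinformation_content(p_xy, p_x[i], p_y[j])
    return total

# Noiseless binary channel with equiprobable inputs:
# T(X, Y) equals the source entropy H(X) = 1 Sh
joint = [[0.5, 0.0],
         [0.0, 0.5]]
print(mean_transinformation(joint))  # → 1.0
```

For statistically independent *X* and *Y* every term vanishes, so the mean transinformation content is 0, as expected for a channel that transmits no information.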

Note 1 to entry: The mean transinformation content is symmetric in *X* and *Y*. It is also equal to the difference between the entropy of one of the two sets of events and the conditional entropy of this set relative to the other: $T\left(X,Y\right)=H(X)-H(X|Y)=H(Y)-H(Y|X)$.

Note 2 to entry: The mean transinformation content is a quantitative measure of information transmitted through a channel, when *X* is a specific set of messages at the message source and *Y* is a specific set of messages at the message sink. It is equal to the difference between the entropy at the message source and the equivocation, or the difference between the entropy at the message sink and the irrelevance.
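The entropy identities in the notes can be checked numerically; the following sketch (not part of the IEV entry, helper names hypothetical) verifies them for a binary symmetric channel with uniform input and crossover probability 0.1:

```python
import math

def entropy(p):
    """H in shannons (base 2) for a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0.0)

# Joint probabilities p(x_i, y_j): binary symmetric channel,
# uniform input, crossover probability 0.1
joint = [[0.45, 0.05],
         [0.05, 0.45]]

p_x = [sum(row) for row in joint]        # source marginals
p_y = [sum(col) for col in zip(*joint)]  # sink marginals

# T(X, Y) by the defining double sum
t_xy = sum(p * math.log2(p / (p_x[i] * p_y[j]))
           for i, row in enumerate(joint)
           for j, p in enumerate(row) if p > 0.0)

# Conditional entropies via H(X|Y) = H(X,Y) - H(Y):
# H(X|Y) is the equivocation, H(Y|X) the irrelevance
h_joint = entropy([p for row in joint for p in row])
h_x_given_y = h_joint - entropy(p_y)
h_y_given_x = h_joint - entropy(p_x)

# Note 1: T(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
assert abs(t_xy - (entropy(p_x) - h_x_given_y)) < 1e-12
assert abs(t_xy - (entropy(p_y) - h_y_given_x)) < 1e-12
```

Both differences agree with the double sum, illustrating Note 2: source entropy minus equivocation and sink entropy minus irrelevance give the same transmitted information.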