Definition: a process in which a continuous range of values that a quantity may assume is divided into a number of predetermined adjacent intervals, and in which any value within a given interval is represented by a single predetermined value within the interval
NOTE – Associated terms are "to quantize" and "quantizer".
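As an illustration of the definition, the sketch below implements a uniform (mid-rise) quantizer in Python: the real line is divided into adjacent intervals of equal width, and every value falling within a given interval is represented by that interval's midpoint. The function name and the parameter step are illustrative, not part of any standard.

import math

def quantize(x: float, step: float) -> float:
    # Uniform mid-rise quantizer: every value in the interval
    # [n*step, (n+1)*step) is represented by its midpoint (n + 0.5)*step.
    return (math.floor(x / step) + 0.5) * step

# Example: with step = 0.25, all values in [0.5, 0.75) map to 0.625.
print(quantize(0.60, 0.25))  # 0.625
print(quantize(0.74, 0.25))  # 0.625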