Wednesday, August 5, 2015

Signal Normalization

Normalization is basically bringing two signals to the same range, or to a predefined range; the point is to remove the effect of an arbitrary scale factor. So we may rescale so that the amplitude swings between +1 and -1, so that the power is unity, or so that the phase is zero. A typical example of a predefined range is the statistical view of normalization: transform the signal so that its mean is 0 and its standard deviation is 1. After such a transform the signal is in a canonical form. Bringing all signals to this canonical form makes comparisons easier and more robust, and it also serves other needs such as visualization and analysis.
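Here is a minimal sketch of the three rescalings mentioned above, using NumPy. The function names (peak_normalize, power_normalize, zscore) are just labels for this example, not a standard API.

```python
import numpy as np

def peak_normalize(x):
    """Rescale so the amplitude swings within [-1, +1]."""
    return x / np.max(np.abs(x))

def power_normalize(x):
    """Rescale so the average power (mean squared value) is 1."""
    return x / np.sqrt(np.mean(x ** 2))

def zscore(x):
    """Canonical statistical form: zero mean, unit standard deviation."""
    return (x - np.mean(x)) / np.std(x)
```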


When comparing two different signals that mean two different things (but are equally important, carrying information of equal significance), would you really want to compare them as-is when one has values in the ballpark of ±1 and the other has samples whose magnitudes reach 28 or so?
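A small, hypothetical illustration of that mismatch: two sinusoids with the same shape but very different amplitudes look far apart under a scale-sensitive measure such as Euclidean distance, and essentially identical once both are brought to the canonical zero-mean, unit-standard-deviation form. The signals and numbers below are made up purely for demonstration.

```python
import numpy as np

t = np.linspace(0, 1, 500)
a = np.sin(2 * np.pi * 5 * t)          # values in the ballpark of +/-1
b = 28 * np.sin(2 * np.pi * 5 * t)     # same shape, magnitudes up to 28

print(np.linalg.norm(a - b))           # large: dominated by the scale factor

za = (a - a.mean()) / a.std()          # canonical form of a
zb = (b - b.mean()) / b.std()          # canonical form of b
print(np.linalg.norm(za - zb))         # ~0: the underlying shapes match
```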


Normalization means you're not comparing an elephant to a bug, at least not with regard to mass, unless you first make the bug look as big as the elephant; then you can start comparing what makes the bug different from the elephant.
