
Variance calculation in KTGainVariationProcessor is not done correctly #159

@nsoblath

This only applies when a spectrum has been supplied without a variance data object.

The variance calculation should take into account that the mean changes as a function of frequency. Currently that is not done (it is only sort of done for real values: the mean for a given window is the running sum over that window).

For real values, what we need to do is calculate the gain variation mean first, then supply that to the variance calculation and have it subtracted off for each bin. Here, for instance, is the variance calculation from Katydid v2.9.2's KTVariableSpectrumDiscriminator::DiscriminateSpectrum(KTPowerSpectrum...) (lines 570ff):

            // Subtract the spline-interpolated mean from each bin, then
            // accumulate the squared deviations in sigma
            for (unsigned iBin=fMinBin; iBin<=fMaxBin; ++iBin)
            {
                diff = (*spectrum)(iBin) - (*splineImp)(iBin - fMinBin);
                sigma += diff * diff;
            }
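
For concreteness, here is a minimal sketch of the two-pass approach described above, with std::vector standing in for the Katydid spectrum and spline types (WindowedVariance, spectrum, mean, and windowSize are all hypothetical names, not Katydid API):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Sketch only (not Katydid code): compute the variance of each window of a
    // real spectrum about the frequency-dependent gain-variation mean, rather
    // than about a running sum accumulated within the window.
    std::vector<double> WindowedVariance(const std::vector<double>& spectrum,
                                         const std::vector<double>& mean,  // gain-variation mean per bin
                                         std::size_t windowSize)
    {
        std::vector<double> variances;
        for (std::size_t start = 0; start < spectrum.size(); start += windowSize)
        {
            std::size_t end = std::min(start + windowSize, spectrum.size());
            double sumSqDev = 0.;
            for (std::size_t iBin = start; iBin < end; ++iBin)
            {
                double diff = spectrum[iBin] - mean[iBin];  // subtract the mean bin-by-bin
                sumSqDev += diff * diff;
            }
            variances.push_back(sumSqDev / static_cast<double>(end - start));
        }
        return variances;
    }

Dividing by the number of bins in each window is one choice of normalization; N - 1 would give the unbiased estimate instead.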

For complex values, I'm not sure what the right procedure is. In Katydid v2.9.2's KTVariableSpectrumDiscriminator::DiscriminateSpectrum(KTFrequencySpectrumFFTW...) (lines 489ff) it uses the magnitudes of the values. However, one can define the variance of a complex random variable: Var[Z] = E[|Z|^2] - |E[Z]|^2. It's just that I'm not sure what to do when E[Z(f)] changes as a function of f. Here's the relevant code from v2.9.2, though, should it prove useful:

            // Cache the magnitude of each complex bin, then accumulate the
            // squared deviations of the magnitudes from the spline in sigma
            for (unsigned iBin=fMinBin; iBin<=fMaxBin; ++iBin)
            {
                fMagnitudeCache[iBin] = sqrt((*spectrum)(iBin)[0] * (*spectrum)(iBin)[0] + (*spectrum)(iBin)[1] * (*spectrum)(iBin)[1]);
                diff = fMagnitudeCache[iBin] - (*splineImp)(iBin - fMinBin);
                sigma += diff * diff;
            }
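
One possible answer for the frequency-dependent case is to subtract the complex mean bin-by-bin and average the squared magnitude of the residual, Var[Z(f)] = E[|Z(f) - mu(f)|^2], mirroring the real-valued fix. A minimal sketch under that assumption (again with hypothetical names, and std::complex standing in for the FFTW pair layout):

    #include <algorithm>
    #include <complex>
    #include <cstddef>
    #include <vector>

    // Sketch only (not Katydid code): per-window variance of a complex spectrum
    // about a frequency-dependent complex mean, using Var[Z] = E[|Z - mu|^2].
    // When mu = E[Z] is constant over the window this reduces to
    // E[|Z|^2] - |E[Z]|^2.
    std::vector<double> WindowedComplexVariance(const std::vector<std::complex<double>>& spectrum,
                                                const std::vector<std::complex<double>>& mean,
                                                std::size_t windowSize)
    {
        std::vector<double> variances;
        for (std::size_t start = 0; start < spectrum.size(); start += windowSize)
        {
            std::size_t end = std::min(start + windowSize, spectrum.size());
            double sumSqDev = 0.;
            for (std::size_t iBin = start; iBin < end; ++iBin)
            {
                // std::norm gives |z|^2, so this accumulates |Z(f) - mu(f)|^2
                sumSqDev += std::norm(spectrum[iBin] - mean[iBin]);
            }
            variances.push_back(sumSqDev / static_cast<double>(end - start));
        }
        return variances;
    }

This sidesteps the question of E[Z(f)] changing with f the same way the real-valued fix does: the mean is evaluated and removed per bin before anything is squared.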
