Negative peaks dominate latency calculations #20

@jsiegle

Description

I would like to use the zetatest function to calculate the onset latency of a neuron's response to an optogenetic stimulus. However, the latency values being returned (e.g. dRate[dblLatencyPeak], dRate[dblLatencyPeakOnset]) do not make sense. I eventually tracked this down to the fact that the negative peaks in the z-scored firing rate profile are often more prominent than the positive ones. This can be seen in the example below, where the original and inverted z-scored time series are plotted along with the peaks found by scipy.signal.find_peaks. The vertical blue line marks the most prominent peak, which is being detected as the first peak in the inverted (negative) z-scored time series. Note that I restricted the temporal range to between 0.002 and 0.015 s, which is where we expect the peak to occur.

[Figure: original and inverted z-scored time series with peaks detected by scipy.signal.find_peaks; the vertical blue line marks the most prominent peak, which falls on the inverted trace]

I was able to correct this by ignoring the negative peaks in the getPeak function. After that, the returned latency values were more sensible.
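For illustration, here is a minimal sketch of the workaround: run peak detection only on the original (non-inverted) z-scored trace within the expected response window, and take the most prominent positive peak. The helper name and the synthetic trace below are hypothetical, not the package's actual getPeak implementation.

```python
import numpy as np
from scipy.signal import find_peaks

def get_positive_peak_latency(vecT, vecZ, t_min=0.002, t_max=0.015):
    """Return the time of the most prominent positive peak in a z-scored
    rate trace, ignoring troughs (hypothetical helper; the package's
    getPeak also searches the inverted trace)."""
    # restrict to the window where the response is expected
    mask = (vecT >= t_min) & (vecT <= t_max)
    t, z = vecT[mask], vecZ[mask]
    # prominence=0.0 asks find_peaks to report each peak's prominence
    peaks, props = find_peaks(z, prominence=0.0)
    if peaks.size == 0:
        return np.nan
    best = peaks[np.argmax(props["prominences"])]
    return t[best]

# synthetic example: a positive bump at ~6 ms next to a deeper trough at ~12 ms
t = np.linspace(0, 0.02, 201)
z = (3.0 * np.exp(-((t - 0.006) / 0.001) ** 2)
     - 5.0 * np.exp(-((t - 0.012) / 0.002) ** 2))
print(get_positive_peak_latency(t, z))  # ~0.006 s, despite the larger trough
```

Because the inverted trace is never searched, the 5-sigma trough cannot outcompete the 3-sigma response peak, which is the behavior wanted when the firing rate is known to increase.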

Is my fix reasonable for cases where we know the firing rate should increase? Is there a way to apply the same constraints using the existing package?
