Complete reference for the spikedetect Python package -- spike detection and sorting for electrophysiology recordings.
If you're new, start with the Getting Started guide first.
- How the Pipeline Works
- Parameters Reference
- Using Individual Pipeline Stages
- Interactive GUIs
- File I/O
- Batch Processing Multiple Recordings
- Understanding the Output
- Tips for Getting Good Results
- API Quick Reference
spikedetect finds spikes in five steps. Each step is a separate class, and they are chained together by the SpikeDetector.detect() orchestrator.
```
Raw voltage
     |
     v
[1. SignalFilter] -- Butterworth bandpass + optional differentiation
     |
     v
[2. PeakFinder] -- Find candidate peaks above threshold
     |
     v
[3. TemplateMatcher] -- DTW distance + amplitude for each candidate
     |
     v
[4. Threshold] -- Keep candidates with low DTW distance + high amplitude
     |
     v
[5. InflectionPointDetector] -- Correct spike timing via 2nd derivative
     |
     v
Spike times (sample indices)
```
**1. SignalFilter.** Applies a 3rd-order Butterworth high-pass filter, then a 3rd-order low-pass filter, to isolate the spike frequency band. Optionally differentiates the signal (1st or 2nd derivative) and flips polarity.
Key detail: uses causal filtering (`scipy.signal.lfilter`), not zero-phase filtering (`filtfilt`). This matches the behavior of the original MATLAB `filter()` function. Causal filtering introduces a small phase delay, which is corrected later by the inflection-point timing correction.
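To see that phase delay in isolation, here is a small standalone scipy sketch (independent of spikedetect; the filter order and cutoff match the description above, the test tone is made up) comparing causal `lfilter` with zero-phase `filtfilt`:

```python
import numpy as np
from scipy import signal

fs = 50000
t = np.arange(0, 0.02, 1 / fs)
x = np.sin(2 * np.pi * 400 * t)  # 400 Hz tone, inside a 200 Hz high-pass band

b, a = signal.butter(3, 200 / (fs / 2), btype="high")
causal = signal.lfilter(b, a, x)       # causal, as in MATLAB filter()
zero_phase = signal.filtfilt(b, a, x)  # forward-backward: no phase delay

# The two outputs differ by a small phase shift, which is what the
# inflection-point timing correction later has to undo
offset = np.argmax(causal[500:700]) - np.argmax(zero_phase[500:700])
print(f"peak offset between causal and zero-phase output: {offset} samples")
```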
**2. PeakFinder.** Finds local maxima in the filtered signal using `scipy.signal.find_peaks`. Peaks must be:
- Above `mean(signal) + peak_threshold`
- At least `fs / 1800` samples apart (prevents detecting the same spike twice)
- Far enough from the recording edges to extract a full template window
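The three criteria can be sketched directly with `scipy.signal.find_peaks` (a standalone illustration on synthetic data, not spikedetect's internal code; the threshold and width values here are made up):

```python
import numpy as np
from scipy.signal import find_peaks

fs = 50000
peak_threshold = 1.0
stw = 251  # spike_template_width, in samples

rng = np.random.default_rng(0)
sig = rng.normal(0, 0.1, fs)        # 1 s of noise
sig[[10_000, 25_000, 40_000]] += 5  # three synthetic "spikes"

locs, _ = find_peaks(
    sig,
    height=sig.mean() + peak_threshold,  # criterion 1: above mean + threshold
    distance=fs / 1800,                  # criterion 2: minimum spacing
)
# criterion 3: keep peaks far enough from the edges for a full window
locs = locs[(locs >= stw) & (locs < len(sig) - stw)]
print(locs)  # -> [10000 25000 40000]
```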
**3. TemplateMatcher.** For each candidate peak:
- Extracts a waveform window centered on the peak
- Min-max normalizes the waveform to [0, 1]
- Computes the Dynamic Time Warping (DTW) distance against the spike template
- Computes the spike amplitude using a projection method based on the inflection point
The DTW uses a squared-Euclidean local cost, `(a[i] - b[j])^2`, not absolute difference. This matches the original MATLAB implementation and produces different results than standard DTW libraries.
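The cost can be made concrete with a minimal textbook DTW sketch using that squared-difference local cost (illustrative only -- spikedetect's `DTW.warping_distance` additionally returns the warped sequences):

```python
import numpy as np

def dtw_distance_sq(a, b):
    """Minimal DTW with squared-Euclidean local cost (a[i] - b[j])**2.

    A sketch of the cost described above, not spikedetect's implementation.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # accumulated-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2  # squared difference, not abs
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = np.array([0.0, 1.0, 2.0])
print(dtw_distance_sq(a, a))        # -> 0.0 (identical series)
print(dtw_distance_sq(a, a + 1.0))  # -> 2.0 (warping absorbs the overlap)
```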
**4. Threshold (SpikeClassifier).** Keeps candidates where:
- `DTW_distance < distance_threshold` (shape matches the template)
- `amplitude > amplitude_threshold` (large enough to be a spike)
**5. InflectionPointDetector.** Refines each spike's timing by finding the inflection point (the steepest rise) on the unfiltered waveform. This corrects for the phase delay introduced by causal filtering.
It uses the peak of the smoothed 2nd derivative, searching near the expected `likely_inflection_point_peak` location.
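As a standalone illustration of the idea (a synthetic sigmoid rising edge, not `InflectionPointDetector` itself, which searches near the expected location rather than globally), the timing landmark is just the argmax of a smoothed second derivative:

```python
import numpy as np

fs = 50000
t = np.arange(-0.002, 0.002, 1 / fs)
wave = 1.0 / (1.0 + np.exp(-5000 * t))  # synthetic rising edge of a spike

d2 = np.gradient(np.gradient(wave))                       # 2nd derivative
d2_smooth = np.convolve(d2, np.ones(5) / 5, mode="same")  # light smoothing
inflection = int(np.argmax(d2_smooth))                    # steepest upward acceleration

# The landmark lands on the concave-up part of the rise, before the midpoint
print(inflection, len(wave) // 2)
```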
All detection parameters are stored in SpikeDetectionParams. Here is every field with its meaning, range, and how to tune it.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `fs` | float | (required) | Sample rate in Hz. Must match your recording. |
| `hp_cutoff` | float | 200.0 | High-pass filter cutoff in Hz. Removes slow drift. Increase to isolate fast transients. Must be < fs/2. |
| `lp_cutoff` | float | 800.0 | Low-pass filter cutoff in Hz. Removes high-frequency noise. Decrease for cleaner but blunter spikes. Must be < fs/2. |
| `diff_order` | int | 1 | Differentiation after filtering. 0 = none, 1 = first derivative (recommended -- enhances spike peaks), 2 = second derivative (sharper but noisier). |
| `polarity` | int | 1 | Multiply the signal by this value. Use +1 for upward spikes, -1 for downward spikes. |
| Parameter | Type | Default | Description |
|---|---|---|---|
| `peak_threshold` | float | 5.0 | Peak detection sensitivity. Candidates must exceed mean(filtered) + peak_threshold. Lower = more candidates (more sensitive); higher = fewer candidates (more selective). |
| `spike_template` | ndarray or None | None | The reference spike waveform. Must be set before detection. Obtained from TemplateSelectionGUI or a previous run. 1-D numpy array. |
| `spike_template_width` | int | auto | Template half-width in samples. Auto-computed as round(0.005 * fs) + 1 if not set. Updated to match the template length during validation. |
| `distance_threshold` | float | 15.0 | Maximum DTW distance to accept a candidate. Lower = stricter (fewer false positives, may miss unusual spikes); higher = looser (catches more spikes, more false positives). |
| `amplitude_threshold` | float | 0.2 | Minimum spike amplitude (projection value). Lower = catches smaller spikes. Can be negative for certain signal polarities. |
| Parameter | Type | Default | Description |
|---|---|---|---|
| `likely_inflection_point_peak` | int or None | None | Expected inflection point index within the spike window. Auto-computed during detection. Rarely needs manual setting. |
| `last_filename` | str | "" | Name of the last file processed. Used for tracking. |
```python
import spikedetect as sd

# Auto-scaled defaults (recommended)
params = sd.SpikeDetectionParams.default(fs=50000)

# Manual creation with all fields
params = sd.SpikeDetectionParams(
    fs=50000,
    hp_cutoff=800,
    lp_cutoff=160,
    diff_order=1,
    polarity=-1,
    peak_threshold=7.6e-5,
    distance_threshold=9.6,
    amplitude_threshold=-0.08,
    spike_template=template_array,
)

# Validate (checks ranges, updates computed fields)
params = params.validate()
```

Each pipeline stage is a class with static methods. You can use them independently for custom workflows.
```python
from spikedetect import SignalFilter

filtered = SignalFilter.filter_data(
    voltage,         # 1-D numpy array
    fs=50000,        # sample rate
    hp_cutoff=800,   # high-pass cutoff Hz
    lp_cutoff=160,   # low-pass cutoff Hz
    diff_order=1,    # 0, 1, or 2
    polarity=-1,     # +1 or -1
)
```

```python
from spikedetect import PeakFinder

locs = PeakFinder.find_spike_locations(
    filtered,                  # filtered signal
    peak_threshold=7.6e-5,     # minimum peak height
    fs=50000,                  # sample rate
    spike_template_width=251,  # template width in samples
)
# locs is a 1-D int64 array of sample indices
```

```python
from spikedetect import DTW

distance, warped_a, warped_b = DTW.warping_distance(waveform_a, waveform_b)
# distance: float (accumulated squared-Euclidean cost)
# warped_a, warped_b: aligned versions of the inputs
```

```python
from spikedetect import TemplateMatcher

result = TemplateMatcher.match(
    spike_locs=locs,           # from PeakFinder
    spike_template=template,   # 1-D template waveform
    filtered_data=filtered,    # filtered signal
    unfiltered_data=voltage,   # raw voltage
    spike_template_width=251,  # template width
    fs=50000,                  # sample rate
)
# result.dtw_distances: DTW distance per candidate
# result.amplitudes: amplitude per candidate
# result.spike_locs: locations (may be shorter than input due to edge removal)
```

```python
from spikedetect import SpikeClassifier

good, weird, weirdbad, bad = SpikeClassifier.classify(
    distances=result.dtw_distances,
    amplitudes=result.amplitudes,
    distance_threshold=9.6,
    amplitude_threshold=-0.08,
)
# Each is a boolean mask over candidates
```

```python
from spikedetect import WaveformProcessor

# Smooth and differentiate a waveform
smoothed = WaveformProcessor.smooth_and_differentiate(waveform, stw=251, fs=50000)

# Just differentiate
deriv = WaveformProcessor.differentiate(waveform)
```

Four interactive tools for visual parameter tuning. All use Matplotlib and work in both standalone Python and Jupyter notebooks. The GUIs chain together in a specific order -- each one produces output needed by the next.
```
Raw voltage + default params
     |
     v
[FilterGUI] -----> updated params (hp/lp cutoffs, diff order, polarity)
     |
     v
Filter the data (SignalFilter.filter_data)
     |
     v
[TemplateSelectionGUI] -----> spike_template (1-D waveform)
     |
     v
Run detect_spikes() -----> initial SpikeDetectionResult
     |
Also run PeakFinder + TemplateMatcher manually to get match_result
     |
     v
[ThresholdGUI] -----> updated distance_threshold, amplitude_threshold
     |
     v
Re-run detect_spikes() with updated thresholds
     |
     v
[SpotCheckGUI] -----> final reviewed SpikeDetectionResult
```
For complete runnable examples, see:
- `examples/gui_workflow.py` -- full workflow with synthetic data (no files needed)
- `examples/gui_workflow_from_file.py` -- same workflow loading from a .mat or .abf file
**FilterGUI.** Sliders for the HP cutoff, LP cutoff, and peak threshold; radio buttons for diff order; a button to toggle polarity. Shows the filtered signal updating in real time.
Controls:
- HP slider (0.5--1000 Hz): High-pass cutoff. Move right to remove more low-frequency drift.
- LP slider (0.11--1000 Hz): Low-pass cutoff. Move left to remove high-frequency noise.
- Peak thresh slider (log10 scale): Controls candidate detection sensitivity.
- Diff radio buttons (0/1/2): Differentiation order. 1 is usually best.
- Pol button: Click to flip signal polarity (+1 or -1).
- Enter: Accept settings and close.
What to look for: Red dots on peaks should align with your spikes. Adjust until most spikes have a dot and noise does not.
```python
from spikedetect.gui import FilterGUI

filter_gui = FilterGUI(voltage, params)
params = filter_gui.run()  # blocks until you press Enter or close
# params is updated with your chosen filter settings
```

Next step: filter the data with the chosen parameters before using TemplateSelectionGUI:

```python
from spikedetect.pipeline.filtering import SignalFilter

start_point = round(0.01 * params.fs)
unfiltered_data = rec.voltage[start_point:]
filtered_data = SignalFilter.filter_data(
    unfiltered_data, fs=params.fs,
    hp_cutoff=params.hp_cutoff, lp_cutoff=params.lp_cutoff,
    diff_order=params.diff_order, polarity=params.polarity,
)
```

**TemplateSelectionGUI.** Displays the filtered signal with detected peaks marked as red dots. Click on peaks to select seed spikes -- they turn green when selected. The bottom panel shows overlaid waveforms of your selections. When you press Enter, the GUI averages your selections (with cross-correlation alignment) to build the spike template.
Controls:
- Click on red dot: Select that spike as a seed (turns green).
- Shift+click: Select additional spikes.
- Enter: Build template from selections and close.
Tips:
- Select 3--5 clear, well-isolated spikes for the best template.
- Avoid spikes that overlap with other spikes or artifacts.
- The template should represent a "typical" spike, not the largest.
```python
from spikedetect.gui import TemplateSelectionGUI

template_gui = TemplateSelectionGUI(filtered_data, params)
template = template_gui.run()
params.spike_template = template
```

**ThresholdGUI.** Scatter plot of DTW distance vs. amplitude for all candidates. Click to move the threshold lines. Color-coded waveform panels show what's accepted (blue) vs. rejected (red/yellow). The bottom panel shows the mean accepted spike waveform.
Controls:
- Click on scatter plot: Move the active threshold line to the click position.
- b: Toggle between adjusting distance threshold (vertical line) and amplitude threshold (horizontal line). The active threshold is cyan; the inactive one is red.
- Click on mean waveform panel: Update the spike template to the mean of currently accepted waveforms.
- Enter: Accept thresholds and close.
What to look for: Good spikes cluster at low distance and high amplitude (lower-left). Noise clusters at high distance (upper-right). Position the threshold lines to separate these clusters.
Important: ThresholdGUI requires a TemplateMatcher result, not just a detection result. You need to run the pipeline stages manually:
```python
from spikedetect.pipeline.peaks import PeakFinder
from spikedetect.pipeline.template import TemplateMatcher
from spikedetect.gui import ThresholdGUI

spike_locs = PeakFinder.find_spike_locations(
    filtered_data, peak_threshold=params.peak_threshold,
    fs=params.fs, spike_template_width=params.spike_template_width,
)
match_result = TemplateMatcher.match(
    spike_locs=spike_locs,
    spike_template=params.spike_template,
    filtered_data=filtered_data,
    unfiltered_data=unfiltered_data,
    spike_template_width=params.spike_template_width,
    fs=params.fs,
)
threshold_gui = ThresholdGUI(match_result, params)
params = threshold_gui.run()
```

**SpotCheckGUI.** Step through each detected spike in a detailed multi-panel view: the unfiltered trace with spike markers, the filtered trace, the spike context window with mean waveform and 2nd-derivative overlay, and the filtered context with template overlay.
Keyboard controls:
- y: Accept this spike and advance to next
- n: Reject this spike (remove from results) and advance
- Right arrow: Shift spike position right by 10 samples
- Shift+Right: Shift spike position right by 1 sample
- Left arrow: Shift spike position left by 10 samples
- Shift+Left: Shift spike position left by 1 sample
- Tab: Skip to next spike without decision
- Shift+Tab: Go back to previous spike
- Enter: Finish review and close
```python
from spikedetect.gui import SpotCheckGUI

spotcheck = SpotCheckGUI(recording, result)
result = spotcheck.run()
# result.spot_checked is now True
# result.spike_times reflects accepted/adjusted spikes
```

- ThresholdGUI needs `match_result`, not `result`: the ThresholdGUI scatter plot shows DTW distances and amplitudes from `TemplateMatcher.match()`. You cannot pass a `SpikeDetectionResult` directly. See the code example above.
- TemplateSelectionGUI needs filtered data: pass `filtered_data` (the output of `SignalFilter.filter_data`), not raw voltage. The GUI uses the filtered signal to find and display candidate peaks.
- Forgot to filter before template selection: after using FilterGUI to choose parameters, you must explicitly call `SignalFilter.filter_data()` to produce `filtered_data` for TemplateSelectionGUI.
- Template is None: if you close TemplateSelectionGUI without clicking any peaks, `template_gui.run()` returns `None`. Always check the return value.
For Jupyter, add this at the top of your notebook:
```
%matplotlib widget
```

This requires ipympl: `pip install ipympl`
Supports MATLAB v5, v7, and v7.3 (HDF5) formats. Automatically detects the format.
```python
import spikedetect as sd

# Load a trial .mat file
rec = sd.load_recording("trial.mat")
# rec.name, rec.voltage, rec.sample_rate, rec.current (if available)
# rec.result (SpikeDetectionResult if the file had previous detection results)

# Save results back
from spikedetect.io.mat import save_result
rec.result = result
save_result("trial_updated.mat", rec)
```

Requires pyabf: `pip install pyabf`

```python
rec = sd.load_abf("recording.abf")
```

Compact format with gzip compression. Good for long-term storage.
```python
from spikedetect.io.native import save_native, load_native

save_native("output.h5", recording)
rec = load_native("output.h5")
```

Save and load detection parameters as JSON files in `~/.spikedetect/`:

```python
from spikedetect.io.config import save_params, load_params

save_params(params, "my_experiment")
params = load_params("my_experiment")
```

```python
import logging
from pathlib import Path

import numpy as np
import spikedetect as sd

logging.basicConfig(level=logging.INFO)

# Load template from a previous run
params = sd.SpikeDetectionParams.default(fs=50000)
params.spike_template = np.load("my_template.npy")

# Process all .mat files in a directory
data_dir = Path("data/experiment_001")
results = {}
for mat_file in sorted(data_dir.glob("*.mat")):
    rec = sd.load_recording(mat_file)
    result = sd.detect_spikes(rec, params)
    results[rec.name] = result
    print(f"{mat_file.name}: {result.n_spikes} spikes")

# Summary
total = sum(r.n_spikes for r in results.values())
print(f"\nTotal: {total} spikes across {len(results)} recordings")
```

```python
result = sd.detect_spikes(rec, params)

result.spike_times              # int64 array -- 0-based sample indices
result.spike_times_uncorrected  # before inflection point correction
result.spike_times_seconds      # float array -- times in seconds
result.n_spikes                 # number of spikes found
result.params                   # the params used for this detection
result.spot_checked             # True if manually reviewed
```

```python
print(result.summary())
# Spike Detection Result
# Spikes found: 296
# Time range: 0.162 - 7.882 s
# Mean ISI: 26.2 ms (range 5.3 - 112.4 ms)
# Mean firing rate: 38.2 Hz
# Spot-checked: no
```

```python
df = result.to_dataframe()
# Columns: spike_index, spike_time_s, spike_index_uncorrected
df.to_csv("spikes.csv", index=False)
```

```python
# Plot voltage trace with spike markers
import matplotlib.pyplot as plt

rec.result = result
fig = rec.plot(show_spikes=True)
plt.show()
```

- Start with `SpikeDetectionParams.default(fs=your_rate)` -- the defaults work for most recordings
- If your spikes are in a specific frequency band, adjust `hp_cutoff` and `lp_cutoff` accordingly
- `diff_order=1` (first derivative) usually gives the cleanest peaks
- If your spikes are downward-going, set `polarity=-1`
- Select 3--5 clear, well-isolated spikes in the `TemplateSelectionGUI`
- The template should represent a "typical" spike, not the largest or smallest
- Use the `ThresholdGUI` scatter plot to see the distribution
- Noise/artifacts cluster in the upper-right
- Set thresholds to separate these clusters
- Always spot-check your final results for publication-quality data
- SpotCheck lets you manually accept/reject individual spikes
- Especially important for borderline cases near the threshold boundaries
- If you have previous MATLAB results, load the .mat file and compare spike counts
- The Python pipeline produces identical results (verified: 296/296 match on test data)
```python
import spikedetect as sd

sd.detect_spikes(recording, params)  # Run full pipeline
sd.load_recording("file.mat")        # Load .mat file
sd.load_abf("file.abf")              # Load ABF file

rec.result = res
sd.save_result("file.mat", rec)      # Save results to .mat
```

| Class | Module | Purpose |
|---|---|---|
| `Recording` | `spikedetect.models` | Holds voltage data + metadata |
| `SpikeDetectionParams` | `spikedetect.models` | All detection parameters |
| `SpikeDetectionResult` | `spikedetect.models` | Detection output (spike times) |
| `SpikeDetector` | `spikedetect.pipeline.detect` | Full pipeline orchestrator |
| `SignalFilter` | `spikedetect.pipeline.filtering` | Butterworth bandpass filter |
| `PeakFinder` | `spikedetect.pipeline.peaks` | Peak detection |
| `TemplateMatcher` | `spikedetect.pipeline.template` | DTW template matching |
| `DTW` | `spikedetect.pipeline.dtw` | Dynamic Time Warping distance |
| `InflectionPointDetector` | `spikedetect.pipeline.inflection` | Spike timing correction |
| `SpikeClassifier` | `spikedetect.pipeline.classify` | Quality classification |
| `WaveformProcessor` | `spikedetect.utils` | Smoothing/differentiation |
| `FilterGUI` | `spikedetect.gui` | Filter parameter tuning |
| `TemplateSelectionGUI` | `spikedetect.gui` | Spike template selection |
| `ThresholdGUI` | `spikedetect.gui` | Threshold adjustment |
| `SpotCheckGUI` | `spikedetect.gui` | Manual spike review |