# Data Flow

*Tracing a single EDF file from disk to interactive visualization*
This page walks through the complete journey of SEEG data through Montage Concord, from a raw EDF file all the way to the browser dashboard. Each step corresponds to a specific package and function call.
## The Complete Pipeline
**1. Read.** `read_bids_ieeg()` parses the EDF file; BIDS `.tsv` and `.json` sidecar files live alongside it:

- `_channels.tsv` → status/annotations per channel
- `_electrodes.tsv` → MNI x/y/z coordinates
- `_events.tsv` → seizure onset times
- `participants.tsv` → subject demographics

All of this is packed into a `Recording` object.
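The exact fields of `Recording` are defined in the io package; as a rough sketch of its shape (the field names and `duration` property here are illustrative, not the package's actual API):

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class Recording:
    data: np.ndarray     # (n_channels, n_samples), float64
    fs: float            # sampling rate in Hz
    channels: list[str]  # channel names, e.g. "SEEG1"
    channel_metadata: dict = field(default_factory=dict)  # status, MNI x/y/z per channel
    events: list = field(default_factory=list)            # [{"onset": ..., "label": ...}, ...]

    @property
    def duration(self) -> float:
        """Recording length in seconds."""
        return self.data.shape[1] / self.fs
```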
**2. Compute metrics.** Metric classes (e.g. `WelchPSD`, `LineLength`) take a `Recording` and produce a `MetricResult`. The metrics-utils package provides shared primitives: `segment()` slices the signal into overlapping windows, and `bandpass()` / `notch()` filter the signal. The result's `data` array can be 1D (per-channel scalar), 2D (channels × frequencies), etc.
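The windowing primitive can be pictured like this (a minimal sketch; the real `segment()` in metrics-utils may differ in signature and edge handling):

```python
import numpy as np


def segment(data: np.ndarray, fs: float,
            win_s: float = 1.0, overlap: float = 0.5) -> np.ndarray:
    """Slice (n_channels, n_samples) into overlapping windows.

    Returns an array of shape (n_channels, n_windows, win_len);
    a trailing partial window is dropped.
    """
    win = int(win_s * fs)
    step = max(1, int(win * (1 - overlap)))
    starts = range(0, data.shape[1] - win + 1, step)
    return np.stack([data[:, s:s + win] for s in starts], axis=1)
```

A windowed metric then reduces over the last axis, e.g. a per-window line length is the mean absolute first difference within each window.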
**3. Serve.** The server holds `AppState` in memory: the current `Recording` plus user preferences (montage, notch mode). When the browser requests `GET /api/timeseries`, the route calls the viz function and returns the dict as a JSON HTTP response. No files are written.
**4. Render.** The browser draws Plotly charts into `<div>` elements. Multiple panels can be shown simultaneously (time series, spectrogram, PSD, metric heatmaps, 3D brain). Zoom/pan events trigger new API requests for the visible time window.
## What Happens When You Click "Load"
Concretely, here is the sequence when you type a file path and press Load in the browser:
```
# 1. Browser sends HTTP POST with JSON body
POST /api/load
{"path": "/data/sub-HUP117/ieeg/sub-HUP117_run-01_ieeg.edf"}

# 2. Server reads the file
recording = read_bids_ieeg(path)   # → Recording
set_recording(recording, path)     # stores in AppState

# 3. Server responds with metadata
{
  "channels": ["SEEG1", "SEEG2", ...],
  "n_channels": 72,
  "fs": 1024.0,
  "duration": 300.0,
  "montage": "monopolar",
  "channel_metadata": {"SEEG1": {"status": "good", "x": 12.3, ...}},
  "events": [{"onset": 45.2, "duration": 60.0, "label": "seizure"}]
}

# 4. Browser rebuilds sidebar, then requests time series
GET /api/timeseries?t_start=0&t_end=10&channels=SEEG1,SEEG2,...

# 5. Server calls viz
result = get_timeseries(recording, t_start=0, t_end=10, channels=[...])

# 6. Responds with JSON, browser renders Plotly chart
```
## State Management: raw vs. processed Recording
The server keeps two Recording objects in state:
| Field | Contents | When updated |
|---|---|---|
| raw_recording | Original monopolar data from EDF, never touched after load | Only on POST /api/load |
| recording | Current view: montage + notch applied | On load, montage change, and notch change |
When the user switches from monopolar to bipolar, the server calls _rebuild_recording():
it takes raw_recording, applies the new montage, applies the current notch setting,
and stores the result as recording. All API routes then serve from this new recording.
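A hedged sketch of that rebuild logic (function signature and the adjacent-pair bipolar rule are illustrative; the real `_rebuild_recording()` works on `Recording` objects and pairs contacts per electrode shaft):

```python
import numpy as np


def rebuild(raw_data: np.ndarray, raw_names: list, montage: str, notch_fn=None):
    """Illustrative rebuild: re-derive the served recording from the raw one.

    raw_data: (n_channels, n_samples) monopolar signal; never modified in place.
    """
    if montage == "bipolar":
        # Difference of adjacent contacts: SEEG1-SEEG2, SEEG2-SEEG3, ...
        data = raw_data[:-1] - raw_data[1:]
        names = [f"{a}-{b}" for a, b in zip(raw_names[:-1], raw_names[1:])]
    else:
        data, names = raw_data.copy(), list(raw_names)
    if notch_fn is not None:
        # Reapply the user's current notch setting to the re-montaged data.
        data = notch_fn(data)
    return data, names
```

Rebuilding from `raw_recording` each time (rather than mutating `recording`) is what makes montage and notch changes reversible without re-reading the EDF.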
## Data Shapes Along the Pipeline
| Stage | Shape / type | Example (72 ch, 5 min at 1024 Hz) |
|---|---|---|
| Recording.data | (n_channels, n_samples) float64 | (72, 307200) |
| After bipolar | (n_pairs, n_samples) float64 | (66, 307200) |
| WelchPSD output | (n_channels, n_freqs) float64 | (72, 513) |
| LineLength output | (n_channels, n_windows) float64 | (72, 300) @ 1s windows |
| BandPower output | (n_channels, n_bands) float64 | (72, 6) |
| get_timeseries dict | lists (JSON-serializable) | 72 channels × ≤2000 points |
| Plotly trace | {x: [], y: [], ...} | One trace per channel |
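The ≤2000-point figure implies the server thins each channel before JSON serialization. One plausible sketch is plain stride subsampling (the actual `get_timeseries` may instead use min/max decimation to preserve spike peaks):

```python
import math

import numpy as np


def thin_for_plot(seg: np.ndarray, max_points: int = 2000) -> np.ndarray:
    """Stride-subsample (n_channels, n_samples) so each channel
    ships at most max_points samples to the browser."""
    step = max(1, math.ceil(seg.shape[1] / max_points))
    return seg[:, ::step]
```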
## The Notch Filter Chain
Powerline interference (mains hum and its harmonics) is pervasive in iEEG. The server exposes three notch modes:
| Mode | What it does | Best for |
|---|---|---|
| none | No filtering | Inspecting raw signal quality |
| notch50 | Zero-phase IIR notch at 50 Hz + 100 Hz | European recordings |
| notch60 | Zero-phase IIR notch at 60 Hz + 120 Hz | North American recordings |
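The 50/60 Hz chains can be sketched with SciPy (a minimal version; the quality factor `Q=30` is an assumption, and the package's actual filter design may differ):

```python
import numpy as np
from scipy.signal import filtfilt, iirnotch


def apply_notch(data: np.ndarray, fs: float, base_hz: float) -> np.ndarray:
    """Zero-phase IIR notch at base_hz and its first harmonic
    (e.g. 50 + 100 Hz for the notch50 mode)."""
    out = data
    for f0 in (base_hz, 2.0 * base_hz):
        b, a = iirnotch(f0, Q=30.0, fs=fs)
        out = filtfilt(b, a, out, axis=-1)  # forward-backward pass -> zero phase
    return out
```

`filtfilt` runs the filter forward and then backward, which cancels the phase distortion a single IIR pass would introduce; this matters because phase shifts would misalign transients across channels.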