Neural data API · beta

Neural data infrastructure.

The API layer for every neural device — EEG, EMG, fNIRS, ECoG, and beyond. Ingest raw signal, quality-gate every window, stream production features. Beta is EEG-first.

Trusted by researchers at HMS · MGH · CMU
raw_eeg_dump.csv · 256 Hz · MUSE_2 · 5 rows · 7 data quality errors
timestamp    TP9      AF7     AF8      TP10
00:00.000    820.5    830.1   2847.3   -199.4
00:00.004    NaN      831.4   814.7    825.2
00:00.008    819.8    828.9   815.2    NaN
00:00.012    2941.2   830.5   -189.3   825.9
00:00.016    820.1    NaN     814.9    824.6
3 NaN values · 2 spike artifacts · missing SQI · missing artifact labels · missing band decomp

The moment

The hardware arrived.
The infrastructure didn't.

Consumer EEG devices are proliferating — MUSE, OpenBCI, Emotiv, and dozens of others are putting neural sensing in the hands of developers and researchers. The hardware is getting cheaper, more accurate, and easier to use.

But the software stack hasn't kept up. Every team rebuilds the same pipelines — device drivers, signal ingestion, quality gating, artifact detection, feature extraction, real-time streaming. Undifferentiated work that takes 4–6 months and distracts from building actual products.

The BCI market is at an inflection point — moving from research labs to consumer products. This is the moment to build the infrastructure layer before thousands more teams waste months rebuilding it.

01 / Ingest

Any device, one endpoint.

POST raw signal batches from any neural device. Variable sample rates, arbitrary channel layouts, batch-level idempotency.

curl -X POST "$API_BASE/v1/sessions/$SESSION_ID/ingest" \
  -H "Authorization: Bearer $API_KEY" \
  -H "X-Project-Id: $PROJECT_ID" \
  -H "Content-Type: application/json" \
  -d '{"batch_id":"batch_001",
       "t0_unix_ms":1730000000000,
       "data":{"TP9":[820.5,821.2,...],"AF7":[...]}}'
02 / Quality

Know when to trust your data.

Per-channel SQI (0–1), plus blink, motion, and EMG artifact detection. Bad channels are flagged — never silently corrupting your model.

TP9    0.91
AF7    0.85
AF8    0.62
TP10   0.41
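
How you might gate on these scores — a hedged sketch, using the per-channel window shape from section 05; the 0.7 threshold is illustrative, not prescribed:

def usable_channels(window, min_sqi=0.7):
    """Keep only channels whose SQI clears the threshold; with the
    scores above, TP9 and AF7 pass while AF8 and TP10 are dropped."""
    return {
        ch: vals
        for ch, vals in window["bandpower"].items()
        if vals["sqi"] >= min_sqi
    }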

03 / Features

Five bands, per-channel confidence.

Delta, theta, alpha, beta, and gamma bandpower, plus spectral entropy and cross-band ratios — all with confidence scores.

alpha   72%
theta   42%
beta    31%
delta   60%
gamma   10%
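
A sketch of putting the confidence scores to work — compute a cross-band ratio only when both bands are trustworthy. The dict layout and the 0.4 floor are our illustrative assumptions, not a fixed API shape:

def alpha_theta_ratio(power, confidence, min_conf=0.4):
    """Return alpha/theta bandpower, or None when either band's
    confidence is below the floor or theta power is zero."""
    if confidence["alpha"] < min_conf or confidence["theta"] < min_conf:
        return None
    return power["alpha"] / power["theta"] if power["theta"] else None

With the values shown, alpha (72%) and theta (42%) both clear the floor, so the ratio is returned; a band at gamma's 10% would not.
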
04 / Stream

Sub-100ms, no polling.

Dedicated WebSocket per session. Feature windows are pushed as they're computed — typically within 12 ms of the last ingest batch.

const ws = new WebSocket(session.ws_url);

ws.onmessage = ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.type === "features") {
    const alpha = msg.data.features.bandpower.alpha;
    const sqi   = msg.data.quality.overall;  // 0–1
  }
};
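
The same consumer from Python — a sketch assuming the third-party websockets package and the message shape above:

import asyncio
import json
import websockets  # pip install websockets

async def consume(ws_url):
    """Print alpha bandpower and overall SQI for each feature window."""
    async with websockets.connect(ws_url) as ws:
        async for raw in ws:
            msg = json.loads(raw)
            if msg["type"] == "features":
                alpha = msg["data"]["features"]["bandpower"]["alpha"]
                sqi = msg["data"]["quality"]["overall"]  # 0–1
                print(f"alpha={alpha:.1f}  sqi={sqi:.2f}")

if __name__ == "__main__":
    asyncio.run(consume("wss://..."))  # ws_url returned at session creation
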
05 / Transform

Raw signal in. Structured data out.

Every device has its own SDK, format, and quirks. Voxel normalizes all of it — one API call returns the same shape regardless of hardware.

Without Voxel

MUSE 2
import muselsl
stream = muselsl.stream("your_muse_mac")
# Returns: raw LSL stream
# Format:  device-specific
# Quality: none
# Latency: you figure it out

OpenBCI
board = BoardShim(BoardIds.CYTON_BOARD, params)
board.start_stream()
# Returns: numpy array, 24-bit
# Format:  board-specific CSV
# Quality: none — artifacts yours

Emotiv
headset = CortexClient(client_id, secret)
await headset.subscribe(["eeg"])
# Returns: vendor JSON with proprietary
#          channel names and scaling
# Quality: none
With Voxel — any device
GET /v1/sessions/{id}/features/latest → 200 OK · 11 ms
{
  "session_id":  "ses_4f2a9b8c",
  "window_id":   "w_00342",
  "device":      "MUSE_2",
  "t_ms":        2000,
  "bandpower": {
    "TP9":  { "alpha": 15.7, "sqi": 0.91 },
    "AF7":  { "alpha": 11.2, "sqi": 0.85 },
    "AF8":  { "alpha": 13.8, "sqi": 0.79 },
    "TP10": { "alpha": 18.3, "sqi": 0.62 }
  },
  "artifacts": { "blink": false, "emg": false },
  "latency_ms": 11
}

Same response schema whether your device is a MUSE, OpenBCI, Emotiv, or custom hardware. SQI, bandpower, and artifact flags included on every window.
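
Because the schema is fixed, downstream code stays device-agnostic — a minimal sketch (the helper name and the 0.5 gate are ours):

def mean_alpha(window, min_sqi=0.5):
    """SQI-gated average alpha bandpower across channels; the same
    code path for MUSE, OpenBCI, Emotiv, or custom hardware."""
    alphas = [
        ch["alpha"]
        for ch in window["bandpower"].values()
        if ch["sqi"] >= min_sqi
    ]
    return sum(alphas) / len(alphas) if alphas else None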

< 47 ms   Avg latency
5         Freq bands
256 Hz    Max sample rate
99.9%     Uptime SLA

Start building on neural data.

Free tier up to 100 sessions / month. EEG support is live now — EMG, fNIRS, and additional devices are on the roadmap.

quickstart.py