
Explore the Dashboard

A tour of what Neatlogs shows you once traces are coming in.

Click through the sidebar to explore each section of the Neatlogs dashboard.

Tip: the demo is fully click-driven, so you can move between sections at your own pace.

What you're seeing

AI Search — Ask questions about your traces in plain English. Search across inputs, outputs, latency, token counts, and metadata without building filters. Supports follow-up questions to drill down without restating context.

Traces — Every LLM call, tool invocation, and retrieval step captured as a span tree. Click any row to see the exact prompt the model received, token breakdown, and a timeline showing what ran in parallel and where time was lost.

Detections — Anomalies Neatlogs surfaces automatically: token spikes, error rate jumps, latency regressions. Scoped per workflow so you only see what's relevant.

Evals — Aggregate quality metrics computed from the votes you leave on spans while debugging in Traces. No separate labelling step — production feedback feeds directly into scores.
