
AI Search

Search across your agent traces using plain English.

AI Search queries your traces in natural language. Instead of building filters manually, describe what you're looking for and Neatlogs finds it, searching span inputs, outputs, token counts, latency, attributes, and metadata across all your runs.
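To make the contrast concrete, here is a minimal sketch of the kind of structured filter you would otherwise assemble by hand, next to the one-sentence query that replaces it. The field names (`span.latency_ms`, `span.token_count`, `span.output`) and the filter shape are illustrative assumptions, not Neatlogs' actual schema or API.

```python
# Hypothetical manual filter — every field name here is an assumption
# for illustration, not Neatlogs' real trace schema.
manual_filter = {
    "span.latency_ms": {"gt": 8000},        # latency above 8 seconds
    "span.token_count": {"gt": 10_000},     # more than 10,000 tokens
    "span.output": {"contains": "refund"},  # response mentioned a refund
}

# With AI Search, the same intent is a single sentence:
query = (
    "Find runs over 8 seconds with more than 10,000 tokens "
    "where the response mentioned a refund"
)

def matches(span: dict, flt: dict) -> bool:
    """Evaluate a hand-built filter against one span record."""
    for field, cond in flt.items():
        value = span.get(field)
        if "gt" in cond and not (
            isinstance(value, (int, float)) and value > cond["gt"]
        ):
            return False
        if "contains" in cond and cond["contains"] not in str(value or ""):
            return False
    return True

span = {
    "span.latency_ms": 9200,
    "span.token_count": 12_000,
    "span.output": "We have processed your refund.",
}
print(matches(span, manual_filter))  # True for this span
```

The point of the sketch is the asymmetry: the manual version requires knowing field names, units, and operators up front, while the natural-language version only requires describing the outcome.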

It comes in two modes:

  • Fast — optimized for speed. Good for straightforward queries where you know roughly what you're looking for.
  • Pro — deeper reasoning over your traces. Better for complex or ambiguous queries where the answer requires understanding context across multiple spans.

What you can search for

AI Search understands your traces, not just their metadata. You can query by content, behavior, cost, latency, or any combination:

"Show me traces where the agent failed to call a tool"
"Find runs where the response mentioned a refund"
"Which traces had more than 5 LLM calls?"
"Show me sessions from last week where the output was empty"
"Find traces where token count exceeded 10,000"
"Which runs had a latency above 8 seconds?"
"Show me traces where the retriever returned no documents"
"Find any run where the guardrail blocked the response"

Follow-up questions

AI Search is stateful within a session. Once you have a result set, ask a follow-up to drill into it without restating the context.

"Of those, which ones had the highest cost?"
"Filter to only the ones from the last 24 hours"
"Which of these had a tool call that returned an error?"
"Show me the one with the longest LLM span"

This makes AI Search useful for investigations, not just one-off lookups. Start broad, narrow down through follow-ups, and land on exactly the traces you need.
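The drill-down pattern above can be sketched as a session object that holds the current result set and narrows it with each follow-up. The trace fields and the `SearchSession` class are illustrative assumptions for the sketch, not Neatlogs' actual API.

```python
# Hedged sketch of stateful drill-down: each follow-up refines the
# previous result set instead of starting a new search.
# Field names and the session object are assumptions, not a real API.

traces = [
    {"id": "t1", "cost_usd": 0.12, "age_hours": 3,  "tool_error": False},
    {"id": "t2", "cost_usd": 0.48, "age_hours": 30, "tool_error": True},
    {"id": "t3", "cost_usd": 0.91, "age_hours": 10, "tool_error": True},
]

class SearchSession:
    """Holds the current result set so follow-ups narrow it in place."""

    def __init__(self, results):
        self.results = results

    def refine(self, predicate):
        self.results = [t for t in self.results if predicate(t)]
        return self

session = SearchSession(traces)                 # broad initial query
session.refine(lambda t: t["age_hours"] <= 24)  # "only the last 24 hours"
session.refine(lambda t: t["tool_error"])       # "tool call returned an error"
print([t["id"] for t in session.results])       # ['t3']
```

Because each `refine` call operates on the prior results, the follow-ups compose: start broad, then narrow, exactly as in the example questions above.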