Conversation
Chat with llama-server models in real time
Idle
No request queued.
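
The real-time chat behaviour described above can be sketched against llama-server's OpenAI-compatible `/v1/chat/completions` endpoint with `stream: true`. This is a minimal sketch, not the page's actual implementation; the base URL, default port, and the `onToken` callback are assumptions for illustration.

```typescript
// Minimal sketch: stream a reply from llama-server and feed tokens to the UI.
// BASE_URL and onToken are assumptions, not part of this page.
const BASE_URL = "http://localhost:8080"; // assumed default llama-server port

async function streamChat(
  messages: { role: string; content: string }[],
  onToken: (text: string) => void,
): Promise<void> {
  const res = await fetch(`${BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`llama-server error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Streamed responses arrive as "data: {...}" server-sent-event lines.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      const payload = line.replace(/^data: /, "").trim();
      if (!payload || payload === "[DONE]") continue;
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) onToken(delta); // append to the Conversation panel
    }
  }
}
```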

Thought Monitor
Live reasoning stream

Thought tokens will appear here when supported by the model.
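
One way a thought monitor like this could route reasoning tokens separately from the visible answer is by watching for `<think>…</think>` markers in the streamed text. The tag names are a common convention for reasoning models, not something this page guarantees; some servers expose a separate reasoning field in the delta instead. A sketch under that assumption:

```typescript
// Sketch: split a token stream into "thought" and "answer" channels by
// watching for <think>...</think> markers (an assumed convention that varies
// by model and server configuration).
type Channel = "thought" | "answer";

function createThoughtSplitter(emit: (channel: Channel, text: string) => void) {
  let inThought = false;
  let pending = "";
  return (chunk: string): void => {
    pending += chunk;
    for (;;) {
      const marker = inThought ? "</think>" : "<think>";
      const idx = pending.indexOf(marker);
      if (idx === -1) {
        // Hold back a tail that might be a partially received marker.
        const keep = Math.min(marker.length - 1, pending.length);
        const safe = pending.slice(0, pending.length - keep);
        if (safe) emit(inThought ? "thought" : "answer", safe);
        pending = pending.slice(pending.length - keep);
        return;
      }
      const before = pending.slice(0, idx);
      if (before) emit(inThought ? "thought" : "answer", before);
      pending = pending.slice(idx + marker.length);
      inThought = !inThought; // entering or leaving the reasoning section
    }
  };
}
```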

Session Settings
Stored only for this browser session and cleared when the tab closes.
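
The "stored only for this browser session" behaviour matches the standard `sessionStorage` API, which is scoped to the tab and cleared when it closes. The storage key and settings shape below are illustrative assumptions, not the page's actual schema.

```typescript
// Sketch: persist panel settings with sessionStorage, which is per-tab and
// cleared when the tab closes. Key and shape are hypothetical.
const SETTINGS_KEY = "llama-chat.session-settings";

interface SessionSettings {
  temperature: number;
  systemPrompt: string;
}

function saveSettings(settings: SessionSettings): void {
  sessionStorage.setItem(SETTINGS_KEY, JSON.stringify(settings));
}

function loadSettings(): SessionSettings | null {
  const raw = sessionStorage.getItem(SETTINGS_KEY);
  return raw ? (JSON.parse(raw) as SessionSettings) : null;
}
```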

Usage Analytics
Token estimates per response
No completions yet.
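
Per-response token estimates could come from the `usage` object that OpenAI-compatible responses may include, with a rough characters-per-token fallback when it is absent. Both the reliance on that field and the ~4-characters-per-token heuristic are assumptions, not exact counts.

```typescript
// Sketch: prefer the usage object when the server returns one, otherwise fall
// back to a crude ~4 characters-per-token estimate for the analytics panel.
interface Usage {
  prompt_tokens?: number;
  completion_tokens?: number;
}

function estimateCompletionTokens(responseText: string, usage?: Usage): number {
  if (usage?.completion_tokens !== undefined) return usage.completion_tokens;
  return Math.ceil(responseText.length / 4); // rough estimate, not a real tokenizer
}
```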