Add a flexible logging system to the llm-client package that can be
enabled/disabled without rebuilding:
- Add Logger class with configurable enable/disable and custom logger support (see the sketch after this list)
- Add LogLevel, LoggerFn, LoggingConfig types
- Add `debug` option to LLMStreamRequest for per-request logging override
- Add setLogging() method for runtime enable/disable
- Replace hardcoded console.log in openai-responses provider with logger
- Add ?debug=true query param to flowchart generate endpoint
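A minimal sketch of the logging surface described above; only the names (Logger, LogLevel, LoggerFn, LoggingConfig, setLogging) come from this change list, and the field shapes and fallback behavior are assumptions:
```ts
// Sketch only: field shapes and console fallback are assumptions.
export type LogLevel = 'debug' | 'info' | 'warn' | 'error';
export type LoggerFn = (level: LogLevel, message: string, data?: unknown) => void;

export interface LoggingConfig {
  enabled: boolean;
  logger?: LoggerFn; // falls back to console when omitted
}

export class Logger {
  constructor(private config: LoggingConfig = { enabled: false }) {}

  // Runtime enable/disable without rebuilding
  setLogging(config: Partial<LoggingConfig>): void {
    this.config = { ...this.config, ...config };
  }

  log(level: LogLevel, message: string, data?: unknown): void {
    if (!this.config.enabled) return;
    const fn: LoggerFn =
      this.config.logger ??
      ((lvl, msg, d) =>
        d === undefined ? console.log(`[${lvl}] ${msg}`) : console.log(`[${lvl}] ${msg}`, d));
    fn(level, message, data);
  }
}
```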
Usage (combined example after the list):
- Per-request: llm.stream({ ..., debug: true })
- Global: llm.setLogging({ enabled: true })
- Custom logger: new LLMClient({ logging: { enabled: true, logger: fn } })
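A hedged end-to-end example of how the three entry points might combine; the import path, request fields, and logger body are illustrative assumptions, only the `logging` option, `setLogging()`, and `debug` flag come from this change:
```ts
import { LLMClient } from 'llm-client'; // import path assumed from the package name

// Custom logger wired in at construction time
const llm = new LLMClient({
  logging: { enabled: true, logger: (level, message) => console.error(`[llm:${level}] ${message}`) },
});

// Flip logging globally at runtime, no rebuild needed
llm.setLogging({ enabled: false });

// Per-request override via the new `debug` option (other request fields are illustrative)
const stream = llm.stream({ model: 'gpt-4o', prompt: 'Generate a flowchart', debug: true });
```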
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>