llama-cpp-turboquant/tools/server/webui/src/lib/services
Latest commit 21f24f27a9 by Aleksander Grygier (2025-12-06 13:19:05 +01:00):
webui: Per-conversation system message with UI displaying, edition & branching (#17275)
* feat: Per-conversation system message with optional display in UI, edition and branching (WIP)
* chore: update webui build output
chat.ts                | webui: Per-conversation system message with UI displaying, edition & branching (#17275) | 2025-12-06 13:19:05 +01:00
database.ts            | webui: Per-conversation system message with UI displaying, edition & branching (#17275) | 2025-12-06 13:19:05 +01:00
index.ts               | server: introduce API for serving / loading / unloading multiple models (#17470)       | 2025-12-01 19:41:04 +01:00
models.ts              | Use OpenAI-compatible /v1/models endpoint by default (#17689)                           | 2025-12-03 20:49:09 +01:00
parameter-sync.spec.ts | server: introduce API for serving / loading / unloading multiple models (#17470)       | 2025-12-01 19:41:04 +01:00
parameter-sync.ts      | server: introduce API for serving / loading / unloading multiple models (#17470)       | 2025-12-01 19:41:04 +01:00
props.ts               | server: introduce API for serving / loading / unloading multiple models (#17470)       | 2025-12-01 19:41:04 +01:00