llama-cpp-turboquant/tools/server/webui/src/lib/services
Latest commit 1faa13a118 by Pascal (#16489):

* webui: updated the chat service to only include max_tokens in the request payload when the setting is explicitly provided, while still mapping explicit zero or null values to the infinite-token sentinel

* chore: update webui build output
2025-10-09 22:54:57 +02:00
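The commit message above describes the payload-building behavior: `max_tokens` is omitted from the request unless the setting was explicitly provided, and an explicit zero or null is mapped to the infinite-token sentinel. A minimal sketch of that logic is below; the function and type names are hypothetical (not the actual `chat.ts` API), and the sentinel value `-1` is an assumption based on llama.cpp's convention for unlimited prediction.

```typescript
// Hypothetical sketch of the max_tokens payload logic described in the
// commit message. Names and the -1 sentinel are assumptions, not the
// actual chat.ts implementation.
const INFINITE_TOKENS_SENTINEL = -1; // assumed "no limit" sentinel

interface ChatMessage {
  role: string;
  content: string;
}

interface ChatPayload {
  messages: ChatMessage[];
  max_tokens?: number;
}

function buildChatPayload(
  messages: ChatMessage[],
  maxTokens?: number | null
): ChatPayload {
  const payload: ChatPayload = { messages };
  // Only include max_tokens when the setting is explicitly provided
  // (i.e. not undefined); map explicit 0 or null to the sentinel.
  if (maxTokens !== undefined) {
    payload.max_tokens =
      maxTokens === null || maxTokens === 0
        ? INFINITE_TOKENS_SENTINEL
        : maxTokens;
  }
  return payload;
}
```

With this shape, a request built without the setting carries no `max_tokens` key at all, so the server falls back to its own default rather than receiving an accidental limit.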
chat.ts    | webui: updated the chat service to only include max_tokens in the req… (#16489)      | 2025-10-09 22:54:57 +02:00
context.ts | SvelteKit-based WebUI (#14839)                                                       | 2025-09-17 19:29:13 +02:00
index.ts   | SvelteKit-based WebUI (#14839)                                                       | 2025-09-17 19:29:13 +02:00
slots.ts   | webui: switch to hash-based routing (alternative of #16079) (#16157)                 | 2025-09-26 18:36:48 +03:00