Mirror of https://github.com/vegu-ai/talemate.git, synced 2025-12-16 03:37:51 +01:00
TabbyAPI client addition and presets refactoring (#126)

* feat: frequency_penalty (will make TabbyAPI custom wrapper)
* feat: add FREQUENCY_PENALTY_BASE and adjust the conversation template
* feat: use a `client_type` of `openai_compat` to send a FIXED preset, rather than keying off the client name
* feat: pass client_type into presets.configure(...)
* wip: base TabbyAPI client
* feat: add import to register the TabbyAPI client
* feat: adjust `presence_penalty` so it has a range of 0.1-0.5 (higher values will likely degrade performance)
* feat: add additional samplers/settings for TabbyAPI
* feat: keep presence_penalty in a range of 0.1-0.5
* feat: keep min_p in a range of 0.05-0.15
* update tabbyapi.py
* feat: add MIN_P_BASE and TEMP_LAST; change to the TabbyAPI client only for now
* fix: add /v1 as the default API route for TabbyAPI
* feat: implement CustomAPIClient to allow all TabbyAPI parameters
* fix: change to "temperature_last" instead of "temp_last"
* feat: convert presets to dictionary mappings to make them cleaner and more flexible
* fix: account for original substring/in statements and remove the TabbyAPI client call
* fix: move returning token values down, as it should realistically never be None, so substrings wouldn't be checked
* chore: remove automatic 'token' import added by the IDE

Co-authored-by: vegu-ai-tools <152010387+vegu-ai-tools@users.noreply.github.com>

* TabbyAPI client auto-sets the model name; use urljoin to prevent errors when the user adds a trailing slash
* expose presets to config and UX for editing
* some more help text
* tweak min, max and step size for some of the inference parameter sliders
* min_p step size to 0.01
* preset editor: allow reset to defaults
* fix preset reset
* don't persist inference_defaults to the config file
* only persist presets to config if they have been changed
* ensure defaults are loaded
* rename config to parameters for more clarity
* update default inference params
* textgenwebui support for min_p, frequency_penalty and presence_penalty
* overridable function to clean prompt params
* add `supported_parameters` class property to clients and revisit all of the clients to add any missing supported parameters
* UX tweaks
* supported_parameters moved to a property function
* top_p: decrease step size
* only show the audio stop button if audio is actually playing
* relock
* allow setting presence and frequency penalty to 0
* lower default frequency penalty
* frequency and presence penalty step size to 0.01
* set default model to gpt-4o

Co-authored-by: official-elinas <57051565+official-elinas@users.noreply.github.com>
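The trailing-slash fix mentioned above relies on `urllib.parse.urljoin` from the Python standard library. A minimal sketch of the idea (the `build_endpoint` helper and the endpoint path are illustrative, not the actual talemate code):

```python
from urllib.parse import urljoin


def build_endpoint(api_url: str, path: str = "/v1/completions") -> str:
    """Join a user-supplied base URL with an API path.

    Naive concatenation (api_url + path) would yield a double slash
    ("...//v1/completions") whenever the user enters a trailing slash;
    urljoin normalizes this regardless of how the base URL was typed.
    """
    return urljoin(api_url, path)


# Both spellings of the base URL resolve to the same endpoint.
print(build_endpoint("http://localhost:5000/"))   # http://localhost:5000/v1/completions
print(build_endpoint("http://localhost:5000"))    # http://localhost:5000/v1/completions
```

Because the path argument starts with `/`, `urljoin` replaces the base URL's path entirely, so a stray trailing slash (or none at all) produces identical results.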
# talemate_frontend

## Project setup

```
npm install
```

## Compiles and hot-reloads for development

```
npm run serve
```

## Compiles and minifies for production

```
npm run build
```

## Lints and fixes files

```
npm run lint
```