Compare commits
16 Commits
| SHA1 |
|---|
| 95a17197ba |
| d09f9f8ac4 |
| de16feeed5 |
| cdcc804ffa |
| 9a2bbd78a4 |
| ddfbd6891b |
| 143dd47e02 |
| cc7cb773d1 |
| 02c88f75a1 |
| 419371e0fb |
| 6e847bf283 |
| ceedd3019f |
| a28cf2a029 |
| 60cb271e30 |
| 1874234d2c |
| ef99539e69 |
.github/workflows/ci.yml (new file, 30 lines, vendored)
@@ -0,0 +1,30 @@
name: ci
on:
  push:
    branches:
      - master
      - main
      - prep-0.26.0
permissions:
  contents: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure Git Credentials
        run: |
          git config user.name github-actions[bot]
          git config user.email 41898282+github-actions[bot]@users.noreply.github.com
      - uses: actions/setup-python@v5
        with:
          python-version: 3.x
      - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
      - uses: actions/cache@v4
        with:
          key: mkdocs-material-${{ env.cache_id }}
          path: .cache
          restore-keys: |
            mkdocs-material-
      - run: pip install mkdocs-material mkdocs-awesome-pages-plugin mkdocs-glightbox
      - run: mkdocs gh-deploy --force
.gitignore (1 line changed, vendored)
@@ -9,6 +9,7 @@ talemate_env
chroma
config.yaml
templates/llm-prompt/user/*.jinja2
templates/world-state/*.yaml
scenes/
!scenes/infinity-quest-dynamic-scenario/
!scenes/infinity-quest-dynamic-scenario/assets/
@@ -1,13 +1,19 @@
# Use an official node runtime as a parent image
FROM node:20

# Make sure we are in a development environment (this isn't a production ready Dockerfile)
ENV NODE_ENV=development

# Echo that this isn't a production ready Dockerfile
RUN echo "This Dockerfile is not production ready. It is intended for development purposes only."

# Set the working directory in the container
WORKDIR /app

# Copy the frontend directory contents into the container at /app
COPY ./talemate_frontend /app

# Install any needed packages specified in package.json
# Install all dependencies
RUN npm install

# Make port 8080 available to the world outside this container
README.md (228 lines changed)
@@ -16,8 +16,10 @@ Supported APIs:
- [Google Gemini](https://console.cloud.google.com/)

Supported self-hosted APIs:
- [KoboldCpp](https://koboldai.org/cpp) ([Local](https://koboldai.org/cpp), [Runpod](https://koboldai.org/runpodcpp), [VastAI](https://koboldai.org/vastcpp), also includes image gen support)
- [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui) (local or with runpod support)
- [LMStudio](https://lmstudio.ai/)
- [TabbyAPI](https://github.com/theroyallab/tabbyAPI/)

Generic OpenAI api implementations (tested and confirmed working):
- [DeepInfra](https://deepinfra.com/)
@@ -26,230 +28,16 @@ Generic OpenAI api implementations (tested and confirmed working):

## Core Features

- Multiple AI agents for dialogue, narration, summarization, direction, editing, world state management, character/scenario creation, text-to-speech, and visual generation
- Support for multiple AI clients and APIs
- Long-term memory using ChromaDB and passage of time tracking
- Multiple agents for dialogue, narration, summarization, direction, editing, world state management, character/scenario creation, text-to-speech, and visual generation
- Supports per agent API selection
- Long-term memory and passage of time tracking
- Narrative world state management to reinforce character and world truths
- Creative tools for managing NPCs, AI-assisted character, and scenario creation with template support
- Context management for character details, world information, past events, and pinned information
- Integration with Runpod
- Customizable templates for all prompts using Jinja2
- Modern, responsive UI

# Instructions

## Documentation

Please read the documents in the `docs` folder for more advanced configuration and usage.

- [Quickstart](#quickstart)
- [Installation](#installation)
- [Windows](#windows)
- [Linux](#linux)
- [Docker](#docker)
- [Connecting to an LLM](#connecting-to-an-llm)
- [OpenAI / mistral.ai / Anthropic](#openai--mistralai--anthropic)
- [Text-generation-webui / LMStudio](#text-generation-webui--lmstudio)
- [Specifying the correct prompt template](#specifying-the-correct-prompt-template)
- [Recommended Models](#recommended-models)
- [DeepInfra via OpenAI Compatible client](#deepinfra-via-openai-compatible-client)
- [Google Gemini](#google-gemini)
- [Google Cloud Setup](#google-cloud-setup)
- [Ready to go](#ready-to-go)
- [Load the introductory scenario "Infinity Quest"](#load-the-introductory-scenario-infinity-quest)
- [Loading character cards](#loading-character-cards)
- [Text-to-Speech (TTS)](docs/tts.md)
- [Visual Generation](docs/visual.md)
- [ChromaDB (long term memory) configuration](docs/chromadb.md)
- [Runpod Integration](docs/runpod.md)
- [Prompt template overrides](docs/templates.md)

# Quickstart

## Installation

Post [here](https://github.com/vegu-ai/talemate/issues/17) if you run into problems during installation.

There is also a [troubleshooting guide](docs/troubleshoot.md) that might help.

### Windows

1. Download and install Python 3.10 or Python 3.11 from the [official Python website](https://www.python.org/downloads/windows/). :warning: python3.12 is currently not supported.
1. Download and install Node.js v20 from the [official Node.js website](https://nodejs.org/en/download/). This will also install npm. :warning: v21 is currently not supported.
1. Download the Talemate project to your local machine. Download from [the Releases page](https://github.com/vegu-ai/talemate/releases).
1. Unpack the download and run `install.bat` by double-clicking it. This will set up the project on your local machine.
1. Once the installation is complete, you can start the backend and frontend servers by running `start.bat`.
1. Navigate your browser to http://localhost:8080

### Linux

`python 3.10` or `python 3.11` is required. :warning: `python 3.12` is not supported yet.

`nodejs v19` or `v20` is required. :warning: `v21` is not supported yet.

1. `git clone https://github.com/vegu-ai/talemate.git`
1. `cd talemate`
1. `source install.sh`
1. Start the backend: `python src/talemate/server/run.py runserver --host 0.0.0.0 --port 5050`.
1. Open a new terminal, navigate to the `talemate_frontend` directory, and start the frontend server by running `npm run serve`.

### Docker

1. `git clone https://github.com/vegu-ai/talemate.git`
1. `cd talemate`
1. `docker-compose up`
1. Navigate your browser to http://localhost:8080

:warning: When connecting to local APIs running on the host machine (e.g. text-generation-webui), you need to use `host.docker.internal` as the hostname.

#### To shut down the Docker container

Just closing the terminal window will not stop the Docker container. You need to run `docker-compose down` to stop the container.

#### How to install Docker

1. Download and install Docker Desktop from the [official Docker website](https://www.docker.com/products/docker-desktop).

# Connecting to an LLM

On the right-hand side, click the "Add Client" button. If there is no button, you may need to toggle the client options by clicking this button:



## OpenAI / mistral.ai / Anthropic

The setup is the same for all three; the example below is for OpenAI.

If you want to add an OpenAI client, just change the client type and select the appropriate model.



If you are setting this up for the first time, you should now see the client, but it will have a red dot next to it, stating that it requires an API key.



Click the `SET API KEY` button. This will open a modal where you can enter your API key.



Click `Save` and after a moment the client should have a green dot next to it, indicating that it is ready to go.



## Text-generation-webui / LMStudio

> :warning: As of version 0.13.0 the legacy text-generation-webui API (`--extension api`) is no longer supported; please use their new `--extension openai` API implementation instead.

In the modal, if you're planning to connect to text-generation-webui, you can likely leave everything as is and just click Save.



### Specifying the correct prompt template

For good results it is **vital** that the correct prompt template is specified for whichever model you have loaded.

Talemate does come with a set of pre-defined templates for some popular models, but going forward, due to the sheer number of models released every day, understanding and specifying the correct prompt template is something you should familiarize yourself with.

If the text-gen-webui client shows a yellow triangle next to it, it means that the prompt template is not set and it is currently using the default `VICUNA` style prompt template.



Click the two cogwheels to the right of the triangle to open the client settings.



You can first try clicking the `DETERMINE VIA HUGGINGFACE` button; depending on the model's README file, it may be able to determine the correct prompt template for you (the README needs to contain an example of the template).

If that doesn't work, you can manually select the prompt template from the dropdown.

In the case of `bartowski_Nous-Hermes-2-Mistral-7B-DPO-exl2_8_0` that is `ChatML` - select it from the dropdown and click `Save`.



### Recommended Models

As of 2024.03.07 my personal regular drivers (the ones I test with) are:

- Kunoichi-7B
- sparsetral-16x7B
- Nous-Hermes-2-Mistral-7B-DPO
- brucethemoose_Yi-34B-200K-RPMerge
- dolphin-2.7-mixtral-8x7b
- rAIfle_Verdict-8x7B
- Mixtral-8x7B-instruct

That said, any of the top models in any of the size classes here should work well (I wouldn't recommend going lower than 7B):

https://www.reddit.com/r/LocalLLaMA/comments/18yp9u4/llm_comparisontest_api_edition_gpt4_vs_gemini_vs/

## DeepInfra via OpenAI Compatible client

You can use the OpenAI compatible client to connect to [DeepInfra](https://deepinfra.com/).



```
API URL: https://api.deepinfra.com/v1/openai
```

Models on DeepInfra that work well with Talemate:

- [mistralai/Mixtral-8x7B-Instruct-v0.1](https://deepinfra.com/mistralai/Mixtral-8x7B-Instruct-v0.1) (max context 32k, 8k recommended)
- [cognitivecomputations/dolphin-2.6-mixtral-8x7b](https://deepinfra.com/cognitivecomputations/dolphin-2.6-mixtral-8x7b) (max context 32k, 8k recommended)
- [lizpreciatior/lzlv_70b_fp16_hf](https://deepinfra.com/lizpreciatior/lzlv_70b_fp16_hf) (max context 4k)
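
Independent of Talemate, you can sanity-check your DeepInfra key and the OpenAI-compatible endpoint with the official `openai` Python package. This is just an illustrative sketch; it assumes the package is installed and that your key is exported as `DEEPINFRA_API_KEY`:

```python
import os

from openai import OpenAI

# Point the standard OpenAI client at DeepInfra's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.deepinfra.com/v1/openai",
    api_key=os.environ["DEEPINFRA_API_KEY"],
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
    max_tokens=32,
)
print(response.choices[0].message.content)
```

If this prints a completion, the same URL and key should also work in Talemate's OpenAI Compatible client.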

## Google Gemini

### Google Cloud Setup

Unlike the other clients, the setup for Google Gemini is a bit more involved, as you will need to set up a Google Cloud project and credentials for it.

Please follow their [instructions for setup](https://cloud.google.com/vertex-ai/docs/start/client-libraries) - which includes setting up a project, enabling the Vertex AI API, creating a service account, and downloading the credentials.

Once you have downloaded the credentials, copy the JSON file into the talemate directory. You can rename it to something that's easier to remember, like `my-credentials.json`.
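
If you want to verify the downloaded service-account file before pointing Talemate at it, a quick check with the `google-auth` package works. This is an optional, illustrative snippet; the filename is just the example name from above:

```python
from google.oauth2 import service_account

# Load the downloaded service-account JSON; this raises if the file is malformed.
credentials = service_account.Credentials.from_service_account_file(
    "my-credentials.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
print("Service account:", credentials.service_account_email)
print("Project ID:", credentials.project_id)
```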

### Add the client



The `Disable Safety Settings` option will turn off Google's response validation for what they consider harmful content. Use at your own risk.

### Complete the Google Cloud setup in Talemate



Click the `SETUP GOOGLE API CREDENTIALS` button that will appear on the client.

The Google Cloud setup modal will appear; fill in the path to the credentials file and select a location that is close to you.



Click `Save` and after a moment the client should have a green dot next to it, indicating that it is ready to go.



## Ready to go

You will know you are good to go when the client and all the agents have a green dot next to them.



## Load the introductory scenario "Infinity Quest"

Generated using talemate creative tools, mostly used for testing / demoing.

You can load it (and any other talemate scenarios or save files) by expanding the "Load" menu in the top left corner and selecting the middle tab. Then simply search for a partial name of the scenario you want to load and click on the result.



## Loading character cards

Supports both v1 and v2 chara specs.
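
For the curious: v1 and v2 character cards are ordinary PNG images with the character data stored as base64-encoded JSON in a `chara` text chunk. The sketch below (illustrative only, not Talemate's importer; the filename is hypothetical) shows how that payload can be read with Pillow:

```python
import base64
import json

from PIL import Image


def read_character_card(path: str) -> dict:
    """Extract the embedded character JSON from a v1/v2 character card PNG."""
    image = Image.open(path)
    # v1/v2 cards store the payload in a PNG text chunk named "chara".
    raw = image.text.get("chara") or image.info.get("chara")
    if raw is None:
        raise ValueError("No 'chara' chunk found - is this a character card?")
    return json.loads(base64.b64decode(raw))


card = read_character_card("my-character.png")
print(card.get("spec", "v1"), card.get("name") or card.get("data", {}).get("name"))
```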

Expand the "Load" menu in the top left corner and either click on "Upload a character card" or simply drag and drop a character card file into the same area.



Once a character is uploaded, talemate may take a moment, because it needs to convert the card to a talemate format and will also run additional LLM prompts to generate character attributes and world state.

Make sure you save the scene after the character is loaded, as it can then be loaded as a normal talemate scenario in the future.

- [Installation and Getting started](https://vegu-ai.github.io/talemate/)
- [User Guide](https://vegu-ai.github.io/talemate/user-guide/interacting/)
@@ -7,40 +7,6 @@ creator:
  - a thrilling action story
  - a mysterious adventure
  - an epic sci-fi adventure
game:
  world_state:
    templates:
      state_reinforcement:
        Goals:
          auto_create: false
          description: Long term and short term goals
          favorite: true
          insert: conversation-context
          instructions: Create a long term goal and two short term goals for {character_name}. Your response must only be the long terms and two short term goals.
          interval: 20
          name: Goals
          query: Goals
          state_type: npc
        Physical Health:
          auto_create: false
          description: Keep track of health.
          favorite: true
          insert: sequential
          instructions: ''
          interval: 10
          name: Physical Health
          query: What is {character_name}'s current physical health status?
          state_type: character
        Time of day:
          auto_create: false
          description: Track night / day cycle
          favorite: true
          insert: sequential
          instructions: ''
          interval: 10
          name: Time of day
          query: What is the current time of day?
          state_type: world

## Long-term memory

@@ -23,5 +23,5 @@ services:
      dockerfile: Dockerfile.frontend
    ports:
      - "8080:8080"
    volumes:
      - ./talemate_frontend:/app
    #volumes:
    #  - ./talemate_frontend:/app

docs/.pages (new file, 5 lines)
@@ -0,0 +1,5 @@
nav:
  - Home: index.md
  - Getting started: getting-started
  - User guide: user-guide
  - Developer guide: dev
docs/dev/index.md (new file, 3 lines)
@@ -0,0 +1,3 @@
# Coming soon

Developer documentation is coming soon. Stay tuned!
@@ -1,4 +1,7 @@
# Template Overrides in Talemate
# Template Overrides

!!! warning "Old documentation"
    This is old documentation and needs to be updated; however, it may still contain useful information.

## Introduction to Templates

@@ -23,9 +26,9 @@ The creator agent templates allow for the creation of new characters within the

### Example Templates

- [Character Attributes Human Template](src/talemate/prompts/templates/creator/character-attributes-human.jinja2)
- [Character Details Human Template](src/talemate/prompts/templates/creator/character-details-human.jinja2)
- [Character Example Dialogue Human Template](src/talemate/prompts/templates/creator/character-example-dialogue-human.jinja2)
- `src/talemate/prompts/templates/creator/character-attributes-human.jinja2`
- `src/talemate/prompts/templates/creator/character-details-human.jinja2`
- `src/talemate/prompts/templates/creator/character-example-dialogue-human.jinja2`

These example templates can serve as a guide for users to create their own custom templates for the character creator.
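
Conceptually, a template override is ordinary Jinja2 loader precedence: a template with the same filename in the user override directory wins over the built-in one. The sketch below illustrates that idea with Jinja2's `ChoiceLoader`; it is not Talemate's actual loader code, and the directory paths are taken from the repository layout purely as an example:

```python
from jinja2 import ChoiceLoader, Environment, FileSystemLoader

# Directories searched in order: user overrides first, built-in templates second.
env = Environment(
    loader=ChoiceLoader([
        FileSystemLoader("templates/llm-prompt/user"),
        FileSystemLoader("src/talemate/prompts/templates/creator"),
    ])
)

# If "character-attributes-human.jinja2" exists in the user directory,
# that copy is rendered; otherwise the built-in template is used.
template = env.get_template("character-attributes-human.jinja2")
print(template.render(character_name="Elodie"))
```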

docs/getting-started/.pages (new file, 5 lines)
@@ -0,0 +1,5 @@
nav:
  - 1. Installation: installation
  - 2. Connect a client: connect-a-client.md
  - 3. Load a scene: load-a-scene.md
  - ...
docs/getting-started/connect-a-client.md (new file, 68 lines)
@@ -0,0 +1,68 @@
# Connect a client

Once Talemate is up and running and you are connected, you will see a notification in the corner instructing you to configure a client.



Talemate uses client(s) to connect to local or remote AI text generation APIs like koboldcpp, text-generation-webui, or OpenAI.

## Add a new client

On the right-hand side, click the **:material-plus-box: ADD CLIENT** button.



!!! note "No button?"
    If there is no button, you may need to toggle the client options by clicking this button

    

The client configuration window will appear. Here you can choose the type of client you want to add.



## Choose an API / Client Type

We have support for multiple local and remote APIs. You can choose to use one or more of them.

!!! note "Local vs remote"
    A local API runs on your machine, while a remote API runs on a server somewhere else.

Select the API you want to use and click through to follow the instructions to configure a client for it:

##### Remote APIs

- [OpenAI](/talemate/user-guide/clients/types/openai/)
- [Anthropic](/talemate/user-guide/clients/types/anthropic/)
- [mistral.ai](/talemate/user-guide/clients/types/mistral/)
- [Cohere](/talemate/user-guide/clients/types/cohere/)
- [Groq](/talemate/user-guide/clients/types/groq/)
- [Google Gemini](/talemate/user-guide/clients/types/google/)

##### Local APIs

- [KoboldCpp](/talemate/user-guide/clients/types/koboldcpp/)
- [Text-Generation-WebUI](/talemate/user-guide/clients/types/text-generation-webui/)
- [LMStudio](/talemate/user-guide/clients/types/lmstudio/)
- [TabbyAPI](/talemate/user-guide/clients/types/tabbyapi/)

##### Unofficial OpenAI API implementations

- [DeepInfra](/talemate/user-guide/clients/types/openai-compatible/#deepinfra)
- llamacpp with the `api_like_OAI.py` wrapper

## Assign the client to the agents

Whenever you add your first client, Talemate will automatically assign it to all agents. Once the client is configured and assigned, all agents should have a green dot next to them (or grey if the agent is currently disabled).



You can tell the client is assigned to the agent by checking the tag beneath the agent name, which will contain the client name if it is assigned.



## It's not assigned!

If for some reason the client is not assigned to the agent, you can manually assign it to all agents by clicking the **:material-transit-connection-variant: Assign to all agents** button.


docs/getting-started/installation/.pages (new file, 5 lines)
@@ -0,0 +1,5 @@
nav:
  - windows.md
  - linux.md
  - docker.md
  - ...
docs/getting-started/installation/docker.md (new file, 17 lines)
@@ -0,0 +1,17 @@
!!! example "Experimental"
    Talemate through Docker has not received a lot of testing from me, so please let me know if you encounter any issues.

    You can do so by creating an issue on the [:material-github: GitHub repository](https://github.com/vegu-ai/talemate)

## Quick install instructions

1. `git clone https://github.com/vegu-ai/talemate.git`
1. `cd talemate`
1. Copy the config file
    1. Linux: `cp config.example.yaml config.yaml`
    1. Windows: `copy config.example.yaml config.yaml`
1. `docker compose up`
1. Navigate your browser to http://localhost:8080

!!! note
    When connecting to local APIs running on the host machine (e.g. text-generation-webui), you need to use `host.docker.internal` as the hostname.
@@ -1,3 +1,19 @@

## Quick install instructions

!!! warning
    python 3.12 and node.js v21 are currently not supported.

1. `git clone https://github.com/vegu-ai/talemate.git`
1. `cd talemate`
1. `source install.sh`
1. Start the backend: `python src/talemate/server/run.py runserver --host 0.0.0.0 --port 5050`.
1. Open a new terminal, navigate to the `talemate_frontend` directory, and start the frontend server by running `npm run serve`.

If everything went well, you can proceed to [connect a client](../../connect-a-client).

## Additional Information

### Setting Up a Virtual Environment

1. Open a terminal.
docs/getting-started/installation/troubleshoot.md (new file, 28 lines)
@@ -0,0 +1,28 @@
# Common issues

## Windows

### Installation fails with "Microsoft Visual C++" error

If your installation fails with a notification to upgrade "Microsoft Visual C++", go to https://visualstudio.microsoft.com/visual-cpp-build-tools/, click "Download Build Tools", and run it.

- During installation make sure you select the C++ development package (upper left corner)
- Run `reinstall.bat` inside the talemate directory

## Docker

### Docker has created a `config.yaml` directory

If you do not copy the example config to `config.yaml` before running `docker compose up`, Docker will create a `config.yaml` directory in the root of the project. This will cause the backend to fail to start.

This happens because we mount the config file directly as a Docker volume, and if it does not exist, Docker will create a directory with the same name.

This will eventually be fixed; for now, please make sure to copy the example config file before running the docker compose command.
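
If you are unsure whether you have hit this, a tiny pre-flight check helps. This is purely illustrative and assumes you run it from the repository root before `docker compose up`:

```python
from pathlib import Path

config = Path("config.yaml")

if config.is_dir():
    print("config.yaml is a directory - remove it, then copy config.example.yaml to config.yaml.")
elif not config.exists():
    print("config.yaml is missing - copy config.example.yaml to config.yaml before starting Docker.")
else:
    print("config.yaml looks fine.")
```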

## General

### Running behind a reverse proxy with SSL

Personally, I have not been able to make this work yet, but it's on my list. The issue stems from some Vue oddities when specifying the base URLs while running in a dev environment. I expect this will be resolved once I start building the project for production.

If you do make it work, please reach out to me so I can update this documentation.
@@ -1,16 +1,31 @@
### How to Install Python 3.10
## Quick install instructions

1. Visit the official Python website's download page for Windows at https://www.python.org/downloads/windows/.
!!! warning
    python 3.12 and node.js v21 are currently not supported.

1. Download and install Python 3.10 or Python 3.11 from the [official Python website](https://www.python.org/downloads/windows/).
1. Download and install Node.js v20 from the [official Node.js website](https://nodejs.org/en/download/). This will also install npm.
1. Download the Talemate project to your local machine. Download from [the Releases page](https://github.com/vegu-ai/talemate/releases).
1. Unpack the download and run `install.bat` by double-clicking it. This will set up the project on your local machine.
1. Once the installation is complete, you can start the backend and frontend servers by running `start.bat`.
1. Navigate your browser to http://localhost:8080

If everything went well, you can proceed to [connect a client](../../connect-a-client).

## Additional Information

### How to Install Python 3.10 or 3.11

1. Visit the official Python website's download page for Windows at [https://www.python.org/downloads/windows/](https://www.python.org/downloads/windows/).
2. Click on the link for the Latest Python 3 Release - Python 3.10.x.
3. Scroll to the bottom and select either Windows x86-64 executable installer for 64-bit or Windows x86 executable installer for 32-bit.
4. Run the installer file and follow the setup instructions. Make sure to check the box that says Add Python 3.10 to PATH before you click Install Now.

### How to Install npm

1. Download Node.js from the official site https://nodejs.org/en/download/.
1. Download Node.js from the official site [https://nodejs.org/en/download/](https://nodejs.org/en/download/).
2. Run the installer (the .msi installer is recommended).
3. Follow the prompts in the installer (accept the license agreement, click the NEXT button a bunch of times, and accept the default installation settings).
4. Restart your computer. You won't be able to run Node.js® until you restart your computer.

### Usage of the Supplied bat Files
docs/getting-started/load-a-scene.md (new file, 54 lines)
@@ -0,0 +1,54 @@
# Load a scenario

Once you've set up a client and assigned it to all the agents, you will be presented with the `Home` screen. From here, you can load talemate scenarios and upload character cards.

To load the introductory `Infinity Quest` scenario, simply click on its entry in the `Quick Load` section.



## Interacting with the scenario

After a moment of loading, you will see the scenario's introductory message and be able to send a text interaction.



It's time to send the first message.

Spoken words should be wrapped in `"` and actions in `*`. If you supply one, Talemate will automatically add the other.



Once sent, it's the AI's turn to respond - depending on the service and model selected, this can take a moment.



## Quick overview of UI elements

### Scenario tools

Above the chat input there is a set of tools to help you interact with the scenario.



These contain tools to, for example:

- regenerate the most recent AI response
- give directions to characters
- narrate the scene
- advance time
- save the current scene state
- and more ...

A full guide can be found in the [Scenario Tools](/talemate/user-guide/scenario-tools) section of the user guide.

### World state

Shows a summarization of the current scene state.



Each item can be expanded for more information.



Find out more about the world state in the [World State](/talemate/user-guide/world-state) section of the user guide.
BIN docs/img/0.26.0/agent-disabled.png (new file, 1.1 KiB)
BIN docs/img/0.26.0/agent-enabled.png (new file, 1.6 KiB)
BIN docs/img/0.26.0/agent-has-client-assigned.png (new file, 1.1 KiB)
BIN docs/img/0.26.0/anthropic-settings.png (new file, 43 KiB)
BIN docs/img/0.26.0/auto-progress-off.png (new file, 1.4 KiB)
BIN docs/img/0.26.0/autosave-blocked.png (new file, 1.3 KiB)
BIN docs/img/0.26.0/autosave-disabled.png (new file, 1.2 KiB)
BIN docs/img/0.26.0/autosave-enabled.png (new file, 1.3 KiB)
BIN docs/img/0.26.0/client-anthropic-no-api-key.png (new file, 8.7 KiB)
BIN docs/img/0.26.0/client-anthropic-ready.png (new file, 8.0 KiB)
BIN docs/img/0.26.0/client-anthropic.png (new file, 22 KiB)
BIN docs/img/0.26.0/client-assigned-prompt-template.png (new file, 46 KiB)
BIN docs/img/0.26.0/client-cohere-no-api-key.png (new file, 8.4 KiB)
BIN docs/img/0.26.0/client-cohere-ready.png (new file, 7.0 KiB)
BIN docs/img/0.26.0/client-cohere.png (new file, 20 KiB)
BIN docs/img/0.26.0/client-deepinfra-ready.png (new file, 8.6 KiB)
BIN docs/img/0.26.0/client-deepinfra.png (new file, 78 KiB)
BIN docs/img/0.26.0/client-google-creds-missing.png (new file, 9.3 KiB)
BIN docs/img/0.26.0/client-google-ready.png (new file, 7.9 KiB)
BIN docs/img/0.26.0/client-google.png (new file, 26 KiB)
BIN docs/img/0.26.0/client-groq-no-api-key.png (new file, 8.1 KiB)
BIN docs/img/0.26.0/client-groq-ready.png (new file, 6.9 KiB)
BIN docs/img/0.26.0/client-groq.png (new file, 20 KiB)
BIN docs/img/0.26.0/client-hibernate-1.png (new file, 19 KiB)
BIN docs/img/0.26.0/client-hibernate-2.png (new file, 10 KiB)
BIN docs/img/0.26.0/client-koboldcpp-could-not-connect.png (new file, 7.5 KiB)
BIN docs/img/0.26.0/client-koboldcpp-ready.png (new file, 7.8 KiB)
BIN docs/img/0.26.0/client-koboldcpp.png (new file, 28 KiB)
BIN docs/img/0.26.0/client-lmstudio-could-not-connect.png (new file, 7.3 KiB)
BIN docs/img/0.26.0/client-lmstudio-ready.png (new file, 7.7 KiB)
BIN docs/img/0.26.0/client-lmstudio.png (new file, 27 KiB)
BIN docs/img/0.26.0/client-mistral-no-api-key.png (new file, 8.6 KiB)
BIN docs/img/0.26.0/client-mistral-ready.png (new file, 7.6 KiB)
BIN docs/img/0.26.0/client-mistral.png (new file, 21 KiB)
BIN docs/img/0.26.0/client-ooba-could-not-connect.png (new file, 8.2 KiB)
BIN docs/img/0.26.0/client-ooba-no-model-loaded.png (new file, 8.1 KiB)
BIN docs/img/0.26.0/client-ooba-ready.png (new file, 9.2 KiB)
BIN docs/img/0.26.0/client-ooba.png (new file, 29 KiB)
BIN docs/img/0.26.0/client-openai-no-api-key.png (new file, 8.5 KiB)
BIN docs/img/0.26.0/client-openai-ready.png (new file, 6.7 KiB)
BIN docs/img/0.26.0/client-openai.png (new file, 20 KiB)
BIN docs/img/0.26.0/client-tabbyapi-could-not-connect.png (new file, 7.4 KiB)
BIN docs/img/0.26.0/client-tabbyapi-ready.png (new file, 8.3 KiB)
BIN docs/img/0.26.0/client-tabbyapi.png (new file, 39 KiB)
BIN docs/img/0.26.0/client-unknown-prompt-template-modal.png (new file, 49 KiB)
BIN docs/img/0.26.0/client-unknown-prompt-template.png (new file, 18 KiB)
BIN docs/img/0.26.0/cohere-settings.png (new file, 42 KiB)
BIN docs/img/0.26.0/connect-a-client-add-client-modal.png (new file, 29 KiB)
BIN docs/img/0.26.0/connect-a-client-add-client.png (new file, 7.1 KiB)
BIN docs/img/0.26.0/connect-a-client-assign-to-all-agents.png (new file, 10 KiB)
BIN docs/img/0.26.0/connect-a-client-ready.png (new file, 43 KiB)
BIN docs/img/0.26.0/conversation-agent-settings.png (new file, 64 KiB)
BIN docs/img/0.26.0/create-new-scene-test.png (new file, 246 KiB)
BIN docs/img/0.26.0/create-new-scene.png (new file, 3.4 KiB)
BIN docs/img/0.26.0/director-agent-settings.png (new file, 46 KiB)
BIN docs/img/0.26.0/editor-agent-settings.png (new file, 54 KiB)
BIN docs/img/0.26.0/elevenlabs-ready.png (new file, 3.1 KiB)
BIN docs/img/0.26.0/elevenlabs-settings-enabled.png (new file, 26 KiB)
BIN docs/img/0.26.0/elevenlabs-settings.png (new file, 46 KiB)
BIN docs/img/0.26.0/elevenlabs-voice-selection.png (new file, 26 KiB)
BIN docs/img/0.26.0/getting-started-first-ai-response.png (new file, 111 KiB)
BIN docs/img/0.26.0/getting-started-first-interaction.png (new file, 27 KiB)
BIN docs/img/0.26.0/getting-started-load-screen.png (new file, 634 KiB)
BIN docs/img/0.26.0/getting-started-scene-1.png (new file, 441 KiB)
BIN docs/img/0.26.0/getting-started-ui-element-tools.png (new file, 5.6 KiB)
BIN docs/img/0.26.0/getting-started-world-state-1.png (new file, 289 KiB)
BIN docs/img/0.26.0/getting-started-world-state-2.png (new file, 14 KiB)
BIN docs/img/0.26.0/google-settings.png (new file, 67 KiB)
BIN docs/img/0.26.0/groq-settings.png (new file, 41 KiB)
BIN docs/img/0.26.0/inference-presets-1.png (new file, 82 KiB)
BIN docs/img/0.26.0/interacting-input-act-as-character.png (new file, 5.6 KiB)
BIN docs/img/0.26.0/interacting-input-act-as-narrator.png (new file, 5.8 KiB)
BIN docs/img/0.26.0/interacting-input-request.png (new file, 5.4 KiB)
BIN docs/img/0.26.0/mistral-settings.png (new file, 42 KiB)
BIN docs/img/0.26.0/narrator-agent-settings.png (new file, 45 KiB)
BIN docs/img/0.26.0/no-api-token.png (new file, 1.0 KiB)
BIN docs/img/0.26.0/no-clients.png (new file, 10 KiB)
BIN docs/img/0.26.0/open-clients.png (new file, 1.5 KiB)
BIN docs/img/0.26.0/open-settings.png (new file, 1.4 KiB)
BIN docs/img/0.26.0/openai-settings.png (new file, 42 KiB)
BIN docs/img/0.26.0/runpod-api-key-setting.png (new file, 44 KiB)
BIN docs/img/0.26.0/scene-save.png (new file, 11 KiB)
BIN docs/img/0.26.0/scene-tool-character-actions.png (new file, 13 KiB)