* track time passage in scene using iso 8601 format
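
A minimal sketch of what ISO 8601 time-passage tracking can look like, assuming duration strings such as `PT30M` / `P1D` and the third-party `isodate` package; the function name is hypothetical and not necessarily the project's actual API:

```python
import isodate

def advance_scene_time(elapsed: str, passed: str) -> str:
    """Add an ISO 8601 duration (e.g. 'PT30M') to the scene's elapsed-time
    string and return the new total, again as an ISO 8601 duration."""
    total = isodate.parse_duration(elapsed) + isodate.parse_duration(passed)
    return isodate.duration_isoformat(total)

# advance_scene_time("P1D", "PT6H") -> "P1DT6H"
```
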
* chromadb openai instructions
model recommendations updated
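
For context, a generic sketch of wiring OpenAI embeddings into ChromaDB using its built-in embedding function; the collection name and embedding model below are examples only, not the documented recommendations:

```python
import chromadb
from chromadb.utils import embedding_functions

# Example: OpenAI embeddings backing a ChromaDB collection.
openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key="sk-...",                     # your OpenAI API key
    model_name="text-embedding-ada-002",  # example model; see the docs for current recommendations
)

client = chromadb.PersistentClient(path="./chromadb")
collection = client.get_or_create_collection(
    name="long_term_memory",              # example collection name
    embedding_function=openai_ef,
)
```
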
* time context passed to long term memory
* add some pre-established history for testing purposes
* time passage
analyze dialogue to template
query_text template function
analyze text and answer question summarizer function
llm prompt template adjustments
iso8601 time utils
chromadb docs adjustments
* didn't mean to remove this
* fix ClientContext stacking
* conversation cleanup tweaks
* prompt prepared response padding
* fix some bugs causing conversation lines containing : to be terminated early
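
An illustrative (not the actual) fix for that class of bug: split a conversation line into speaker and dialogue on the first colon only, so dialogue text that itself contains `:` is no longer cut short.

```python
def split_speaker_line(line: str) -> tuple[str, str]:
    # Split on the FIRST colon only; everything after it stays in the dialogue,
    # so text like "meet me at 10:30" is not truncated.
    speaker, _, dialogue = line.partition(":")
    return speaker.strip(), dialogue.strip()

# split_speaker_line("Alice: meet me at 10:30 by the gate")
# -> ("Alice", "meet me at 10:30 by the gate")
```
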
* fixes issue with chara importing dialogue examples as a huge blob instead of splitting into lines
dialogue example in conversation template randomized
* llm prompt template for Speechless-Llama2-Hermes-Orca-Platypus-WizardLM
* version to 0.10.0
* fixes #2: character creator description generation will not honor changes to the content context
* decrease output of base attribute generation from 2-3 sentences to 1-2 sentences
* conversation agent tweaks
set other character names as stopping strings via client context
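
An illustrative sketch of the stopping-strings idea (names here are hypothetical, not the project's real client-context API): the other characters' name prefixes are handed to the LLM client as stop sequences, so generation halts before the model starts speaking for someone else.

```python
def stopping_strings(all_characters: list[str], current_speaker: str) -> list[str]:
    # Stop generation as soon as the model tries to open another character's line.
    return [f"{name}:" for name in all_characters if name != current_speaker]

# stopping_strings(["Alice", "Bob", "Narrator"], "Alice") -> ["Bob:", "Narrator:"]
```
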
* xwin llm template
* conversation template tweaks
* fixes #6: agent busy status not always reflected in ux
* conversation min response length requirement reduced
include character base details with conversation prompt
* fixes #4: Prompt log
* reset prompt log on scene load
openai token counts shown as ? for now
* version to 0.9.0