Firmament-cyou | a0fe5947b7 | 2024-03-05 10:05:18 +08:00
Support stream_generate for LLMPipeline (#768)
* support streaming output for llm_pipeline
* add qwen2 format_messages

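The streaming commit (#768) changes the pipeline from returning one finished string to yielding output incrementally. A minimal, self-contained sketch of that pattern follows; the function name, signature, and canned output here are illustrative stand-ins, not ModelScope's actual `stream_generate` API:

```python
from typing import Iterator

def stream_generate(prompt: str, max_new_tokens: int = 8) -> Iterator[str]:
    """Yield output tokens one at a time instead of one final string.

    Toy stand-in: a real LLM pipeline would run the model's incremental
    decode step on each iteration instead of reading a canned list.
    """
    fake_output = ["Hello", ",", " world", "!"]  # placeholder model output
    for token in fake_output[:max_new_tokens]:
        yield token

# Callers consume the stream as tokens arrive, e.g. to render partial
# responses in a chat UI, then join the chunks for the full text.
chunks = list(stream_generate("hi"))
text = "".join(chunks)
```

The payoff of the generator shape is that the caller sees the first token as soon as it is decoded, rather than waiting for the whole generation to finish.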
Firmament-cyou | 8f0f9d4a33 | 2023-12-06 16:22:50 +08:00
Register llm format map (#659)
* add LLMAdapterRegistry
* fix bug
* replace traceback with cache

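The registry commit (#659) maps each model family to the adapter that knows its prompt format. A minimal sketch of that registry pattern, assuming a decorator-based register/build flow; the class name echoes the commit's `LLMAdapterRegistry`, but the methods and the adapter shown are hypothetical, not the actual ModelScope implementation:

```python
class LLMAdapterRegistry:
    """Map a model-type key to the adapter class that formats its prompts."""

    _adapters = {}

    @classmethod
    def register(cls, model_type):
        # Used as a decorator so each adapter registers itself at import time.
        def decorator(adapter_cls):
            cls._adapters[model_type] = adapter_cls
            return adapter_cls
        return decorator

    @classmethod
    def build(cls, model_type):
        if model_type not in cls._adapters:
            raise KeyError(f"no adapter registered for {model_type!r}")
        return cls._adapters[model_type]()

@LLMAdapterRegistry.register("chatglm3")
class ChatGLM3Adapter:
    # Illustrative formatting only; the real chatglm3 template differs.
    def format_messages(self, messages):
        return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

adapter = LLMAdapterRegistry.build("chatglm3")
prompt = adapter.format_messages([{"role": "user", "content": "hi"}])
```

The design choice here is that the pipeline can stay model-agnostic: adding support for a new model family (as later commits do for chatglm3) only requires registering a new adapter, not editing the pipeline itself.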
Firmament-cyou | fafb0fe013 | 2023-11-04 20:44:55 +08:00
llm pipeline support chatglm3 (#618)

hemu.zp | 4cad376298 | 2023-10-13 14:04:04 +08:00
Add llm_first parameter for pipeline
Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/14264249
* support llm_first parameter
* register_module(Tasks.text_generation)
* fix bug
* update format & fix out_base64 for int4
* pre-commit

hemu.zp | d64cfa48bc | 2023-10-09 16:16:33 +08:00
Support int4 model for llm_pipeline
Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/14214673
* use ms qwen update
* support baichuan2 int4
* update qwen vl unittest and fix bug
* init from AutoModelForCausalLM
* add todo for AutoModelForCausalLM

狄咖 | 74abc2e63f | 2023-09-28 00:54:41 +08:00
[feat]chat pipeline