
fix: xinference-chat-stream-response (#991)

takatost · 1 year ago · commit bd3a9b2f8d
1 file changed, 6 insertions(+), 1 deletion(-)

api/core/third_party/langchain/llms/xinference_llm.py (+6 −1)

@@ -123,7 +123,12 @@ class XinferenceLLM(Xinference):
                 if choices:
                     choice = choices[0]
                     if isinstance(choice, dict):
-                        token = choice.get("text", "")
+                        if 'text' in choice:
+                            token = choice.get("text", "")
+                        elif 'delta' in choice and 'content' in choice['delta']:
+                            token = choice.get('delta').get('content')
+                        else:
+                            continue
                         log_probs = choice.get("logprobs")
                         if run_manager:
                             run_manager.on_llm_new_token(
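
The patch handles the two shapes a streamed choice can take: completion-style chunks carry the token under `"text"`, while chat-style chunks nest it under `"delta"["content"]`; chunks with neither key (e.g. a role-only delta) are skipped. A minimal standalone sketch of that branching — `extract_token` is a hypothetical helper, not part of the actual commit:

```python
from typing import Optional


def extract_token(choice: dict) -> Optional[str]:
    """Return the token text from a streamed choice dict, or None to skip.

    Mirrors the patched branch: prefer completion-style "text", fall back
    to chat-style "delta.content", otherwise signal the caller to continue.
    """
    if "text" in choice:
        return choice.get("text", "")
    if "delta" in choice and "content" in choice["delta"]:
        return choice["delta"]["content"]
    return None


# Completion-style chunk
assert extract_token({"text": "hello"}) == "hello"
# Chat-style chunk
assert extract_token({"delta": {"content": "hi"}}) == "hi"
# Role-only delta: no token, caller should continue
assert extract_token({"delta": {"role": "assistant"}}) is None
```

Returning `None` instead of an empty string lets the streaming loop distinguish "no token in this chunk" (skip it) from a legitimately empty token.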