
feat: add streaming support for OpenSource App (#217)

aaishikdutta 2 years ago
parent commit ae87dc4a6d
2 changed files with 3 additions and 2 deletions
  1. README.md (+1 −1)
  2. embedchain/embedchain.py (+2 −1)

README.md (+1 −1)

@@ -224,7 +224,7 @@ print(naval_chat_bot.chat("what did the author say about happiness?"))
 
 ### Stream Response
 
-- You can add config to your query method to stream responses like ChatGPT does. You would require a downstream handler to render the chunk in your desirable format. Currently only supports OpenAI model.
+- You can add config to your query method to stream responses like ChatGPT does. You would require a downstream handler to render each chunk in your desired format. Supports both the OpenAI model and OpenSourceApp.
 
 - To use this, instantiate a `QueryConfig` or `ChatConfig` object with `stream=True`. Then pass it to the `.chat()` or `.query()` method. The following example iterates through the chunks and prints them as they appear.
 
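The README hunk above describes passing `stream=True` in a `QueryConfig`/`ChatConfig` and iterating chunks with a downstream handler. A minimal sketch of that consumption pattern, using a hypothetical `fake_stream` generator as a stand-in for the library's streamed response so the example is self-contained:

```python
from typing import Iterator


def fake_stream(answer: str, chunk_size: int = 8) -> Iterator[str]:
    """Stand-in for a streaming LLM response: yields the answer in chunks."""
    for i in range(0, len(answer), chunk_size):
        yield answer[i:i + chunk_size]


def render_stream(chunks: Iterator[str]) -> str:
    """Downstream handler: print chunks as they arrive, return the full text."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)  # render incrementally, ChatGPT-style
        parts.append(chunk)
    print()
    return "".join(parts)


full = render_stream(fake_stream("The author says happiness is a choice."))
```

With the real library, the iterable would come from `.query()` or `.chat()` called with a streaming config instead of `fake_stream`.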

embedchain/embedchain.py (+2 −1)

@@ -350,7 +350,7 @@ class OpenSourceApp(EmbedChain):
         print("Successfully loaded open source embedding model.")
         super().__init__(config)
 
-    def get_llm_model_answer(self, prompt):
+    def get_llm_model_answer(self, prompt, config: ChatConfig):
         from gpt4all import GPT4All
 
         global gpt4all_model
@@ -358,6 +358,7 @@ class OpenSourceApp(EmbedChain):
             gpt4all_model = GPT4All("orca-mini-3b.ggmlv3.q4_0.bin")
         response = gpt4all_model.generate(
             prompt=prompt,
+            streaming=config.stream
         )
         return response
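The patch threads `config.stream` into GPT4All's `generate(streaming=...)`; with streaming enabled, `generate` yields tokens incrementally rather than returning one string. A sketch of that switch, assuming a hypothetical `StubModel` in place of `GPT4All` so the example runs without the model weights:

```python
from dataclasses import dataclass
from typing import Iterator, Union


@dataclass
class ChatConfig:
    """Minimal stand-in for embedchain's ChatConfig: only the stream flag."""
    stream: bool = False


class StubModel:
    """Stand-in for GPT4All: generate() mirrors its streaming switch."""

    def generate(self, prompt: str, streaming: bool = False):
        tokens = ["Hello", ", ", "world", "!"]
        if streaming:
            return iter(tokens)   # iterator of chunks, like streaming mode
        return "".join(tokens)    # full answer in one string


def get_llm_model_answer(prompt: str, config: ChatConfig) -> Union[str, Iterator[str]]:
    """Return a full string, or an iterator of chunks when config.stream is set."""
    model = StubModel()
    return model.generate(prompt=prompt, streaming=config.stream)
```

The caller decides how to consume the result: print it directly when `stream=False`, or loop over the chunks with a handler when `stream=True`.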