@@ -4,7 +4,7 @@ description: 'Collections of all the frequently asked questions'
---
<AccordionGroup>
<Accordion title="Does Embedchain support OpenAI's Assistant APIs?">
-Yes, it does. Please refer to the [OpenAI Assistant docs page](/get-started/openai-assistant).
+Yes, it does. Please refer to the [OpenAI Assistant docs page](/examples/openai-assistant).
</Accordion>
<Accordion title="How to use MistralAI language model?">
Use the model provided on huggingface: `mistralai/Mistral-7B-v0.1`
@@ -116,6 +116,36 @@ embedder:
```
</CodeGroup>
+</Accordion>
+<Accordion title="How to stream the response while using an OpenAI model in Embedchain?">
+You can enable streaming by setting `stream` to `true` in the config file.
+
+<CodeGroup>
+```yaml openai.yaml
+llm:
+ provider: openai
+ config:
+ model: 'gpt-3.5-turbo'
+ temperature: 0.5
+ max_tokens: 1000
+ top_p: 1
+ stream: true
+```
+
+```python main.py
+import os
+from embedchain import Pipeline as App
+
+os.environ['OPENAI_API_KEY'] = 'sk-xxx'
+
+app = App.from_config(config_path="openai.yaml")
+
+app.add("https://www.forbes.com/profile/elon-musk")
+
+response = app.query("What is the net worth of Elon Musk?")
+# The response is streamed to stdout as it is generated.
+```
+</CodeGroup>
</Accordion>
</AccordionGroup>