---
title: ❓ FAQs
description: 'Collection of all the frequently asked questions'
---

#### Does Embedchain support OpenAI's Assistant APIs?

Yes, it does. Please refer to the [OpenAI Assistant docs page](/get-started/openai-assistant).

#### How do I use the `mistralai/Mistral-7B-v0.1` model?

Use the model provided on Hugging Face: `mistralai/Mistral-7B-v0.1`.

```python main.py
import os
from embedchain import Pipeline as App

os.environ["OPENAI_API_KEY"] = "sk-xxx"
os.environ["HUGGINGFACE_ACCESS_TOKEN"] = "hf_your_token"

# load llm configuration from huggingface.yaml file
app = App.from_config(config_path="huggingface.yaml")
```

```yaml huggingface.yaml
llm:
  provider: huggingface
  config:
    model: 'mistralai/Mistral-7B-v0.1'
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
```

#### How do I use the `gpt-4-turbo` model?

Use the model `gpt-4-turbo` provided by OpenAI.

```python main.py
import os
from embedchain import Pipeline as App

os.environ['OPENAI_API_KEY'] = 'xxx'

# load llm configuration from gpt4_turbo.yaml file
app = App.from_config(config_path="gpt4_turbo.yaml")
```

```yaml gpt4_turbo.yaml
llm:
  provider: openai
  config:
    model: 'gpt-4-turbo'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
```

#### How do I use GPT-4 as the LLM?

Use the model `gpt-4` provided by OpenAI.

```python main.py
import os
from embedchain import Pipeline as App

os.environ['OPENAI_API_KEY'] = 'xxx'

# load llm configuration from gpt4.yaml file
app = App.from_config(config_path="gpt4.yaml")
```

```yaml gpt4.yaml
llm:
  provider: openai
  config:
    model: 'gpt-4'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
```

#### I don't have OpenAI credits. How can I use an open-source model?

Use an open-source LLM and embedding model via the `gpt4all` provider.

```python main.py
import os
from embedchain import Pipeline as App

# load llm configuration from opensource.yaml file
app = App.from_config(config_path="opensource.yaml")
```

```yaml opensource.yaml
llm:
  provider: gpt4all
  config:
    model: 'orca-mini-3b-gguf2-q4_0.gguf'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false

embedder:
  provider: gpt4all
  config:
    model: 'all-MiniLM-L6-v2'
```

#### Still have questions?

If docs aren't sufficient, please feel free to reach out to us using one of the following methods: