app:
  config:
    id: 'default-app'

llm:
  provider: openai
  config:
    model: 'gpt-3.5-turbo'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
    template: |
      Use the following pieces of context to answer the query at the end.
      If you don't know the answer, just say that you don't know, don't try to make up an answer.

      $context

      Query: $query

      Helpful Answer:

vectordb:
  provider: chroma
  config:
    collection_name: 'rest-api-app'
    dir: db
    allow_reset: true

embedder:
  provider: openai
  config:
    model: 'text-embedding-ada-002'
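
A minimal sketch of how this configuration might be consumed, assuming the YAML above is saved as config.yaml (the file name is an assumption) and PyYAML is installed. The llm template's $context and $query placeholders use the same $-style substitution syntax as Python's string.Template, so the prompt can be rendered as shown; the example context and query values are hypothetical.

from string import Template

import yaml

# Load the configuration file (config.yaml is an assumed file name).
with open("config.yaml") as f:
    config = yaml.safe_load(f)

# Pull the prompt template defined under llm.config.template.
prompt_template = Template(config["llm"]["config"]["template"])

# Substitute hypothetical retrieved context and a user query into the prompt.
prompt = prompt_template.substitute(
    context="Chroma persists the 'rest-api-app' collection in the db directory.",
    query="Where is the vector data stored?",
)
print(prompt)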