full-stack.yaml

app:  # application identity
  config:
    id: 'full-stack-app'

llm:  # language model used to answer queries
  provider: openai
  model: 'gpt-3.5-turbo'
  config:
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
    template: |
      Use the following pieces of context to answer the query at the end.
      If you don't know the answer, just say that you don't know, don't try to make up an answer.

      $context

      Query: $query

      Helpful Answer:
    system_prompt: |
      Act as William Shakespeare. Answer the following questions in the style of William Shakespeare.

vectordb:  # vector database where document embeddings are stored
  provider: chroma
  config:
    collection_name: 'my-collection-name'
    dir: db
    allow_reset: true

embedder:  # embedding model used to vectorize added data
  provider: openai
  config:
    model: 'text-embedding-ada-002'
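
The layout above (app, llm, vectordb, and embedder blocks, each with a provider and a config) matches Embedchain's YAML configuration format, so the sketch below shows one way the file might be loaded and queried. Treat it as an assumption about the library and version in use: the embedchain package, the App.from_config helper, and its config_path keyword (older releases used yaml_path) come from that assumption rather than from this file.

# A minimal sketch, assuming the file above is saved as full-stack.yaml and
# used with Embedchain; requires `pip install embedchain` and an OPENAI_API_KEY
# in the environment, since both the LLM and the embedder use OpenAI.
from embedchain import App

# Build the app from the config; newer Embedchain releases accept config_path,
# while older ones used yaml_path instead.
app = App.from_config(config_path="full-stack.yaml")

# Ingest a source into the Chroma collection, then ask a question; the
# system_prompt above makes the answer come back in Shakespearean style.
app.add("https://en.wikipedia.org/wiki/William_Shakespeare")
print(app.query("Who wrote Hamlet?"))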