
---
title: '📱 App types'
---
## App Types
Embedchain supports a variety of LLMs, embedding functions/models and vector databases.
Our app gives you full control over which components you want to use; you can mix and match them to your heart's content.
<Tip>
Out of the box, if you just use `app = App()`, Embedchain uses what we believe to be the best configuration available. This might include paid/proprietary components. Currently, these are:
* LLM: OpenAI (gpt-3.5-turbo)
* Embedder: OpenAI (text-embedding-ada-002)
* Database: ChromaDB
</Tip>
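Because the default configuration relies on OpenAI, you will usually need an OpenAI API key in your environment before creating the default app (the key below is just a placeholder):
```python
import os
from embedchain import App

# The default app uses OpenAI for both the LLM and the embedder,
# so the API key has to be available before the app is created.
os.environ["OPENAI_API_KEY"] = "sk-xxxx"

app = App()
```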
### LLM
#### Choosing an LLM
The following LLM providers are supported by Embedchain:
- OPENAI
- ANTHROPIC
- VERTEX_AI
- GPT4ALL
- AZURE_OPENAI
- LLAMA2
- COHERE
You can choose one by importing it from `embedchain.llm`. E.g.:
```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm
app = App(llm=Llama2Llm())
```
#### Configuration
The LLMs can be configured by passing an `LlmConfig` object.
The config options can be found [here](/advanced/query_configuration#llmconfig).
```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm
from embedchain.config import LlmConfig
app = App(llm=Llama2Llm(), llm_config=LlmConfig(number_documents=3, temperature=0))
```
### Embedder
#### Choosing an Embedder
The following providers for embedding functions are supported by Embedchain:
- OPENAI
- HUGGING_FACE
- VERTEX_AI
- GPT4ALL
- AZURE_OPENAI
You can choose one by importing it from `embedchain.embedder`. E.g.:
```python
from embedchain import App
from embedchain.embedder.vertexai import VertexAiEmbedder
app = App(embedder=VertexAiEmbedder())
```
#### Configuration
The embedders can be configured by passing an `EmbedderConfig` object.
```python
from embedchain import App
from embedchain.embedder.openai import OpenAiEmbedder
from embedchain.config import EmbedderConfig
app = App(embedder=OpenAiEmbedder(), embedder_config=EmbedderConfig(model="text-embedding-ada-002"))
```
<Tip>
You can also pass an `LlmConfig` instance directly to the `query` or `chat` method.
This creates a temporary config for that request alone, so you could, for example, use a different model (from the same provider) or get more context documents for a specific query.
</Tip>
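For example, a per-request override could look like the sketch below; the `config` keyword name is an assumption here, so check the `query`/`chat` signature in your version:
```python
from embedchain import App
from embedchain.config import LlmConfig

app = App()

# Temporary config for this request only (keyword name assumed to be `config`);
# the app-level configuration stays untouched.
answer = app.query(
    "What is our refund policy?",
    config=LlmConfig(number_documents=5, temperature=0.5),
)
```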
### Vector Database
#### Choosing a Vector Database
The following vector databases are supported by Embedchain:
- ChromaDB
- Elasticsearch
You can choose one by importing it from `embedchain.vectordb`. E.g.:
```python
from embedchain import App
from embedchain.vectordb.elasticsearch import ElasticsearchDB
app = App(db=ElasticsearchDB())
```
#### Configuration
The vector databases can be configured by passing a specific config object.
These vary greatly between the different vector databases.
```python
from embedchain import App
from embedchain.vectordb.elasticsearch import ElasticsearchDB
from embedchain.config import ElasticsearchDBConfig
app = App(db=ElasticsearchDB(), db_config=ElasticsearchDBConfig())
```
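For instance, you could point the app at a specific Elasticsearch cluster. The `es_url` field is an assumption in this sketch; check the Elasticsearch config reference for the exact options your version exposes:
```python
from embedchain import App
from embedchain.config import ElasticsearchDBConfig
from embedchain.vectordb.elasticsearch import ElasticsearchDB

# Connect to a self-hosted cluster; `es_url` is assumed here and may differ
# from the actual field name in your Embedchain version.
db_config = ElasticsearchDBConfig(es_url="http://localhost:9200")
app = App(db=ElasticsearchDB(), db_config=db_config)
```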
### PersonApp
```python
from embedchain import PersonApp
naval_chat_bot = PersonApp("name_of_person_or_character")  # e.g. "Yoda"
```
- `PersonApp` uses OpenAI's models, which are paid. 💸 You will be charged for embedding model usage and LLM usage.
- `PersonApp` uses OpenAI's embedding model to create embeddings for chunks and the ChatGPT API as the LLM to get answers from the relevant docs. Make sure that you have an OpenAI account and an API key. If you don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
- Once you have the API key, set it in an environment variable called `OPENAI_API_KEY`:
```python
import os
os.environ["OPENAI_API_KEY"] = "sk-xxxx"
```
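With the key set, you can add sources about the person and start asking questions. The URL below is a placeholder, and depending on your Embedchain version `add` may also accept an explicit data type:
```python
# Feed the bot a source about the person, then query it.
naval_chat_bot.add("https://example.com/interview-transcript")  # placeholder URL
print(naval_chat_bot.query("What does he think about reading?"))
```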
### Full Configuration Examples
Embedchain previously offered fully configured classes, namely `App`, `OpenSourceApp`, `CustomApp` and `Llama2App`.
We deprecated these apps. The reason for this decision was that it was hard to switch to a different LLM, embedder or vector database if you one day decided that that's what you wanted to do.
The new app allows drop-in replacements, such as changing `App(llm=OpenAiLlm())` to `App(llm=Llama2Llm())`.
To make the switch to our new, fully configurable app easier, we provide you with full examples of what the old classes would look like implemented as a new app.
You can swap these in, and if you decide you want to try a different model one day, you don't have to rewrite your whole bot.
#### App
An `App` without any configuration still uses the best options available, so you can keep using:
```python
from embedchain import App
app = App()
```
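A typical usage flow with this default app then looks like the sketch below (the source URL is a placeholder):
```python
app.add("https://example.com/docs")  # placeholder source
print(app.query("What do the docs say about pricing?"))

# `chat` works the same way and is meant for multi-turn conversations.
print(app.chat("Summarise the docs in one sentence."))
```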
#### OpenSourceApp
Use this snippet to run an open source app.
```python
from embedchain import App
from embedchain.llm.gpt4all import GPT4ALLLlm
from embedchain.embedder.gpt4all import GPT4AllEmbedder
from embedchain.vectordb.chroma import ChromaDB
app = App(llm=GPT4ALLLlm(), embedder=GPT4AllEmbedder(), db=ChromaDB())
```
#### Llama2App
```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm
app = App(llm=Llama2Llm())
```
#### CustomApp
Every app is a custom app now.
If you were previously using a `CustomApp`, you can now just change it to `App`.
Here's one example of what you could do if we combined everything shown on this page.
```python
from embedchain import App
from embedchain.config import ElasticsearchDBConfig, EmbedderConfig, LlmConfig
from embedchain.embedder.openai import OpenAiEmbedder
from embedchain.llm.llama2 import Llama2Llm
from embedchain.vectordb.elasticsearch import ElasticsearchDB
app = App(
    llm=Llama2Llm(),
    llm_config=LlmConfig(number_documents=3, temperature=0),
    embedder=OpenAiEmbedder(),
    embedder_config=EmbedderConfig(model="text-embedding-ada-002"),
    db=ElasticsearchDB(),
    db_config=ElasticsearchDBConfig(),
)
```
### Compatibility with other apps
- If there is any other app instance in your script or app, you can change the import as follows:
```python
from embedchain import App as EmbedChainApp
from embedchain import PersonApp as EmbedChainPersonApp
# or
from embedchain import App as ECApp
from embedchain import PersonApp as ECPApp
```
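The aliased names can then be used exactly like the regular classes, for example:
```python
from embedchain import App as ECApp

# The alias avoids a name clash with another `App` class in your project.
my_bot = ECApp()
```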