---
title: '📱 App types'
---

## App Types

Embedchain supports a variety of LLMs, embedding functions/models and vector databases.
Our app gives you full control over which components you want to use; you can mix and match them to your heart's content.

<Tip>
Out of the box, if you just use `app = App()`, Embedchain uses what we believe to be the best configuration available. This might include paid/proprietary components. Currently, this is:
* LLM: OpenAI (gpt-3.5-turbo)
* Embedder: OpenAI (text-embedding-ada-002)
* Database: ChromaDB
</Tip>
### LLM

#### Choosing an LLM

The following LLM providers are supported by Embedchain:

- OPENAI
- ANTHROPIC
- VERTEX_AI
- GPT4ALL
- AZURE_OPENAI
- LLAMA2
- JINA
- COHERE

You can choose one by importing it from `embedchain.llm`. E.g.:

```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm

app = App(llm=Llama2Llm())
```
#### Configuration

The LLMs can be configured by passing an `LlmConfig` object.
The config options can be found [here](/advanced/query_configuration#llmconfig).

```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm
from embedchain.config import LlmConfig

app = App(llm=Llama2Llm(), llm_config=LlmConfig(number_documents=3, temperature=0))
```
### Embedder

#### Choosing an Embedder

The following providers for embedding functions are supported by Embedchain:

- OPENAI
- HUGGING_FACE
- VERTEX_AI
- GPT4ALL
- AZURE_OPENAI

You can choose one by importing it from `embedchain.embedder`. E.g.:

```python
from embedchain import App
from embedchain.embedder.vertexai import VertexAiEmbedder

app = App(embedder=VertexAiEmbedder())
```
#### Configuration

The embedders can be configured by passing an `EmbedderConfig` object.

```python
from embedchain import App
from embedchain.embedder.openai import OpenAiEmbedder
from embedchain.config import EmbedderConfig

app = App(embedder=OpenAiEmbedder(), embedder_config=EmbedderConfig(model="text-embedding-ada-002"))
```
<Tip>
You can also pass an `LlmConfig` instance directly to the `query` or `chat` method.
This creates a temporary config for that request alone, so you could, for example, use a different model (from the same provider) or get more context documents for a specific query.
</Tip>
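A minimal sketch of the per-request config described above. The keyword name `config`, the data source URL, and the question are illustrative assumptions; the `LlmConfig` fields are the ones shown earlier on this page.

```python
from embedchain import App
from embedchain.config import LlmConfig

app = App()
app.add("https://example.com/some-article")  # hypothetical data source

# Temporary config for this request only: more context documents,
# deterministic sampling. Later queries use the app-level config again.
answer = app.query(
    "What is the article about?",
    config=LlmConfig(number_documents=5, temperature=0),
)
```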
### Vector Database

#### Choosing a Vector Database

The following vector databases are supported by Embedchain:

- ChromaDB
- Elasticsearch

You can choose one by importing it from `embedchain.vectordb`. E.g.:

```python
from embedchain import App
from embedchain.vectordb.elasticsearch import ElasticsearchDB

app = App(db=ElasticsearchDB())
```
#### Configuration

The vector databases can be configured by passing a specific config object.
These vary greatly between the different vector databases.

```python
from embedchain import App
from embedchain.vectordb.elasticsearch import ElasticsearchDB
from embedchain.config import ElasticsearchDBConfig

app = App(db=ElasticsearchDB(), db_config=ElasticsearchDBConfig())
```
### PersonApp

```python
from embedchain import PersonApp

naval_chat_bot = PersonApp("name_of_person_or_character")  # e.g. "Yoda"
```

- `PersonApp` uses OpenAI's models, which are paid. 💸 You will be charged for both embedding model and LLM usage.
- `PersonApp` uses OpenAI's embedding model to create embeddings for chunks, and the ChatGPT API as the LLM to answer questions given the relevant docs. Make sure that you have an OpenAI account and an API key. If you don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
- Once you have the API key, set it in an environment variable called `OPENAI_API_KEY`:

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-xxxx"
```
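With the key set, a hedged usage sketch follows, assuming `PersonApp` exposes the same `add` and `query` methods as `App`; the URL and question are placeholders.

```python
from embedchain import PersonApp

naval_chat_bot = PersonApp("Naval Ravikant")
naval_chat_bot.add("https://example.com/naval-essay")  # hypothetical source
naval_chat_bot.query("What is your view on reading?")  # answered in persona
```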
### Full Configuration Examples

Embedchain previously offered fully configured classes, namely `App`, `OpenSourceApp`, `CustomApp` and `Llama2App`.
We have deprecated these apps because they made it hard to switch to a different LLM, embedder or vector database if you decided to one day.
The new app allows drop-in replacements, such as changing `App(llm=OpenAiLlm())` to `App(llm=Llama2Llm())`.
To make the switch to our new, fully configurable app easier, we provide full examples of what the old classes look like implemented as a new app.
You can swap these in, and if you decide to try a different model one day, you don't have to rewrite your whole bot.
#### App

`App` without any configuration still uses the best options available, so you can keep using:

```python
from embedchain import App

app = App()
```
#### OpenSourceApp

Use this snippet to run an open source app.

```python
from embedchain import App
from embedchain.llm.gpt4all import GPT4ALLLlm
from embedchain.embedder.gpt4all import GPT4AllEmbedder
from embedchain.vectordb.chroma import ChromaDB

app = App(llm=GPT4ALLLlm(), embedder=GPT4AllEmbedder(), db=ChromaDB())
```
#### Llama2App

```python
from embedchain import App
from embedchain.llm.llama2 import Llama2Llm

app = App(llm=Llama2Llm())
```
#### CustomApp

Every app is a custom app now.
If you were previously using a `CustomApp`, you can now just change it to `App`.
Here's an example of what you could do if we combined everything shown on this page:

```python
from embedchain import App
from embedchain.config import ElasticsearchDBConfig, EmbedderConfig, LlmConfig
from embedchain.embedder.openai import OpenAiEmbedder
from embedchain.llm.llama2 import Llama2Llm
from embedchain.vectordb.elasticsearch import ElasticsearchDB

app = App(
    llm=Llama2Llm(),
    llm_config=LlmConfig(number_documents=3, temperature=0),
    embedder=OpenAiEmbedder(),
    embedder_config=EmbedderConfig(model="text-embedding-ada-002"),
    db=ElasticsearchDB(),
    db_config=ElasticsearchDBConfig(),
)
```
### Compatibility with other apps

If there is another app instance in your script, you can alias the imports to avoid name clashes:

```python
from embedchain import App as EmbedChainApp
from embedchain import PersonApp as EmbedChainPersonApp

# or

from embedchain import App as ECApp
from embedchain import PersonApp as ECPApp
```