
---
title: 🤖 Overview
---

## Overview

Mem0 ships with built-in support for a range of popular large language models. You choose the LLM through the memory configuration, so you can match the model to your latency, cost, and quality needs.

<CardGroup cols={4}>
  <Card title="OpenAI" href="#openai"></Card>
  <Card title="Groq" href="#groq"></Card>
  <Card title="Together" href="#togetherai"></Card>
  <Card title="AWS Bedrock" href="#aws-bedrock"></Card>
  <Card title="Litellm" href="#litellm"></Card>
  <Card title="Google AI" href="#google-ai"></Card>
  <Card title="Anthropic" href="#anthropic"></Card>
  <Card title="Mistral AI" href="#mistral-ai"></Card>
</CardGroup>

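Every provider below uses the same nested config shape — only the `provider` name and the inner `config` values change. As a minimal sketch, the hypothetical `make_llm_config` helper (not part of mem0) shows the structure that `Memory.from_config` expects:

```python
def make_llm_config(provider: str, model: str,
                    temperature: float = 0.2, max_tokens: int = 1500) -> dict:
    """Build the nested LLM config dict shared by all providers below."""
    return {
        "llm": {
            "provider": provider,        # e.g. "openai", "groq", "litellm"
            "config": {
                "model": model,          # provider-specific model identifier
                "temperature": temperature,
                "max_tokens": max_tokens,
            }
        }
    }

# The OpenAI example below is equivalent to:
config = make_llm_config("openai", "gpt-4o")
```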
## OpenAI

To use OpenAI models, set the `OPENAI_API_KEY` environment variable. You can obtain an API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

Once you have the key, use it like this:

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## Groq

[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), which delivers exceptionally fast inference for AI workloads on its LPU Inference Engine.

To use Groq-hosted models, get an API key from the [Groq console](https://console.groq.com/keys) and set it as the `GROQ_API_KEY` environment variable:

```python
import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## TogetherAI

To use TogetherAI models, set the `TOGETHER_API_KEY` environment variable. You can obtain the key from the TogetherAI [account settings page](https://api.together.xyz/settings/api-keys).

Once you have the key, use it like this:

```python
import os
from mem0 import Memory

os.environ["TOGETHER_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "togetherai",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## AWS Bedrock

### Setup

- Before using the AWS Bedrock LLM, make sure you have access to the desired model in the [Bedrock console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
- Export the `AWS_REGION`, `AWS_ACCESS_KEY`, and `AWS_SECRET_ACCESS_KEY` environment variables.

```python
import os
from mem0 import Memory

os.environ["AWS_REGION"] = "us-east-1"
os.environ["AWS_ACCESS_KEY"] = "xx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "xx"

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## Litellm

[Litellm](https://litellm.vercel.app/docs/) is compatible with over 100 large language models (LLMs), all through a standardized input/output format. Explore the [available models](https://litellm.vercel.app/docs/providers) to use with Litellm, and make sure to set the API key environment variable for whichever model you choose.

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gpt-3.5-turbo",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## Google AI

To use a Google AI model, set the `GEMINI_API_KEY` environment variable. You can obtain the API key from [Google AI Studio](https://makersuite.google.com/app/apikey).

Once you have the key, use it like this:

```python
import os
from mem0 import Memory

os.environ["GEMINI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gemini/gemini-pro",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## Anthropic

To use Anthropic's models, set the `ANTHROPIC_API_KEY` environment variable. You can find your key on the [account settings page](https://console.anthropic.com/account/keys).

```python
import os
from mem0 import Memory

os.environ["ANTHROPIC_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "claude-3-opus-20240229",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## Mistral AI

To use Mistral AI's models, obtain an API key from the [Mistral console](https://console.mistral.ai/) and set it as the `MISTRAL_API_KEY` environment variable:

```python
import os
from mem0 import Memory

os.environ["MISTRAL_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "open-mixtral-8x7b",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

## OpenAI Azure

To use Azure AI models, set the `AZURE_API_KEY`, `AZURE_API_BASE`, and `AZURE_API_VERSION` environment variables. You can obtain the API key from the [Azure portal](https://azure.microsoft.com/).

```python
import os
from mem0 import Memory

os.environ["AZURE_API_KEY"] = "your-api-key"

# Needed to use custom models
os.environ["AZURE_API_BASE"] = "your-api-base-url"
os.environ["AZURE_API_VERSION"] = "version-to-use"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "azure_ai/command-r-plus",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```