
---
title: 🤖 Overview
---

## Overview

Mem0 includes built-in support for several popular large language models. Memory uses the LLM you configure, so you can pick the model best suited to your needs.

<CardGroup cols={4}>
  <Card title="OpenAI" href="#openai"></Card>
  <Card title="Groq" href="#groq"></Card>
  <Card title="Together" href="#togetherai"></Card>
  <Card title="AWS Bedrock" href="#aws-bedrock"></Card>
  <Card title="Litellm" href="#litellm"></Card>
</CardGroup>
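All of the providers below share the same configuration shape; only the `provider` name and the model-specific fields change. As an illustrative sketch (the `make_llm_config` helper is ours, not part of the mem0 API):

```python
# Hypothetical helper showing the config shape shared by every provider
# section below; not part of the mem0 library itself.
def make_llm_config(provider, model, temperature=0.2, max_tokens=1500):
    return {
        "llm": {
            "provider": provider,
            "config": {
                "model": model,
                "temperature": temperature,
                "max_tokens": max_tokens,
            }
        }
    }

# e.g. make_llm_config("groq", "mixtral-8x7b-32768") mirrors the Groq example.
```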
## OpenAI

To use OpenAI's LLM models, set the `OPENAI_API_KEY` environment variable. You can obtain an OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

Once you have the key, use it like this:

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "xxx"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Groq

[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), which delivers exceptional speed for AI workloads running on its LPU Inference Engine.

To use LLMs from Groq, get an API key from their [platform](https://console.groq.com/keys) and set it as the `GROQ_API_KEY` environment variable, as in the example below.

```python
import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "xxx"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## TogetherAI

To use TogetherAI's LLM models, set the `TOGETHER_API_KEY` environment variable. You can obtain a TogetherAI API key from their [Account settings page](https://api.together.xyz/settings/api-keys).

Once you have the key, use it like this:

```python
import os
from mem0 import Memory

os.environ["TOGETHER_API_KEY"] = "xxx"

config = {
    "llm": {
        "provider": "togetherai",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## AWS Bedrock

### Setup

- Before using the AWS Bedrock LLM, make sure you have access to the appropriate model in the [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods described in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
- Export `AWS_REGION`, `AWS_ACCESS_KEY`, and `AWS_SECRET_ACCESS_KEY` as environment variables.

```python
import os
from mem0 import Memory

os.environ["AWS_REGION"] = "us-east-1"
os.environ["AWS_ACCESS_KEY"] = "xx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "xx"

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
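Since Bedrock needs all three environment variables set before the client can authenticate, it can help to fail fast when one is missing. A minimal sketch (the `missing_aws_vars` helper is ours, not part of mem0 or boto3):

```python
import os

def missing_aws_vars(env=None):
    """Return the names of the required Bedrock environment variables
    (as listed in the setup steps above) that are unset or empty.
    Illustrative helper only, not part of the mem0 API."""
    if env is None:
        env = os.environ
    required = ("AWS_REGION", "AWS_ACCESS_KEY", "AWS_SECRET_ACCESS_KEY")
    return [name for name in required if not env.get(name)]
```

Calling `missing_aws_vars()` before `Memory.from_config(config)` lets you raise a clear error instead of a less obvious authentication failure from `boto3`.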
## Litellm

[Litellm](https://litellm.vercel.app/docs/) is compatible with over 100 large language models (LLMs), all using a standardized input/output format. You can explore the [available models](https://litellm.vercel.app/docs/providers) to use with Litellm. Make sure to set the appropriate API key environment variable for the model you choose (for example, `OPENAI_API_KEY` for OpenAI models).

```python
import os
from mem0 import Memory

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gpt-3.5-turbo",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```