---
title: 🤖 Overview
---

## Overview

Mem0 includes built-in support for several popular large language models (LLMs). Memory can use the LLM you provide, letting you pick the model that best fits your needs.
<CardGroup cols={4}>
  <Card title="OpenAI" href="#openai"></Card>
  <Card title="Groq" href="#groq"></Card>
  <Card title="Together" href="#together"></Card>
  <Card title="AWS Bedrock" href="#aws-bedrock"></Card>
  <Card title="LiteLLM" href="#litellm"></Card>
  <Card title="Google AI" href="#google-ai"></Card>
  <Card title="Anthropic" href="#anthropic"></Card>
  <Card title="Mistral AI" href="#mistral-ai"></Card>
  <Card title="OpenAI Azure" href="#openai-azure"></Card>
</CardGroup>
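All of the providers below share the same configuration shape: an `llm` block with a `provider` name and a provider-specific `config` holding the model name and generation parameters. As an illustrative sketch, a small helper can build this dict for any provider (`make_llm_config` is a hypothetical helper written for this example, not part of Mem0):

```python
# `make_llm_config` is a hypothetical helper, not part of the Mem0 library.
# It only illustrates the config shape that Memory.from_config expects.

def make_llm_config(provider: str, model: str,
                    temperature: float = 0.2, max_tokens: int = 1500) -> dict:
    """Build the nested config dict shared by all LLM providers below."""
    return {
        "llm": {
            "provider": provider,
            "config": {
                "model": model,
                "temperature": temperature,
                "max_tokens": max_tokens,
            }
        }
    }

# Example: the same helper covers different providers.
openai_config = make_llm_config("openai", "gpt-4o")
groq_config = make_llm_config("groq", "mixtral-8x7b-32768",
                              temperature=0.1, max_tokens=1000)
```

Only the `provider` string and the provider-specific `config` values change between sections; everything else (passing the dict to `Memory.from_config` and calling `m.add`) stays the same.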
## OpenAI

To use OpenAI models, set the `OPENAI_API_KEY` environment variable. You can obtain the key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

Once you have obtained the key, you can use it like this:
```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Groq

[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), which delivers exceptional speed for AI workloads running on its LPU Inference Engine.

To use LLMs from Groq, go to their [platform](https://console.groq.com/keys) and get an API key. Set it as the `GROQ_API_KEY` environment variable to use the model as shown in the example below.
```python
import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## TogetherAI

To use TogetherAI models, set the `TOGETHER_API_KEY` environment variable. You can obtain the key from their [Account settings page](https://api.together.xyz/settings/api-keys).

Once you have obtained the key, you can use it like this:
```python
import os
from mem0 import Memory

os.environ["TOGETHER_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "togetherai",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## AWS Bedrock

### Setup

- Before using the AWS Bedrock LLM, make sure you have the appropriate model access from the [Bedrock console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
- Export `AWS_REGION`, `AWS_ACCESS_KEY`, and `AWS_SECRET_ACCESS_KEY` as environment variables.
```python
import os
from mem0 import Memory

os.environ["AWS_REGION"] = "us-east-1"
os.environ["AWS_ACCESS_KEY"] = "xx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "xx"

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## LiteLLM

[LiteLLM](https://litellm.vercel.app/docs/) is compatible with over 100 large language models (LLMs), all using a standardized input/output format. You can explore the [available models](https://litellm.vercel.app/docs/providers) to use with LiteLLM. Make sure to set the API key environment variable for the model you choose to use.
```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gpt-3.5-turbo",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Google AI

To use Google AI models via LiteLLM, set the `GEMINI_API_KEY` environment variable. You can obtain the API key from [Google AI Studio](https://makersuite.google.com/app/apikey).

Once you have obtained the key, you can use it like this:
```python
import os
from mem0 import Memory

os.environ["GEMINI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gemini/gemini-pro",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Anthropic

To use Anthropic's models, set the `ANTHROPIC_API_KEY` environment variable, which you can find on their [Account Settings page](https://console.anthropic.com/account/keys).
```python
import os
from mem0 import Memory

os.environ["ANTHROPIC_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "claude-3-opus-20240229",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Mistral AI

To use Mistral AI's models, obtain an API key from their [console](https://console.mistral.ai/). Set the `MISTRAL_API_KEY` environment variable to use the model as shown in the example below.
```python
import os
from mem0 import Memory

os.environ["MISTRAL_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "open-mixtral-8x7b",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## OpenAI Azure

To use Azure AI models, set the `AZURE_API_KEY`, `AZURE_API_BASE`, and `AZURE_API_VERSION` environment variables. You can obtain the API key from the [Azure portal](https://azure.microsoft.com/).
```python
import os
from mem0 import Memory

os.environ["AZURE_API_KEY"] = "your-api-key"

# Needed to use custom models
os.environ["AZURE_API_BASE"] = "your-api-base-url"
os.environ["AZURE_API_VERSION"] = "version-to-use"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "azure_ai/command-r-plus",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```