Commit a06c9a9
Doc changes and Storage fix (#2181)

Dev-Khant authored Jan 30, 2025
1 parent 63fbd2d
Showing 10 changed files with 271 additions and 282 deletions.
194 changes: 61 additions & 133 deletions README.md
@@ -45,161 +45,103 @@

[Mem0](https://mem0.ai) (pronounced as "mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. Mem0 remembers user preferences, adapts to individual needs, and continuously improves over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.

<!-- Start of Selection -->
<p style="display: flex;">
<span style="font-size: 1.2em;">New Feature: Introducing Graph Memory. Check out our <a href="https://docs.mem0.ai/open-source/graph-memory" target="_blank">documentation</a>.</span>
</p>
<!-- End of Selection -->


### Core Features

- **Multi-Level Memory**: User, Session, and AI Agent memory retention
- **Adaptive Personalization**: Continuous improvement based on interactions
- **Developer-Friendly API**: Simple integration into various applications
- **Cross-Platform Consistency**: Uniform behavior across devices
- **Managed Service**: Hassle-free hosted solution

### How does Mem0 work?

Mem0 leverages a hybrid database approach to manage and retrieve long-term memories for AI agents and assistants. Each memory is associated with a unique identifier, such as a user ID or agent ID, allowing Mem0 to organize and access memories specific to an individual or context.

When a message is added to Mem0 using the add() method, the system extracts relevant facts and preferences and stores them across data stores: a vector database, a key-value database, and a graph database. This hybrid approach ensures that different types of information are stored in the most efficient manner, making subsequent searches quick and effective.

When an AI agent or LLM needs to recall memories, it uses the search() method. Mem0 then performs a search across these data stores, retrieving relevant information from each source. The results are then passed through a scoring layer, which evaluates them based on relevance, importance, and recency. This ensures that only the most personalized and useful context is surfaced.

The retrieved memories can then be appended to the LLM's prompt as needed, enhancing the personalization and relevance of its responses.
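
A minimal, illustrative sketch of this add-and-search flow with the open-source `Memory` class (the example data and the exact shape of the returned results are assumptions and may vary between versions):

```python
from mem0 import Memory

m = Memory()

# add() extracts facts from the conversation and writes them to the underlying stores
m.add(
    [{"role": "user", "content": "I'm vegetarian and I love hiking on weekends."}],
    user_id="alice",
)

# search() retrieves and scores the memories most relevant to the query
results = m.search("What should I cook for Alice?", user_id="alice")
for entry in results:
    print(entry["memory"])  # e.g. "Is vegetarian" (illustrative output)
```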

### Features & Use Cases

Core Capabilities:
- **Multi-Level Memory**: User, Session, and AI Agent memory retention with adaptive personalization
- **Developer-Friendly**: Simple API integration, cross-platform consistency, and hassle-free managed service

Applications:
- **AI Assistants**: Seamless conversations with context and personalization
- **Learning & Support**: Tailored content recommendations and context-aware customer assistance
- **Healthcare & Companions**: Patient history tracking and deeper relationship building
- **Productivity & Gaming**: Streamlined workflows and adaptive environments based on user behavior

### Use Cases

Mem0 empowers organizations and individuals to enhance:

- **AI Assistants and agents**: Seamless conversations with a touch of déjà vu
- **Personalized Learning**: Tailored content recommendations and progress tracking
- **Customer Support**: Context-aware assistance with user preference memory
- **Healthcare**: Patient history and treatment plan management
- **Virtual Companions**: Deeper user relationships through conversation memory
- **Productivity**: Streamlined workflows based on user habits and task history
- **Gaming**: Adaptive environments reflecting player choices and progress

## Get Started

The easiest way to set up Mem0 is through the managed [Mem0 Platform](https://app.mem0.ai). This hosted solution offers automatic updates, advanced analytics, and dedicated support. [Sign up](https://app.mem0.ai) to get started.
Get started quickly with the [Mem0 Platform](https://app.mem0.ai), our fully managed solution that provides automatic updates, advanced analytics, enterprise security, and dedicated support. [Create a free account](https://app.mem0.ai) to begin.

If you prefer to self-host, use the open-source Mem0 package. Follow the [installation instructions](#install) to get started.
For complete control, you can self-host Mem0 using our open-source package. See the [Quickstart guide](#quickstart) below to set up your own instance.

## Installation Instructions <a name="install"></a>
## Quickstart Guide <a name="quickstart"></a>

Install the Mem0 package via pip:

```bash
pip install mem0ai
```

Alternatively, you can use Mem0 with one click on the hosted platform [here](https://app.mem0.ai/).

### Basic Usage

Mem0 requires an LLM to function, with `gpt-4o` from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our [Supported LLMs documentation](https://docs.mem0.ai/llms).
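
If you want to use a different model, you can pass a configuration when creating the memory instance (an optional step; the quickstart below just uses the defaults). The sketch follows the provider/config layout used later in this README for the graph store; the specific provider name and fields shown here are assumptions, so check the Supported LLMs documentation for the authoritative options.

```python
from mem0 import Memory

# Illustrative config: swap the default gpt-4o for another supported model
# (provider name and fields are assumptions; see the Supported LLMs docs)
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",
            "temperature": 0.1,
        }
    }
}

m = Memory.from_config(config_dict=config)
```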

The first step is to instantiate the memory:

```python
from openai import OpenAI
from mem0 import Memory

m = Memory()  # instance referenced by the operation examples further below
openai_client = OpenAI()
mem0 = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = mem0.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories)

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    mem0.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
```

<details>
<summary>How to set OPENAI_API_KEY</summary>

```python
import os
os.environ["OPENAI_API_KEY"] = "sk-xxx"
```
</details>


You can perform the following tasks on the memory:

1. Add: Store a memory from any unstructured text
2. Update: Update memory of a given memory_id
3. Search: Fetch memories based on a query
4. Get: Return memories for a certain user/agent/session
5. History: Describe how a memory has changed over time for a specific memory ID

```python
# 1. Add: Store a memory from any unstructured text
result = m.add("I am working on improving my tennis skills. Suggest some online courses.", user_id="alice", metadata={"category": "hobbies"})

# Created memory --> 'Improving her tennis skills.' and 'Looking for online suggestions.'
```

```python
# 2. Update: update the memory
result = m.update(memory_id=<memory_id_1>, data="Likes to play tennis on weekends")

# Updated memory --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
```

```python
# 3. Search: search related memories
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")

# Retrieved memory --> 'Likes to play tennis on weekends'
```

```python
# 4. Get all memories
all_memories = m.get_all()
memory_id = all_memories["memories"][0]["id"]  # get a memory_id

# All memory items --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
```

```python
# 5. Get memory history for a particular memory_id
history = m.history(memory_id=<memory_id_1>)

# Logs corresponding to memory_id_1 --> {'prev_value': 'Working on improving tennis skills and interested in online courses for tennis.', 'new_value': 'Likes to play tennis on weekends' }
```
For more advanced usage and API documentation, visit our [documentation](https://docs.mem0.ai).

> [!TIP]
> If you prefer a hosted version without the need to set up infrastructure yourself, check out the [Mem0 Platform](https://app.mem0.ai/) to get started in minutes.
> For a hassle-free experience, try our [hosted platform](https://app.mem0.ai) with automatic updates and enterprise features.

### Graph Memory

To initialize Graph Memory you'll need to set up your configuration with graph store providers.
Currently, we support Neo4j as a graph store provider. You can set up [Neo4j](https://neo4j.com/) locally or use the hosted [Neo4j AuraDB](https://neo4j.com/product/auradb/).
You also need to set the version to `v1.1` (*prior versions are not supported*).
Here's how you can do it:

```python
from mem0 import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx"
        }
    },
    "version": "v1.1"
}

m = Memory.from_config(config_dict=config)
```
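
Once configured, the graph-enabled instance is used through the same add() and search() calls shown earlier; the snippet below is only a sketch with made-up example data.

```python
# With Graph Memory enabled, add() also extracts entities and relationships
# into the graph store (the facts below are illustrative)
m.add("Alice's best friend is Bob, and Bob works at a bakery.", user_id="alice")

# search() can now draw on those relationships as well as vector similarity
results = m.search("Who does Alice know?", user_id="alice")
```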

## Demos

- AI Companion: Experience personalized conversations with an AI that remembers your preferences and past interactions

![AI Companion Demo](https://github.com/user-attachments/assets/46e60f82-682f-4157-a8de-215193a04baa)

<br/><br/>

- Enhance your AI interactions by storing memories across ChatGPT, Perplexity, and Claude using our browser extension.

![Chrome Extension Demo](https://github.com/user-attachments/assets/b170d458-c020-47f7-9f1c-78211200ad2c)

## Documentation

For detailed usage instructions and API reference, visit our documentation at [docs.mem0.ai](https://docs.mem0.ai). There you'll find:
- Complete API reference
- Integration guides
- Advanced configuration options
- Best practices and examples
- More details about the open-source version and the hosted [Mem0 Platform](https://app.mem0.ai)

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=mem0ai/mem0&type=Date)](https://star-history.com/#mem0ai/mem0&Date)

## Support

@@ -209,20 +151,6 @@ Join our community for support and discussions. If you have any questions, feel
- [Follow us on Twitter](https://x.com/mem0ai)
- [Email founders](mailto:[email protected])

## Contributors

Join our [Discord community](https://mem0.dev/DiG) to learn about memory management for AI agents and LLMs, and connect with Mem0 users and contributors. Share your ideas, questions, or feedback in our [GitHub Issues](https://github.com/mem0ai/mem0/issues).

We value and appreciate the contributions of our community. Special thanks to our contributors for helping us improve Mem0.

<a href="https://github.com/mem0ai/mem0/graphs/contributors">
<img src="https://contrib.rocks/image?repo=mem0ai/mem0" />
</a>

## Anonymous Telemetry

We collect anonymous usage metrics to enhance our package's quality and user experience. This includes data like feature usage frequency and system info, but never personal details. The data helps us prioritize improvements and ensure compatibility. If you wish to opt out, set the environment variable `MEM0_TELEMETRY=false`. We prioritize data security and don't share this data externally.
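
For example, you can set the variable before mem0 is imported (this sketch assumes the setting is read when mem0 initializes):

```python
import os

# Disable anonymous telemetry; assumes mem0 reads this variable at import/initialization time
os.environ["MEM0_TELEMETRY"] = "false"
```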

## License

This project is licensed under the Apache 2.0 License - see the [LICENSE](LICENSE) file for details.
File renamed without changes.
7 changes: 3 additions & 4 deletions docs/mint.json
@@ -53,7 +53,6 @@
"pages": [
"overview",
"quickstart",
"playground",
"features"
]
},
@@ -65,8 +64,7 @@
{
"group": "Features",
"pages": ["features/selective-memory", "features/custom-categories", "features/custom-instructions", "features/direct-import", "features/async-client", "features/memory-export"]
},
"features/langchain-tools"
}
]
},
{
@@ -221,7 +219,8 @@
"integrations/autogen",
"integrations/langchain",
"integrations/langgraph",
"integrations/llama-index"
"integrations/llama-index",
"integrations/langchain-tools"
]
},
{
2 changes: 1 addition & 1 deletion docs/overview.mdx
@@ -18,7 +18,7 @@ Mem0 offers two powerful ways to leverage our technology: our [managed platform]
<Card title="Playground" icon="play" href="playground">
Mem0 in action
</Card>
<Card title="Examples" icon="lightbulb" href="/open-source/quickstart">
<Card title="Examples" icon="lightbulb" href="/examples">
See what you can build with Mem0
</Card>
</CardGroup>
2 changes: 1 addition & 1 deletion docs/platform/overview.mdx
@@ -27,6 +27,6 @@ Check out our [Platform Guide](/platform/guide) to start using Mem0 platform qui
## Next Steps

- Sign up to the [Mem0 Platform](https://mem0.dev/pd)
- Join our [Discord](https://mem0.dev/Did) or [Slack](https://mem0.ai/slack) with other developers and get support.
- Join our [Discord](https://mem0.dev/Did) or [Slack](https://mem0.dev/slack) with other developers and get support.

We're excited to see what you'll build with Mem0 Platform. Let's create smarter, more personalized AI experiences together!
24 changes: 21 additions & 3 deletions docs/platform/quickstart.mdx
@@ -27,8 +27,12 @@ npm install mem0ai

<CodeGroup>
```python Python
import os
from mem0 import MemoryClient
client = MemoryClient(api_key="your-api-key")

os.environ["MEM0_API_KEY"] = "your-api-key"

client = MemoryClient()
```

```javascript JavaScript
@@ -43,9 +47,12 @@ const client = new MemoryClient({ apiKey: 'your-api-key' });
For asynchronous operations in Python, you can use the AsyncMemoryClient:

```python Python
import os
from mem0 import AsyncMemoryClient

client = AsyncMemoryClient(api_key="your-api-key")
os.environ["MEM0_API_KEY"] = "your-api-key"

client = AsyncMemoryClient()


async def main():
@@ -1641,7 +1648,18 @@ curl -X POST "https://api.mem0.ai/v1/memories/" \
```
```json Output
{'message': 'ok'}
[{'id': '3f4eccba-3b09-497a-81ab-cca1ababb36b',
  'memory': 'Is allergic to nuts',
  'event': 'DELETE'},
 {'id': 'f5dcfbf4-5f0b-422a-8ad4-cadb9e941e25',
  'memory': 'Is a vegetarian',
  'event': 'DELETE'},
 {'id': 'dd32f70c-fa69-4fc7-997b-fb4a66d1a0fa',
  'memory': 'Name is Alex',
  'event': 'DELETE'}]
{'message': 'ok'}
{'id': '3f4eccba-3b09-497a-81ab-cca1ababb36b',
 'memory': 'Likes Chicken',
 'event': 'DELETE'}
```
</CodeGroup>
25 changes: 0 additions & 25 deletions docs/playground.mdx

This file was deleted.

