
feat: Add private ai example (#1101)

Taranjeet Singh, 1 year ago
parent commit 2f6ba642c7

+ 26 - 0
examples/private-ai/README.md

@@ -0,0 +1,26 @@
+# Private AI
+
+In this example, we will create a private AI using embedchain.
+
+Private AI is useful when you want to chat with your data without spending money on API calls, while keeping your data entirely on your own machine.
+
+## How to install
+
+First, create a virtual environment and install the requirements by running:
+
+```bash
+pip install -r requirements.txt
+```
+
+## How to use
+
+* Open the `privateai.py` file and change the `app.add` line to point to your directory or data source.
+* If you want to add any other data type, you can browse the supported data types [here](https://docs.embedchain.ai/components/data-sources/overview).
+
+* Then run the file with:
+
+```bash
+python privateai.py
+```
+
+* You can now ask questions about your data at the prompt.

+ 10 - 0
examples/private-ai/config.yaml

@@ -0,0 +1,10 @@
+llm:
+  provider: gpt4all
+  config:
+    model: 'orca-mini-3b-gguf2-q4_0.gguf'
+    max_tokens: 1000
+    top_p: 1
+embedder:
+  provider: huggingface
+  config:
+    model: 'sentence-transformers/all-MiniLM-L6-v2'

+ 15 - 0
examples/private-ai/privateai.py

@@ -0,0 +1,15 @@
+from embedchain import App
+
+app = App.from_config("config.yaml")
+app.add("/path/to/your/folder", data_type="directory")
+
+while True:
+    user_input = input("Enter your question (type 'exit' to quit): ")
+
+    # Break the loop if the user types 'exit'
+    if user_input.lower() == 'exit':
+        break
+
+    # Process the input and provide a response
+    response = app.chat(user_input)
+    print(response)
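The loop in `privateai.py` reads questions until the user types 'exit'. The same REPL pattern can be sketched independently of embedchain as a minimal, testable function; `chat_loop` and its `answer` parameter are hypothetical names introduced here for illustration (in the script above, `answer` would be `app.chat`):

```python
def chat_loop(answer, read=input, write=print):
    """Read questions until the user types 'exit', answering each one.

    `answer` is any callable mapping a question string to a reply.
    `read` and `write` default to the console but can be swapped out.
    """
    while True:
        user_input = read("Enter your question (type 'exit' to quit): ")

        # Break the loop if the user types 'exit'
        if user_input.lower() == "exit":
            break

        # Answer the question and show the response
        write(answer(user_input))
```

Making `read` and `write` injectable keeps the loop logic identical to the script while allowing it to be exercised without a terminal.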

+ 1 - 0
examples/private-ai/requirements.txt

@@ -0,0 +1 @@
+embedchain[opensource]