
Merge pull request #66 from nanobrowser/bugfix

Bugfix
Ashu, 4 months ago
Parent
Current commit
a74817ebd5
3 changed files with 31 additions and 20 deletions
  1. README.md (+25, -18)
  2. chrome-extension/src/background/agent/helper.ts (+5, -1)
  3. pages/options/src/components/ModelSettings.tsx (+1, -1)

+ 25 - 18
README.md

@@ -15,7 +15,7 @@
 
 Nanobrowser is an open-source AI web automation tool that runs in your browser. A free alternative to OpenAI Operator with flexible LLM options and multi-agent system.
 
-⬇️ Get [Nanobrowser](https://github.com/nanobrowser/nanobrowser/releases) for free
+⬇️ Get [Nanobrowser from Chrome Web Store](https://chromewebstore.google.com/detail/nanobrowser/imbddededgmcgfhfpcjmijokokekbkal) for free
 
 👏 Join the community in [Discord](https://discord.gg/NN3ABHggMK) | [X](https://x.com/nanobrowser_ai)
 
@@ -50,6 +50,21 @@ Looking for a powerful AI web agent without the $200/month price tag of OpenAI O
 
 ## 🚀 Quick Start
 
+1. **Install from Chrome Web Store**:
+   * Visit the [Nanobrowser Chrome Web Store page](https://chromewebstore.google.com/detail/nanobrowser/imbddededgmcgfhfpcjmijokokekbkal)
+   * Click "Add to Chrome" button
+   * Confirm the installation when prompted
+
+2. **Configure Agent Models**:
+   * Click the Nanobrowser icon in your toolbar to open the sidebar
+   * Click the `Settings` icon (top right)
+   * Add your LLM API keys
+   * Choose which model to use for different agents (Navigator, Planner, Validator)
+
+## 🔧 Manually Install
+
+If you prefer to manually install the development version from GitHub:
+
 1. **Download**
     * Download the latest `nanobrowser.zip` file from the official Github [release page](https://github.com/nanobrowser/nanobrowser/releases).
 
@@ -61,23 +76,15 @@ Looking for a powerful AI web agent without the $200/month price tag of OpenAI O
     * Select the unzipped `nanobrowser` folder.
 
 3. **Configure Agent Models**
-    *   Click the Nanobrowser icon in your toolbar to open the sidebar
-    *   Click the `Settings` icon (top right).
-    *   Add your LLM API keys.
-    *   Choose which model to use for different agents (Navigator, Planner, Validator)
-
-## 🔄 Upgrading
-
-1. **Download**:
-    * Download the latest `nanobrowser.zip` file from the official Github [release page](https://github.com/nanobrowser/nanobrowser/releases).
-
-2. **Replace**:
-    * Unzip `nanobrowser.zip`.
-    * Replace your existing Nanobrowser files with the new ones.
+    * Click the Nanobrowser icon in your toolbar to open the sidebar
+    * Click the `Settings` icon (top right).
+    * Add your LLM API keys.
+    * Choose which model to use for different agents (Navigator, Planner, Validator)
 
-3. **Refresh**:
-    * Go to `chrome://extensions/` in Chrome.
-    * Click the refresh icon on the Nanobrowser card.
+4. **Upgrading**:
+    * Download the latest `nanobrowser.zip` file from the release page.
+    * Unzip and replace your existing Nanobrowser files with the new ones.
+    * Go to `chrome://extensions/` in Chrome and click the refresh icon on the Nanobrowser card.
 
 ## 🛠️ Build from Source
 
@@ -105,7 +112,7 @@ If you prefer to build Nanobrowser yourself, follow these steps:
 
 5. **Load the Extension**:
    * The built extension will be in the `dist` directory
-   * Follow the installation steps from the Quick Start section to load the extension into your browser
+   * Follow the installation steps from the Manually Install section to load the extension into your browser
 
 6. **Development Mode** (optional):
    ```bash

+ 5 - 1
chrome-extension/src/background/agent/helper.ts

@@ -117,7 +117,11 @@ export function createChatModel(providerConfig: ProviderConfig, modelConfig: Mod
         topP,
         temperature,
         maxTokens,
-        numCtx: 128000,
+        // Ollama usually has a very small default context window, so we need to set a large value for the agent to work.
+        // It was set to 128000 originally, but that causes Ollama to reload models frequently when multiple models run together.
+        // Not sure why, but setting it to 64000 seems to work fine.
+        // TODO: make the context window size configurable in the model config
+        numCtx: 64000,
       };
       return new ChatOllama(args);
     }
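
A minimal sketch of how the TODO above might be addressed, assuming a hypothetical `contextWindow` field on the model config; the actual `ModelConfig` shape in the repository may differ:

```typescript
// Sketch only: the ModelConfig shape and the `contextWindow` field are
// assumptions for illustration, not the project's actual types.
interface ModelConfig {
  modelName: string;
  contextWindow?: number; // hypothetical per-model override
}

// Default matches the value hard-coded in helper.ts after this change.
const DEFAULT_OLLAMA_NUM_CTX = 64000;

// Resolve numCtx from the model config, falling back to the default.
function resolveNumCtx(modelConfig: ModelConfig): number {
  return modelConfig.contextWindow ?? DEFAULT_OLLAMA_NUM_CTX;
}

// Usage in createChatModel would then be: numCtx: resolveNumCtx(modelConfig)
```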

+ 1 - 1
pages/options/src/components/ModelSettings.tsx

@@ -530,7 +530,7 @@ export const ModelSettings = ({ isDarkMode = false }: ModelSettingsProps) => {
           <select
             id={`${agentName}-model`}
             className={`flex-1 rounded-md border text-sm ${isDarkMode ? 'border-slate-600 bg-slate-700 text-gray-200' : 'border-gray-300 bg-white text-gray-700'} px-3 py-2`}
-            disabled={availableModels.length <= 1}
+            disabled={availableModels.length === 0}
             value={
               selectedModels[agentName]
                 ? `${getProviderForModel(selectedModels[agentName])}>${selectedModels[agentName]}`
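
For context on this one-line change: with the old `availableModels.length <= 1` check, the model dropdown stayed disabled even when a provider exposed exactly one model, so that model could never be selected. A minimal sketch of the two predicates (plain TypeScript, not the component itself):

```typescript
// Old vs. new disabled checks for the agent model <select>.
const isDisabledOld = (availableModels: string[]): boolean => availableModels.length <= 1;
const isDisabledNew = (availableModels: string[]): boolean => availableModels.length === 0;

console.log(isDisabledOld(['llama3'])); // true  -> a single model could not be selected
console.log(isDisabledNew(['llama3'])); // false -> the dropdown is usable with one model
console.log(isDisabledNew([]));         // true  -> still disabled when there is nothing to pick
```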