
docs: fix typos (#548)

omahs, 1 year ago
parent · commit 60d5daaaf5
4 changed files with 8 additions and 8 deletions
  1. docs/advanced/app_types.mdx (+3 -3)
  2. docs/advanced/data_types.mdx (+2 -2)
  3. docs/advanced/interface_types.mdx (+2 -2)
  4. docs/advanced/vector_database.mdx (+1 -1)

+ 3 - 3
docs/advanced/app_types.mdx

@@ -14,7 +14,7 @@ app = App()
 ```
 
 - `App` uses OpenAI's model, so these are paid models. 💸 You will be charged for embedding model usage and LLM usage.
-- `App` uses OpenAI's embedding model to create embeddings for chunks and ChatGPT API as LLM to get answer given the relevant docs. Make sure that you have an OpenAI account and an API key. If you have don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
+- `App` uses OpenAI's embedding model to create embeddings for chunks and ChatGPT API as LLM to get answer given the relevant docs. Make sure that you have an OpenAI account and an API key. If you don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
 - `App` is opinionated. It uses the best embedding model and LLM on the market.
 - Once you have the API key, set it in an environment variable called `OPENAI_API_KEY`
 
@@ -49,7 +49,7 @@ zuck_bot.query("Who owns the new threads app and when it was founded?")
 ```
 
 - `Llama2App` uses Replicate's LLM model, so these are paid models. You can get the `REPLICATE_API_TOKEN` by registering on [their website](https://replicate.com/account).
-- `Llama2App` uses OpenAI's embedding model to create embeddings for chunks. Make sure that you have an OpenAI account and an API key. If you have don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
+- `Llama2App` uses OpenAI's embedding model to create embeddings for chunks. Make sure that you have an OpenAI account and an API key. If you don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
 
 
 ### OpenSourceApp
@@ -103,7 +103,7 @@ naval_chat_bot = PersonApp("name_of_person_or_character") #Like "Yoda"
 ```
 
 - `PersonApp` uses OpenAI's model, so these are paid models. 💸 You will be charged for embedding model usage and LLM usage.
-- `PersonApp` uses OpenAI's embedding model to create embeddings for chunks and ChatGPT API as LLM to get answer given the relevant docs. Make sure that you have an OpenAI account and an API key. If you have don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
+- `PersonApp` uses OpenAI's embedding model to create embeddings for chunks and ChatGPT API as LLM to get answer given the relevant docs. Make sure that you have an OpenAI account and an API key. If you don't have an API key, you can create one by visiting [this link](https://platform.openai.com/account/api-keys).
 - Once you have the API key, set it in an environment variable called `OPENAI_API_KEY`
 
 ```python

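The app_types hunks above all end with the same setup step: put the key in an environment variable called `OPENAI_API_KEY` and then instantiate the app class. A minimal sketch of that flow, assuming `App` is imported from the `embedchain` package and that `.add()` auto-detects the data type as the data_types doc below describes (the source URL and question are placeholders):

```python
import os

from embedchain import App

# The docs ask for the key in an environment variable called OPENAI_API_KEY.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, use your own key

app = App()

# Ingest a source and ask a question, mirroring the query calls shown in the
# hunk context (e.g. zuck_bot.query(...)).
app.add("https://en.wikipedia.org/wiki/Threads_(social_network)")
print(app.query("Who owns the Threads app and when was it founded?"))
```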
+ 2 - 2
docs/advanced/data_types.mdx

@@ -19,7 +19,7 @@ Otherwise, you will not know when, for instance, an invalid filepath is interpre
 To omit any issues with the data type detection, you can **force** a data_type by adding it as a `add` method argument.
 The examples below show you the keyword to force the respective `data_type`.
 
-Forcing can also be used for edge cases, such as interpreting a sitemap as a web_page, for reading it's raw text instead of following links.
+Forcing can also be used for edge cases, such as interpreting a sitemap as a web_page, for reading its raw text instead of following links.
 
 ## Remote Data Types
 
@@ -138,4 +138,4 @@ print(naval_chat_bot.query("What unique capacity does Naval argue humans possess
 
 ## More formats (coming soon!)
 
-- If you want to add any other format, please create an [issue](https://github.com/embedchain/embedchain/issues) and we will add it to the list of supported formats.
+- If you want to add any other format, please create an [issue](https://github.com/embedchain/embedchain/issues) and we will add it to the list of supported formats.

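The data_types hunk describes forcing a `data_type` when auto-detection is not what you want, using the sitemap-as-web_page edge case as its example. A short sketch of both paths, assuming the keyword argument is literally named `data_type` (the doc only says "the keyword to force the respective `data_type`") and using a placeholder sitemap URL:

```python
from embedchain import App

app = App()

# Default behaviour: embedchain detects the data type from the source,
# so a sitemap is recognised as a sitemap and its links are followed.
app.add("https://example.com/sitemap.xml")

# Forced behaviour from the hunk above: treat the sitemap as a plain web_page
# so only its raw text is read instead of following links.
app.add("https://example.com/sitemap.xml", data_type="web_page")
```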
+ 2 - 2
docs/advanced/interface_types.mdx

@@ -19,7 +19,7 @@ print(naval_chat_bot.query("What unique capacity does Naval argue humans possess
 
 ### Chat Interface
 
-- This interface is chat interface where it remembers previous conversation. Right now it remembers 5 conversation by default. 💬
+- This interface is a chat interface that remembers previous conversations. Right now it remembers 5 conversations by default. 💬
 
 - To use this, call `.chat` function to get the answer for any query.
 
@@ -72,4 +72,4 @@ Counts the number of embeddings (chunks) in the database.
 ```python
 print(app.count())
 # returns: 481
-```
+```

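The interface_types hunk distinguishes the one-off `.query()` call from the `.chat()` interface that keeps the last 5 conversations by default, and the trailing hunk shows `.count()` returning the number of stored chunks. A small usage sketch under those assumptions (the source and questions are placeholders):

```python
from embedchain import App

naval_chat_bot = App()
naval_chat_bot.add("https://nav.al/rich")  # placeholder source

# Query interface: stateless, each call stands alone.
print(naval_chat_bot.query("What unique capacity does Naval argue humans possess?"))

# Chat interface: remembers previous conversations (5 by default per the doc),
# so the follow-up can lean on the earlier answer.
print(naval_chat_bot.chat("What does Naval say about reading?"))
print(naval_chat_bot.chat("Can you summarize that in one sentence?"))

# Number of embeddings (chunks) currently in the database.
print(naval_chat_bot.count())
```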
+ 1 - 1
docs/advanced/vector_database.mdx

@@ -31,4 +31,4 @@ config = CustomAppConfig(
 es_app = CustomApp(config)
 ```
 - Set `db_type=VectorDatabases.ELASTICSEARCH` and `es_config=ElasticsearchDBConfig(es_url='')` in `CustomAppConfig`.
-- `ElasticsearchDBConfig` accepts `es_url` as elasticsearch url or as list of nodes url with different hosts and ports. Additionally we can pass named paramaters supported by Python Elasticsearch client.
+- `ElasticsearchDBConfig` accepts `es_url` as elasticsearch url or as list of nodes url with different hosts and ports. Additionally we can pass named parameters supported by Python Elasticsearch client.
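
The corrected line says `es_url` may be a single Elasticsearch URL or a list of node URLs, with any extra named parameters passed through to the Python Elasticsearch client. A sketch of that configuration; the import paths are assumed from the package layout implied by the doc, and `basic_auth`/`ca_certs` are ordinary Elasticsearch client options used here purely as illustration:

```python
from embedchain import CustomApp
from embedchain.config import CustomAppConfig, ElasticsearchDBConfig
from embedchain.models import VectorDatabases

# es_url as a list of nodes with different hosts and ports; extra keyword
# arguments are forwarded to the Python Elasticsearch client.
es_config = ElasticsearchDBConfig(
    es_url=["https://es-node-1:9200", "https://es-node-2:9200"],
    basic_auth=("elastic", "changeme"),  # illustrative client option
    ca_certs="/path/to/http_ca.crt",     # illustrative client option
)

config = CustomAppConfig(
    db_type=VectorDatabases.ELASTICSEARCH,
    es_config=es_config,
)
es_app = CustomApp(config)
```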