@@ -1,14 +1,31 @@
---
-title: '📰 PDF file'
+title: '📰 PDF'
---
-To add any pdf file, use the data_type as `pdf_file`. Eg:
+You can load any PDF file from your local file system or from a URL.
+
+## Setup
+Install the following package, which is used to parse PDF files:
+
+```bash
+pip install pypdf
+```
+
+## Usage
+
+### Load from a local file
```python
from embedchain import App
-
app = App()
+app.add('/path/to/file.pdf', data_type='pdf_file')
+```
+
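+Once the file is added, you can query the app; a minimal sketch continuing from the snippet above (the question is only an illustration):
+
+```python
+app.query("Summarize the document in one paragraph.")
+```
+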
+### Load from a URL
+```python
+from embedchain import App
+app = App()
app.add('https://arxiv.org/pdf/1706.03762.pdf', data_type='pdf_file')
app.query("What is the paper 'attention is all you need' about?", citations=True)
# Answer: The paper "Attention Is All You Need" proposes a new network architecture called the Transformer, which is based solely on attention mechanisms. It suggests that complex recurrent or convolutional neural networks can be replaced with a simpler architecture that connects the encoder and decoder through attention. The paper discusses how this approach can improve sequence transduction models, such as neural machine translation.
@@ -23,25 +40,11 @@ app.query("What is the paper 'attention is all you need' about?", citations=True
# ...
# }
# ),
-# (
-# 'Attention Visualizations Input ...',
-# {
-# 'page': 12,
-# 'url': 'https://arxiv.org/pdf/1706.03762.pdf',
-# 'score': 0.41679039679873736,
-# ...
-# }
-# ),
-# (
-# 'sequence learning ...',
-# {
-# 'page': 10,
-# 'url': 'https://arxiv.org/pdf/1706.03762.pdf',
-# 'score': 0.4188303600897153,
-# ...
-# }
-# )
# ]
```
-Note that we do not support password protected pdfs.
+We also store the page number under the `page` key with each chunk, which helps you see where the answer comes from. You can fetch the `page` key during retrieval, as in the output above and the sketch below.
+
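+For instance, a minimal sketch of reading the page metadata out of the citations, continuing from the example above (this assumes the `(answer, sources)` tuple shape shown in the output):
+
+```python
+answer, sources = app.query("What is the paper about?", citations=True)
+for chunk, metadata in sources:
+    # Each source is a (chunk_text, metadata) pair; metadata carries
+    # the page number and the source URL, as in the output above.
+    print(metadata['page'], metadata['url'])
+```
+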
+<Note>
+We do not support password-protected PDF files.
+</Note>