To perform a question-answering task on a PDF file, we need to upload the file to the web server. The web server can then prompt the LLM using the PDF content as context.
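A minimal sketch of the chunking step described above. This assumes the PDF text has already been extracted server-side (e.g. by a PDF library); the function name and parameters below are illustrative, not part of any agreed design.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split extracted PDF text into overlapping fixed-size chunks.

    Overlap between consecutive chunks helps preserve context that
    would otherwise be cut at a chunk boundary.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # step back by `overlap` so adjacent chunks share some text
        start = end - overlap
    return chunks
```

Whether these chunks are computed once at upload time and cached, or recomputed per question, is exactly the caching question raised below.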
Discussion needed
Will we store uploaded files locally on the web server, or in a cloud storage service such as AWS S3?
Do we need to cache the chunks created from the content of uploaded files?
Acceptance Criteria
User is able to upload a PDF file to the server; the server breaks its content into chunks that can be passed to the LLM for inference.
User is able to perform a question-answering task on the uploaded file.
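The question-answering step above could assemble a prompt from the stored chunks. A rough sketch, assuming the chunks are already available and leaving the actual LLM call abstract (the function and prompt wording here are assumptions, not a fixed API):

```python
def build_qa_prompt(chunks: list[str], question: str) -> str:
    """Combine retrieved PDF chunks and a user question into a single
    prompt string that instructs the LLM to answer from context only."""
    context = "\n\n".join(chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )
```

In practice only the chunks most relevant to the question would be included (e.g. via embedding similarity), since the full document may exceed the model's context window.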