This is a LlamaIndex project bootstrapped with create-llama.
First, start up the backend as described in the backend README.
Then run the frontend development server as described in the frontend README.
Open http://localhost:3000 with your browser to see the result.
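The two steps above might look like the following. This is a hedged sketch only: the directory names (`backend`, `frontend`), the use of Poetry for the Python backend, and npm for the Next.js frontend are assumptions based on common create-llama defaults; the authoritative commands are in each app's own README.

```shell
# Hypothetical sketch — consult the backend/frontend READMEs for the real commands.

# Backend (assuming a Python app managed with Poetry):
cd backend
poetry install
poetry run python main.py    # starts the API server

# Frontend (assuming a Next.js app), in a second terminal:
cd frontend
npm install
npm run dev                  # serves the UI at http://localhost:3000
```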
You may need to run `az login` before starting the backend so that it can authenticate with your Azure container session pool.
Make sure you first assign yourself the Contributor and Session Pool Executor RBAC roles in Azure IAM.
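A sketch of the sign-in and role-assignment steps, assuming you have the Azure CLI installed; the placeholder assignee and scope values are hypothetical and must be replaced with your own principal ID and the resource ID of your session pool:

```shell
# Sign in so the backend can pick up your Azure credentials.
az login

# Grant yourself the required roles on the session pool.
# <your-principal-id> and <session-pool-resource-id> are placeholders.
az role assignment create \
  --role "Contributor" \
  --assignee "<your-principal-id>" \
  --scope "<session-pool-resource-id>"

az role assignment create \
  --role "Session Pool Executor" \
  --assignee "<your-principal-id>" \
  --scope "<session-pool-resource-id>"
```

Role assignments can take a few minutes to propagate, so a freshly granted role may not be honored immediately.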
To learn more about LlamaIndex, take a look at the following resources:
- LlamaIndex Documentation - learn about LlamaIndex (Python features).
- LlamaIndexTS Documentation - learn about LlamaIndex (TypeScript features).
You can check out the LlamaIndexTS GitHub repository - your feedback and contributions are welcome!