A serverless AI Slack bot service built on Embedchain and deployed to AWS Lambda with Pulumi.
Load and interrogate your data using an AI RAG (retrieval-augmented generation) microservice built on Embedchain, providing CLI, REST API, and Slack interfaces, with an option to deploy to AWS Lambda using Pulumi.
Optional configuration for various data sources, LLMs, vector databases, embedding models, and evaluation.
- Docker
- Python 3
```shell
cp -R /path/to/my/assets ./assets
cp sample.env .env
```

Ensure you have the required dependencies installed and have followed the prerequisite steps.

Populate the `.env` file; at minimum, `OPENAI_API_KEY` is required for arti to answer questions.
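For example, a minimal `.env` for local question answering might contain only the OpenAI key (the value below is a placeholder, not a real key):

```shell
# minimal .env for local use; replace the placeholder with your own key
OPENAI_API_KEY=sk-your-key-here
```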
```shell
docker compose run --rm arti ask "What do these files contain?"
```

Alternatively, without Docker:

```shell
python3 -m venv venv
. venv/bin/activate
make install
arti ask "What do these files contain?"
```

Configuration is provided via environment variables, documented in the table below.
- Docker
- Python 3
- Copy `sample.env` to `.env`, then edit `.env` to replace the sample values with your desired settings:

  ```shell
  cp sample.env .env
  ```

| Variable | Description |
|---|---|
| `AWS_REGION` | AWS region |
| `OPENAI_API_KEY` | OpenAI key (takes precedence over secret) |
| `OPENAI_API_KEY_SECRET_NAME` | OpenAI key secret name |
| `PINECONE_API_KEY` | Pinecone key (takes precedence over secret) |
| `PINECONE_API_KEY_SECRET_NAME` | Pinecone key secret name |
| `SLACK_BOT_TOKEN` | Slack bot token (takes precedence over secret) |
| `SLACK_BOT_TOKEN_SECRET_NAME` | Slack bot token secret name |
| `SLACK_BOT_SIGNING_SECRET` | Slack bot signing secret |
| `LOG_LEVEL` | Log level |
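The "takes precedence over secret" precedence in the table can be sketched as follows. This is an illustrative helper, not arti's actual code; `fetch_from_secrets_manager` is a hypothetical stub standing in for a boto3 Secrets Manager lookup:

```python
import os


def fetch_from_secrets_manager(secret_name: str) -> str:
    """Hypothetical stub: in AWS, this would fetch the secret value via boto3."""
    raise NotImplementedError(secret_name)


def resolve_secret(value_var: str, secret_name_var: str):
    """A directly-set env var takes precedence over a Secrets Manager secret name."""
    direct = os.environ.get(value_var)
    if direct:
        return direct
    secret_name = os.environ.get(secret_name_var)
    if secret_name:
        return fetch_from_secrets_manager(secret_name)
    return None


os.environ["OPENAI_API_KEY"] = "sk-local-example"
os.environ["OPENAI_API_KEY_SECRET_NAME"] = "/some/secret/name"
print(resolve_secret("OPENAI_API_KEY", "OPENAI_API_KEY_SECRET_NAME"))  # the direct key wins
```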
- Create a Python virtual environment and activate it (first run only):

  ```shell
  make venv
  . venv/bin/activate
  ```

- Configure the project for development and install dependencies:

  ```shell
  make develop
  ```

- Populate the dataset in `./assets` with files such as PDF, Docx, CSV, HTML, text, and more

- Run the application:

  ```shell
  arti
  arti ask "what can you tell me about fruits and vegetables?"
  # alternatively, run everything using docker compose
  docker compose run --rm arti ask "what can you tell me about fruits and vegetables?"
  ```

- To stop, [CTRL]-C the application
This project uses pre-commit to manage git hooks. This isn't required, but it will help you catch issues before you push your commits.
Install pre-commit on macOS using Homebrew:

```shell
brew install pre-commit
```

Once you have pre-commit installed, install the git hook scripts:

```shell
pre-commit install
```

The Slack integration uses Bolt.
Follow their instructions to create a new Slack app.
For local usage, populate your .env file with the appropriate Slack tokens.
For reference, these are approximately the expected scopes, depending on your use of the bot.
Scopes:

- `chat:write`
- `channels:read`
- `commands`
- `im:read`
- `im:write`
- `users:read`
- `users:write`

Event subscriptions:

- `message.channels`
- `message.im`
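The `SLACK_BOT_SIGNING_SECRET` is what lets the service verify that incoming requests really came from Slack. Slack's v0 scheme signs `v0:{timestamp}:{body}` with HMAC-SHA256; a minimal sketch of that check (Bolt performs this for you internally, so this is illustrative only):

```python
import hashlib
import hmac


def slack_signature(signing_secret: str, timestamp: str, body: str) -> str:
    """Compute Slack's v0 request signature over 'v0:{timestamp}:{body}'."""
    basestring = f"v0:{timestamp}:{body}".encode()
    digest = hmac.new(signing_secret.encode(), basestring, hashlib.sha256).hexdigest()
    return f"v0={digest}"


def verify_request(signing_secret: str, timestamp: str, body: str, received: str) -> bool:
    """Constant-time comparison of the expected and received signatures."""
    expected = slack_signature(signing_secret, timestamp, body)
    return hmac.compare_digest(expected, received)
```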
A CLI entrypoint is provided as an example. By default, the `./assets` directory is loaded into the vector database for search.

Say, for example, you had a document containing information about fruits and vegetables:

```shell
arti ask "what can you tell me about fruits and vegetables?"
```

To see all CLI options:

```shell
arti -h
```
A Makefile is provided to ease some common tasks, such as linting and deploying.
To see usage instructions:
```shell
make help
```

Deployment happens through the Makefile for convenience. The stack configuration can be found in `deploy/pulumi`.
Create the following keys in Secrets Manager. These names are configurable in `deploy/pulumi/Pulumi.<stack>.yaml`. The Slack and Pinecone secrets are only necessary if those integrations are configured for use.
| Secret Name | Schema |
|---|---|
| `/catmeme/cloud-platform/sandbox/arti/access-token/openai` | `{ "apiKey": "" }` |
| `/catmeme/cloud-platform/sandbox/arti/access-token/pinecone` | `{ "apiKey": "" }` |
| `/catmeme/cloud-platform/sandbox/arti/access-token/slack` | `{ "apiKey": "", "signingSecret": "" }` |
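As one way to create these (assuming the AWS CLI is installed and a profile is configured), the OpenAI secret could be created like this; the key value is a placeholder:

```shell
# illustrative; the secret name must match deploy/pulumi/Pulumi.<stack>.yaml
aws secretsmanager create-secret \
  --name /catmeme/cloud-platform/sandbox/arti/access-token/openai \
  --secret-string '{ "apiKey": "sk-your-key-here" }'
```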
Ensure you have an AWS profile whose name matches the one in the stack. The Makefile assumes `<environment>-deployment`.

```shell
make deploy DEPLOY_ENVIRONMENT=dev
```

The above example expects a `dev-deployment` AWS profile to be configured.
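For reference, a matching profile entry in `~/.aws/config` might look like the following; the region here is an assumption, so adjust it (and how the profile obtains credentials) for your account:

```shell
# ~/.aws/config (illustrative)
[profile dev-deployment]
region = us-east-1
```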




