Automatic suggestions for operators. AI powered by DeepPavlov.
- First you have to install the extension itself.
- After you have installed the extension and added a few of the most common questions, you can proceed with DeepPavlov training.
After you have cloned the repository, copy extension/lhcchatbot to the extensions folder of Live Helper Chat,
so it looks like lhc_web/extension/lhcchatbot
You can either run the extension's SQL file directly or run this command:
php cron.php -s site_admin -e lhcchatbot -c cron/update_structure
Copy extension/lhcchatbot/settings/settings.ini.default.php to extension/lhcchatbot/settings/settings.ini.php
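A minimal shell sketch of the steps above, assuming the repository was cloned to ./lhcchatbot-repo (a placeholder path) and Live Helper Chat lives in ./lhc_web (adjust both to your setup):
# copy the extension from the cloned repository into Live Helper Chat (paths are assumptions)
cp -r lhcchatbot-repo/extension/lhcchatbot lhc_web/extension/lhcchatbot
# update the database structure via the extension's cron command
cd lhc_web && php cron.php -s site_admin -e lhcchatbot -c cron/update_structure
# create the local settings file from the shipped default
cp extension/lhcchatbot/settings/settings.ini.default.php extension/lhcchatbot/settings/settings.ini.php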
Edit the main application settings file and add lhcchatbot to the extensions list:
'extensions' =>
    array (
        'lhcchatbot'
    ),
- First you have to create at least one Context (Modules -> Reply predictions).
- Edit the department and choose your newly created Context to be used for the edited department.
- Adding questions
- Questions can be added by selecting a visitor message with the mouse directly in the chat and clicking the plus icon.
- Questions can also be added from the left menu (Modules -> Reply predictions).
- After you have added a few questions you can run this command:
/usr/bin/php cron.php -s site_admin -e lhcchatbot -c cron/deeppavlov_train
After that you should see a CSV file (most likely train_1.csv if you have one Context) in the extension/lhcchatbot/train folder.
Copy those files to the deeppavlov/Dockerfiles/deep/train folder of the cloned repository.
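A short sketch of checking and copying the exported files, again assuming the placeholder paths from above:
# list the exported training files, one train_<context_id>.csv per Context
ls lhc_web/extension/lhcchatbot/train/
# copy them into the cloned repository's DeepPavlov train folder
cp lhc_web/extension/lhcchatbot/train/train_*.csv lhcchatbot-repo/deeppavlov/Dockerfiles/deep/train/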
MeiliSearch allows instant autocompletion suggestions based on chat history and canned messages.
Navigate to the deeppavlov folder and copy .env.default to .env
Edit the .env file and set your own value for LHC_MEILI_SEARCH_MASTER_KEY.
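For example, assuming the key is stored as a KEY=value line in .env (replace the value with your own secret):
cd deeppavlov
cp .env.default .env
# set your own master key value
sed -i 's/^LHC_MEILI_SEARCH_MASTER_KEY=.*/LHC_MEILI_SEARCH_MASTER_KEY=change-me-to-a-long-random-secret/' .env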
Start one time:
docker-compose -f docker-meilisearch-compose.yml up
Start as a service:
docker-compose -f docker-meilisearch-compose.yml up -d
If you plan to constantly update the autocompletion data, it makes sense to run this command once a week:
/usr/bin/php cron.php -s site_admin -e lhcchatbot -c cron/auto_complete
After the above command is executed you will see autocomplete_hash_<dep_id>.json and autocomplete_text_<dep_id>.json files in the extension/lhcchatbot/train folder. If you wish, you can always adjust these files manually or just modify the script itself.
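A hypothetical crontab entry for the weekly run (the installation path is an assumption):
# export autocompletion data every Monday at 03:00
0 3 * * 1 cd /var/www/lhc_web && /usr/bin/php cron.php -s site_admin -e lhcchatbot -c cron/auto_complete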
Now run the following in a shell. It will feed the autocompletion data to MeiliSearch and will also print a Public Key.
cd extension/lhcchatbot && ./doc/update_autocomplete.sh "http://localhost:7700/" <master_key>
In the Reply Predictions module you will find a menu item called Auto complete; set the Public key there. You will get the Public key from the above command. Autocompletion has to be enabled per department. Edit the department and enable it in the Reply Predictions tab.
To expose MeiliSearch through your web server, you can add an nginx location block like this:
location /msearch/ {
    proxy_pass http://127.0.0.1:7700/;
}
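A quick check that the proxy works (the hostname is a placeholder for your own site); it should return the MeiliSearch health status:
# request routed through the /msearch/ location above to MeiliSearch
curl "https://your-lhc-site.example/msearch/health"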
- Start typing your regular sentences, and you will see possible sentence endings at the bottom.
- To replace everything you typed you can use #<your search query>
- In messages you can also use placeholders {nick}, {operator}, {year}, {month}, {demail}, {email} - just start typing any of these keywords.
Navigate to the deeppavlov folder and copy .env.default to .env
Training always happens on Docker image startup.
There are two ways DeepPavlov can work: either with a spellchecker or without.
# Optional: build the image
# docker-compose -f docker-dp-compose.yml build
# Train and run the image
docker-compose -f docker-dp-compose.yml up
Run as a service once it's built:
docker-compose -f docker-dp-compose.yml up -d
To test whether it works you can use this curl command:
curl -X POST "http://localhost:5000/model" -H "accept: application/json" -H "Content-Type: application/json" -d "{\"q\":[\"hi\"]}"
To rebuild the image:
docker-compose -f docker-dp-compose.yml build --no-cache
To restart, forcing it to be recreated:
docker-compose -f docker-dp-compose.yml up --force-recreate
With the spellchecker enabled, visitor messages are checked for spelling errors before being matched against your questions.
The spellchecker requires these changes (see the sketch after this list):
- Edit the .env file and change LHC_API=train_tfidf_logreg_en_faq.json to LHC_API=riseapi.json
- Navigate to deeppavlov/Dockerfiles/deep/data/downloads/language_models and see the README.md file content. You will need to download a file which is 6 GB in size!
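A minimal sketch of the .env change, assuming LHC_API is stored as a KEY=value line; the ~6 GB language model itself has to be downloaded as described in that README.md:
cd deeppavlov
# switch DeepPavlov to the spellchecker-enabled configuration
sed -i 's/^LHC_API=.*/LHC_API=riseapi.json/' .env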
The easiest way is just to have a shell script which runs daily something like the following. This is just an example; adapt it to your needs.
# Export trainings. Adjust paths!
cd lhc_web/ && /usr/bin/php cron.php -s site_admin -e lhcchatbot -c cron/deeppavlov_train
# Copy trainings. Adjust paths!
cd ../ && cp extension/lhcchatbot/train/* /deeppavlov/Dockerfiles/deep/train
# Restart docker image
docker-compose restart deeppavlov-lhcchatbot
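To actually run it daily, a hypothetical crontab entry could look like this (the script path is an assumption; the script would contain the commands above):
# retrain and restart DeepPavlov every day at 04:00
0 4 * * * /bin/bash /opt/scripts/lhcchatbot_retrain.sh >> /var/log/lhcchatbot_retrain.log 2>&1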
If you need more than one chatbot configuration (for example for another language), the easiest way is just to modify the docker-compose.yml file and add more than one service with a different configuration.
Possible workflow:
- You should modify the ports sections of docker-compose.yml
- Create a copy of deeppavlov/Dockerfiles/deep/data, e.g. deeppavlov/Dockerfiles/deep/data_2
- Modify the volumes: section: change - ./Dockerfiles/deep/data:/base/deep to something like - ./Dockerfiles/deep/data_2:/base/deep
- Modify the volumes: section: change - "./Dockerfiles/deep/train/${LHC_TRAIN_FILE}:/base/train/train.csv" to something like - "./Dockerfiles/deep/train/train_2.csv:/base/train/train.csv"
- Modify container_name from deeppavlov-lhcchatbot to, as an example, deeppavlov-lhcchatbot-german
- Modify - LHC_API=${LHC_API} if you are using the spellchecker, as it will most likely not be set up for a language other than English. Put train_tfidf_logreg_en_faq.json there.
After that, don't forget to modify your new Context and set its host to the new URL with the new port.
Example configuration:
  deeppavlov-lhcchatbot-german:
    build: ./Dockerfiles/deep
    environment:
      - LHC_API=train_tfidf_logreg_en_faq.json
    container_name: deeppavlov-lhcchatbot-german
    image: remdex/deeppavlov-lhcchatbot:latest
    ports:
      - "5005:5000"
    volumes:
      - ./Dockerfiles/deep/data_2:/base/deep
      - ./Dockerfiles/deep/config:/base/config
      - "./Dockerfiles/deep/train/train_9.csv:/base/train/train.csv"
    networks:
      - code-network
    restart: always
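Once the new service is defined, you could start only it by passing the service name to docker-compose (adjust the -f flag to whichever compose file you actually edited):
# build and start only the additional instance defined above
docker-compose -f docker-dp-compose.yml up -d deeppavlov-lhcchatbot-german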
