B17 is a command-line tool for batching translation jobs. It leans on Modal's concurrency features (modal.Dict, modal.NetworkFileSystem, and map) to send multiple OpenAI completion requests in parallel, so many translation jobs run at the same time.
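At its core the parallelism is a map over chapters. As a rough local analogue of that fan-out (using Python's `concurrent.futures` thread pool rather than Modal, and a dummy `translate_chapter` in place of a real OpenAI call):

```python
from concurrent.futures import ThreadPoolExecutor

def translate_chapter(chapter: str) -> str:
    # Stand-in for an OpenAI completion request; in B17 this work
    # runs remotely on Modal instead of in a local thread pool.
    return chapter.upper()  # pretend "translation"

def translate_book(chapters: list[str]) -> list[str]:
    # Fan out one request per chapter and collect the results in order,
    # mirroring what Modal's map does across remote containers.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(translate_chapter, chapters))

print(translate_book(["chapter one", "chapter two"]))
```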
```
usage: local.py [-h] [-t source_file] [-c source_file custom_prompt_file] [-r job_id [chapter_indexes ...]]

A command-line helper for batching translation jobs.

options:
  -h, --help            show this help message and exit
  -t source_file, --translate source_file
                        Translate the content of a local file. Usage: -t [source_file]
  -c source_file custom_prompt_file, --custom source_file custom_prompt_file
                        Use your own instruction for the translation. Usage: -c [source_file] [custom_prompt_file]
  -r job_id [chapter_indexes ...], --redo job_id [chapter_indexes ...]
                        Get new translations based on an ID and chapter indices. Usage: -r [job_id] [chapter_index_1] [chapter_index_2] ...
```
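Help text like the above can be produced by an `argparse` setup along these lines (a sketch consistent with the usage string; the actual `local.py` may differ):

```python
import argparse

parser = argparse.ArgumentParser(
    prog="local.py",
    description="A command-line helper for batching translation jobs.",
)
parser.add_argument("-t", "--translate", metavar="source_file",
                    help="Translate the content of a local file.")
parser.add_argument("-c", "--custom", nargs=2,
                    metavar=("source_file", "custom_prompt_file"),
                    help="Use your own instruction for the translation.")
parser.add_argument("-r", "--redo", nargs="+",
                    metavar=("job_id", "chapter_indexes"),
                    help="Get new translations based on an ID and chapter indices.")

# Example: parse a simple translate invocation.
args = parser.parse_args(["-t", "book.txt"])
print(args.translate)
```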
```
             |<--- remote.py running on Modal ----------------->|
+----------+          +-----------+         +-----------------------+
|          |  call    |           |  call   |                       |
| User     |--------->| translate |-------------> | translate_msg_wrapper |
| local.py |<---------|           |<------------- | (concurrent for       |
|          |  return  |           |  return |       all chapters)   |
+----------+          +-----------+         |                       |
    ^                    |                  |                       |
 (return                 | (If translations |                       |
  updated                |  don't meet      |                       |
  results)               |  requirements)   |                       |
    |                    |                  |                       |
    |                    V                  |                       |
    |        +------------------+  call     |                       |
    |        |                  |---------->|                       |
    +--------|    last_shot     |           |                       |
             | (concurrent for  |<----------| translate_msg_wrapper |
             |  all unqualified |  return   | (10 concurrent        |
             |  chapters)       |           |  calls for each       |
             +------------------+           |  last_shot)           |
                                            +-----------------------+
```
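The retry loop in the diagram can be sketched in plain Python (local stand-ins for the Modal functions; the names `translate`, `last_shot`, and the 10-attempt count come from the flow above, but `is_qualified` and all other details here are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

ATTEMPTS_PER_CHAPTER = 10  # the diagram shows 10 concurrent calls per last_shot

def translate_msg_wrapper(chapter: str) -> str:
    # Stand-in for the remote translation call.
    return chapter.strip().upper()

def is_qualified(translation: str) -> bool:
    # Stand-in quality check; the real requirements live in remote.py.
    return len(translation) > 0

def last_shot(chapter: str) -> str:
    # Retry one unqualified chapter with several concurrent attempts
    # and keep the first qualifying result.
    with ThreadPoolExecutor(max_workers=ATTEMPTS_PER_CHAPTER) as pool:
        attempts = pool.map(translate_msg_wrapper,
                            [chapter] * ATTEMPTS_PER_CHAPTER)
        for attempt in attempts:
            if is_qualified(attempt):
                return attempt
    return ""  # give up: no attempt met the requirements

def translate(chapters: list[str]) -> list[str]:
    # First pass: translate all chapters concurrently.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(translate_msg_wrapper, chapters))
    # Second pass: re-run only the unqualified chapters via last_shot.
    return [r if is_qualified(r) else last_shot(chapters[i])
            for i, r in enumerate(results)]

print(translate(["hello", "world"]))
```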
- install the `openai` module with pip: `pip install openai`
- install and set up `modal`: `pip install modal`, then `modal setup`
- deploy `remote.py` to Modal: `modal deploy remote`
- run the local script with Python: `python local.py -t source_file.txt`
- you will see a `result.txt` file once the translation is done