
AI Mail Support for Thunderbird


Thunderbird add-on designed to enhance both professional and personal email management.
This add-on integrates a range of AI (LLM) features to streamline your inbox experience.

Our aim is to assist users dealing with high volumes of daily emails, providing tools for tasks like summarizing messages, translating content, offering structured support for composing responses, and much more.

Contents

  1. Getting started
  2. Build
  3. Permissions details
  4. Localization
  5. License and references

Getting started

Have you ever had an inbox full of hundreds of unread emails that you need to respond to?
We have, and more than once.

That's why we decided to create this add-on for Thunderbird to help manage the multitude of emails we read daily as part of our work activities.

Several LLMs (Large Language Models) are integrated, providing a range of advanced text-management options that operate at the deepest possible semantic level to optimize the handling of your email inbox.
The LLMs* currently supported are:

It is possible to access a wider set of models (e.g., Llama, Phi, Mistral, Gemma, and many others) through the use of:

* To use them, it is necessary to create an account on the respective platforms and enable an API access key. Usage fees apply; for more details, please refer to the respective websites.

ATTENTION 1: The services offered by Groq Cloud and Mistral AI include a free plan, albeit with low rate limits on requests.
ATTENTION 2: Unlike the cloud-based providers, LM Studio and Ollama allow you to run open-source models directly on your own PC, with no additional costs and maximum privacy, since everything is executed locally.
The downside is that this requires SIGNIFICANT hardware resources.

Settings and usage

After installing the add-on, you can configure the desired LLM service provider from the add-on's settings.

You can access the settings by going to Tools → Add-ons and Themes, and then selecting the wrench icon next to AI Mail Support.

Add-on preferences

Based on the LLM choice, additional specific options will become available. For example, below is the screenshot of all possible configurations when OpenAI is selected as the provider.

Add-on preferences for OpenAI

Typically, an authentication key needs to be configured; the specific method depends on the LLM provider.
In the options, there will be a quick link to the official website with useful details.

Once the add-on is configured, you can interact with the AI management features within Thunderbird in three different locations:

  1. In the email view, via the "AI support" menu:

AI support integration in email view

  2. In the email composition or editing window, by selecting "AI support" in the top right:

AI support integration in email composition or editing window

  3. By selecting any text in either the email viewing or composition window, in the "AI Mail Support" section of the context menu:

AI support integration in selected text

Regardless of how a request for processing is made, the output (audio or text) will be displayed in a dedicated pop-up at the bottom of the mail client.

Output

Whether in the email viewing or composition window, you can always enable the custom prompt feature to receive even more relevant responses tailored to your needs.

Custom prompt

Owl for Exchange bug

If you use the Owl for Exchange add-on to manage Exchange or Office365 accounts, ⚠️ there is a known bug that interferes with the scripting.messageDisplay API and will prevent AI Mail Support for Thunderbird from functioning correctly when previewing an email.

Build

Run the following to build the add-on directly from the source code:

$ git clone https://github.com/YellowSakura/aimailsupport.git
$ cd aimailsupport
$ npm install

To compile a development version of the add-on and install it in Thunderbird via Tools → Developer Tools → Debug Add-ons → Load Temporary Add-on…, use the following command:

$ npm run build

To generate an ai-mail-support.xpi package in the project's root folder, ready to be installed as an add-on in Thunderbird, use the following command:

$ npm run build:package

To assess the overall quality of the code, you can use the following command:

$ npm run lint

Run the unit tests with:

$ npm run test

You can run a specific group of tests for a single provider using the command:

$ npm run test:single "AnthropicClaudeProvider"

Before running any tests, you need to create a .env file in the project root directory with the keys for the various LLM services, in the following format:

anthropic_api_key = KEY_VALUE
deepseek_api_key = KEY_VALUE
google_api_key = KEY_VALUE
groq_api_key = KEY_VALUE
mistral_api_key = KEY_VALUE
openai_api_key = KEY_VALUE
xai_api_key = KEY_VALUE
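As a quick sketch, the file can be created from the shell (each KEY_VALUE placeholder must then be replaced with your real API key):

```shell
# Create the .env file expected by the test suite; the values below
# are placeholders and must be replaced with real API keys.
cat > .env <<'EOF'
anthropic_api_key = KEY_VALUE
deepseek_api_key = KEY_VALUE
google_api_key = KEY_VALUE
groq_api_key = KEY_VALUE
mistral_api_key = KEY_VALUE
openai_api_key = KEY_VALUE
xai_api_key = KEY_VALUE
EOF
```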

To test LM Studio, it is necessary to install the model llama-3.2-1b from the GUI or using the command:

$ lms get llama-3.2-1b

To test Ollama, it is necessary to install the model llama3.2:1b using the command:

$ ollama pull llama3.2:1b

Permissions details

AI Mail Support for Thunderbird aims to make use of a minimal set of permissions for its operation, specifically:

Localization

The add-on has a small set of messages that require localization. If you want to extend the translation, the process is straightforward:

  1. Copy the file src/locales/en-messages.json to src/locales/%ISO_CODE%-messages.json, where %ISO_CODE% is your ISO 639-1 language code.

  2. Translate your src/locales/%ISO_CODE%-messages.json, specifically the message properties, and remove the description properties, which are only used for context.

  3. Add a new line in the package.json file, matching the other build:locales-* entries and keeping them in alphabetical order:

    "build:locales-%ISO_CODE%": "node_modules/.bin/json-minify src/locales/%ISO_CODE%-messages.json > ai-mail-support/_locales/%ISO_CODE%/messages.json",
  4. Add a corresponding entry to the build:locales script in package.json, again maintaining alphabetical order.

  5. Add the folder %ISO_CODE% to the _locales key in your src/manifest.json.

  6. Test your changes using the build process described in the Build section and submit the changes in a pull request.
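As an illustration of step 2 (the message name and the Italian text here are hypothetical, not taken from the real locale files), a translated entry in src/locales/it-messages.json keeps the message property and drops the description used for context:

```json
{
    "optionsTitle": {
        "message": "Impostazioni di AI Mail Support"
    }
}
```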

License and references

The code is licensed under the MIT License by Yellow Sakura, support@yellowsakura.com; see the LICENSE file.
For more details, please refer to the project page and the link to the official AMO (addons.mozilla.org) page.

Dependencies:

Images:


All trademarks mentioned are the property of their respective owners.
