I noticed that gpt-4o uses a different tokenizer (o200k_base) than gpt-4 (cl100k_base). It looks like this was added in openai/tiktoken@9d01e56#diff-0d973848bd229418209db2c46c86167000845592ca6b98fad215c21c317bc494R10, in tiktoken version 0.7.0.
It would be useful if ttok could be updated to use it as well.