Per https://devguide.python.org/versions/, Python 3.9 goes EOL this October (2025).
Maintaining the bindings for 3.9 is not overly difficult at this point since we don't do anything super advanced in the python layer around the C library.
The biggest gain is probably cleaner typing syntax; off the top of my head, I don't think there's any core feature we'd benefit from.
What I don't have a good sense of is how to balance the non-EOL Python versions vs distro support vs our published support range. RHEL 9 ships 3.9 but isn't EOL until 2027. By then, python 3.16 will be out.
Dropping older Python just means newer Python versions can use newer binding releases. Older Python versions can still install any published release that supports them, but fixes that land in newer bindings will not reach those older packages.
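This works through the `Requires-Python` metadata that pip and other installers respect: an installer on an old interpreter skips releases whose floor is too new and falls back to the last release that still allowed it. A sketch of raising the floor in a hypothetical release (the package name is a placeholder):

```toml
# pyproject.toml (illustrative only)
[project]
name = "the-bindings"          # placeholder name
version = "2.0.0"
# From this release onward, pip running under Python 3.9 will skip
# 2.0.0 and resolve to the newest release that still declared >=3.9.
requires-python = ">=3.10"
```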
Some distros (like RHEL) provide alternate system packages with newer Python. Otherwise, tools like uv or pyenv help deploy newer Python onto systems that ship with a "stale" version. However, some users may be locked into a specific version for legacy/corporate reasons; when a dependency stops supporting that version, a lot of work may be needed to update the entire codebase and re-validate on a newer Python.
The hurdle to supporting the larger version range is simply testing. As far as I can see, there is no CI-driven automated testing that runs the test suite across all supported Python versions. I've tested some of my patches by hand on 3.9 and assumed that if they work there, they will work on later versions, but I haven't taken the time to do this for every patch on every version.
I can maybe set up a CI test matrix on my fork to make this less arduous and take the guesswork out of whether everything works as we expect.
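A minimal sketch of such a matrix, assuming GitHub Actions and a tox-style test entry point (workflow name, version list, and test command are placeholders, not this repo's actual setup):

```yaml
# .github/workflows/test-matrix.yml (illustrative)
name: test-matrix
on: [push, pull_request]
jobs:
  tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: python -m pip install tox
      - run: tox -e py   # runs the suite under the matrix interpreter
```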
Regardless, we probably want to establish a policy for when we cut off old versions of Python from the bindings.