
Fix test execution under Python 3.14 #459

Open

SuperSandro2000 wants to merge 2 commits into NabuCasa:main from SuperSandro2000:tests-3.14

Conversation

@SuperSandro2000


See the Python 3.14 changelog for the breaking change that makes `forkserver` the default multiprocessing start method on Linux:
https://docs.python.org/3/whatsnew/3.14.html#:~:text=On%20Unix%20platforms%20other%20than%20macOS%2C%20%E2%80%98forkserver%E2%80%99%20is%20now%20the%20default%20start%20method%20for%20ProcessPoolExecutor%20(replacing%20%E2%80%98fork%E2%80%99).%20This%20change%20does%20not%20affect%20Windows%20or%20macOS%2C%20where%20%E2%80%98spawn%E2%80%99%20remains%20the%20default%20start%20method

Multiple tests failed with errors like:

_____________________________ test_peer_connection _____________________________
[gw0] linux -- Python 3.14.3 /nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/bin/python3.14

test_server_sync = [<socket.socket fd=41, family=2, type=1, proto=0, laddr=('127.0.0.1', 8366), raddr=('127.0.0.1', 35672)>]
test_client_sync = <socket.socket fd=33, family=2, type=1, proto=0, laddr=('127.0.0.1', 35672), raddr=('127.0.0.1', 8366)>
event_loop = <_UnixSelectorEventLoop running=False closed=False debug=False>

    def test_peer_connection(
        test_server_sync: list[socket.socket],
        test_client_sync: socket.socket,
        event_loop: asyncio.AbstractEventLoop,
    ) -> None:
        """Run a full flow of with a peer."""
        worker = ServerWorker(FERNET_TOKENS)
        valid = datetime.now(tz=UTC) + timedelta(days=1)
        aes_key = os.urandom(32)
        aes_iv = os.urandom(16)
        hostname = "localhost"
        fernet_token = create_peer_config(valid.timestamp(), hostname, aes_key, aes_iv)

>       worker.start()

tests/server/test_worker.py:43:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/process.py:121: in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/context.py:224: in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/context.py:300: in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/popen_forkserver.py:35: in __init__
    super().__init__(process_obj)
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/popen_fork.py:20: in __init__
    self._launch(process_obj)
/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/popen_forkserver.py:47: in _launch
    reduction.dump(process_obj, buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

obj = <ServerWorker name='ServerWorker-21' parent=272 initial>
file = <_io.BytesIO object at 0x7fffef185c10>, protocol = None

    def dump(obj, file, protocol=None):
        '''Replacement for pickle.dump() using ForkingPickler.'''
>       ForkingPickler(file, protocol).dump(obj)
E       TypeError: cannot pickle 'weakref.ReferenceType' object
E       when serializing dict item '_weakref'
E       when serializing multiprocessing.util.Finalize state
E       when serializing multiprocessing.util.Finalize object
E       when serializing dict item 'finalizer'
E       when serializing multiprocessing.popen_forkserver.Popen state
E       when serializing multiprocessing.popen_forkserver.Popen object
E       when serializing dict item '_popen'
E       when serializing multiprocessing.context.ForkServerProcess state
E       when serializing multiprocessing.context.ForkServerProcess object
E       when serializing dict item '_process'
E       when serializing multiprocessing.managers.SyncManager state
E       when serializing multiprocessing.managers.SyncManager object
E       when serializing dict item '_manager'
E       when serializing snitun.server.worker.ServerWorker state
E       when serializing snitun.server.worker.ServerWorker object

/nix/store/8gnchv834z56s561v3sx2h0ra1a2xn46-python3-3.14.3/lib/python3.14/multiprocessing/reduction.py:60: TypeError
---------------------------- Captured stderr setup -----------------------------
DEBUG:asyncio:Using selector: EpollSelector
------------------------------ Captured log setup ------------------------------
DEBUG    asyncio:selector_events.py:64 Using selector: EpollSelector
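The traceback above arises because the `forkserver` start method must pickle the `Process` object (here a `ServerWorker` holding a `SyncManager` and other non-picklable state) before sending it to the fork server, whereas `fork` inherits it directly. One way to restore the pre-3.14 behaviour is to request the legacy `fork` context explicitly instead of relying on the platform default. A minimal sketch, not necessarily the exact fix applied in this PR; `double` is a placeholder worker function:

```python
import multiprocessing


def double(n: int) -> int:
    """Placeholder worker; real code would run ServerWorker logic."""
    return n * 2


if __name__ == "__main__":
    # Ask for the "fork" context explicitly rather than relying on the
    # platform default, which changed from "fork" to "forkserver" on
    # Linux in Python 3.14. With "fork", the child inherits the parent's
    # state directly and nothing needs to be pickled.
    ctx = multiprocessing.get_context("fork")
    with ctx.Pool(processes=1) as pool:
        result = pool.apply(double, (21,))
    print(result)  # prints 42
```

Alternatively, a test suite can call `multiprocessing.set_start_method("fork")` once at startup (e.g. in `conftest.py`) to switch the default process-wide; the per-context approach above avoids mutating global state. Note that `fork` is unavailable on Windows, so this only applies to the Unix test runs affected here.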