Support for PyPy processes #38
Description
It could be interesting to offload the parallel jobs to PyPy processes instead of the main Python interpreter, which would mostly be CPython if one wants to plot data, use Pandas, or any other library that is not (yet) supported by PyPy. In this use case I see PyPy as a fast computation backend, similar to Numba and its @jit decorator. Furthermore, NumPy arrays could be passed as parameters and handled by Numpypy, provided that pickling/unpickling is compatible between the two interpreters.
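To illustrate the compatibility requirement: what crosses the process boundary is just the pickled byte stream, so the CPython parent and a PyPy child only need to agree on the pickle format for ndarrays. A minimal sketch (run here in a single interpreter; the round-trip across two different interpreters is the assumption being tested in practice):

```python
import pickle

import numpy as np

# Array built on the "parent" (CPython) side.
arr = np.arange(6, dtype=np.float64).reshape(2, 3)

# Serialize with an explicit, older protocol for maximum compatibility;
# a PyPy child would receive and unpickle these same bytes, provided its
# numpy implementation is pickle-compatible with CPython's.
payload = pickle.dumps(arr, protocol=2)
restored = pickle.loads(payload)

assert restored.shape == (2, 3)
assert (restored == arr).all()
```

If Numpypy's unpickling ever diverges from CPython's, this round-trip is exactly the step that would break, so it is a cheap sanity check to run on both sides first.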
For this to work it is necessary to use multiprocessing.set_start_method('spawn') together with set_executable() to point the child processes at the PyPy interpreter. Unfortunately, this option of starting a fresh Python interpreter process is only available from Python 3.4, and PyPy only supports 3.3 for now. There is also the multiprocess fork of multiprocessing, which uses dill for better serialization; it could be worth integrating the py3.5 version of multiprocess into deco, so that set_start_method becomes available on older Python versions and in PyPy.
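The configuration described above can be sketched as follows. The PyPy path is hypothetical (adjust it to wherever your PyPy 3 binary lives), and the set_executable() call is left commented out so the snippet also runs under plain CPython:

```python
import multiprocessing as mp


def square(x):
    # Work that would execute inside the child interpreter (PyPy, once
    # set_executable points at it).
    return x * x


if __name__ == "__main__":
    # 'spawn' starts a fresh interpreter process instead of forking,
    # which is what makes swapping the child executable possible at all.
    mp.set_start_method("spawn", force=True)

    # Hypothetical PyPy location -- uncomment and adjust for your system:
    # mp.set_executable("/usr/local/bin/pypy3")

    with mp.Pool(2) as pool:
        print(pool.map(square, range(5)))  # → [0, 1, 4, 9, 16]
```

Note that with 'spawn' the worker function must be importable by the child (defined at module level and guarded by `if __name__ == "__main__":`), which is already a constraint deco's decorated functions would have to satisfy.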
What do you think?