Welcome to AssessBias's documentation. This guide helps you reproduce our work from the following papers:
- Algorithmic fairness: Not a purely technical but socio-technical property
- Do existing fairness measures suffice? Assessing discrimination in algorithmic decision-making
We developed AssessBias primarily with Python 3.8, but the figures were plotted with Python 3.11. Please install Anaconda/Miniconda first, then create and activate the corresponding environment as shown below.
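Before creating the environments, you may want to confirm that conda itself is available; the quick check below assumes conda is already on your PATH.
$ # confirm conda is installed and on the PATH
$ conda --version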
$ # To obtain the empirical data
$ conda create -n py38 python=3.8
$ source activate py38
$ pip install --upgrade pip
$ pip install -r reqs_py38.txt
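$ # optional sanity check: the environment should use the expected interpreter
$ python --version   # should print Python 3.8.x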
$ # conda deactivate
$ # conda remove -n py38 --all

$ # To obtain the plotted figures
$ conda create -n py311 python=3.11
$ source activate py311
$ pip install -U pip
$ pip install -r requirements.txt
$ # conda deactivate
$ # conda remove -n py311 --all

We borrow some auxiliary functions from PyFairness; to use them, install it in either of the following ways.
$ # Two ways to install (& uninstall) PyFairness
$ git clone git@github.com:eustomaqua/PyFairness.git
$
$ pip install -r PyFairness/reqs_py311.txt
$ pip install -e ./PyFairness
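$ # optional check: the editable install registers the package as `pyfair`
$ pip show pyfair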
$ # pip uninstall pyfair
$
$ cp -r ./PyFairness/pyfair ./
$ cp -r ./PyFairness/data ./
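$ # optional check before removing the clone: the copied folder should be
$ # importable as `pyfair`
$ python -c "import pyfair"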
$ # rm -r pyfair data
$ yes | rm -r PyFairness

To reproduce our empirical results, you may use the released data and do as follows.
$ # source activate py311
$ python fair_int_draw.py -exp KF_exp1b
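$ # the -exp flag selects which experiment to rerun; KF_exp1b is the example
$ # used here (see fair_int_draw.py for the other experiment tags)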
$ # conda deactivate

You're welcome to contact us or raise an issue if you find any typos or errors.
AssessBias is released under the MIT Licence.