SDG Tool

Quick Start - Recommended

Installation and execution

  • Clone the repository and access the folder:

    git clone https://github.com/Geambrosio/sdg_visualization_tool
    cd sdg_visualization_tool

  • Create the environment (its dependencies are resolved from pyproject.toml during the install step below):

    conda create --name sdg_tool_env

  • Activate the environment:

    conda activate sdg_tool_env

  • Install the application into the environment (you might need to authenticate with your GitHub account):

    conda install git pip
    pip install git+https://github.com/Geambrosio/sdg_visualization_tool.git

  • Run the application to calculate SDG indicators (EXAMPLE):

    sdg -i input/SSPs/SSP1.xlsx -i input/SSPs/SSP2.xlsx -o SSPs -p SSPs

  • Install JupyterLab to access the SDG tool functionalities from docs/example.ipynb:

    conda install -c conda-forge jupyterlab
    jupyter lab

Quick Start - Alternatives

Installation

The project is not yet deployed to PyPI; however, it can be installed as a Python package from a local copy of the source or directly from the git repository by any tool that supports it. For instance:

  • Using uv:

    uv tool install git+https://github.com/Geambrosio/sdg_visualization_tool.git
  • Using pipx:

    pipx install git+https://github.com/Geambrosio/sdg_visualization_tool.git

pip install also works; just pay attention to the environment you are installing it into, since installing packages globally is not considered good practice. You can also pick your favorite virtual environment manager:

  • Conda:

    conda create --name <my-env>
    conda activate <my-env>
    conda install git pip
    pip install git+https://github.com/Geambrosio/sdg_visualization_tool.git
  • venv:

    python -m venv <my-env>
    source <my-env>/bin/activate
    pip install git+https://github.com/Geambrosio/sdg_visualization_tool.git
  • uv:

    uv venv <my-env>
    source <my-env>/bin/activate
    uv pip install git+https://github.com/Geambrosio/sdg_visualization_tool.git

Usage

$ sdg -h
A tool to calculate SDG indicators

Options
=======
The options below are convenience aliases to configurable class-options,
as listed in the "Equivalent to" description-line of the aliases.
To see all configurable class-options for some <cmd>, use:
    <cmd> --help-all

--debug
    Set log-level to debug, for the most verbose logging.
    Equivalent to: [--Application.log_level=10]
--show-config
    Show the application's configuration (human-readable format)
    Equivalent to: [--Application.show_config=True]
--show-config-json
    Show the application's configuration (json format)
    Equivalent to: [--Application.show_config_json=True]
-i=<list-item-1>...
    List of input files (.xlsx)
    Default: []
    Equivalent to: [--App.input_files]
-o=<TraitPath>
    Prefix for the output files (.indicators.csv, .targets.csv)
    Default: PosixPath('sdg_result')
    Equivalent to: [--App.output_file_prefix]
-p=<Unicode>
    Name of the project
    Default: 'SDG Project'
    Equivalent to: [--App.project_name]

To see all available configurables, use `--help-all`.
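Since sdg is a regular console script, the same invocation can also be driven from Python, for instance in a batch script. A minimal sketch (the command line mirrors the example from the Quick Start section; the subprocess.run call is commented out so the snippet is safe to paste even before the tool is installed):

```python
import shlex
import subprocess

# Build the argument list for the CLI shown above (example paths taken
# from the Quick Start section; adjust them to your own input files).
cmd = shlex.split(
    "sdg -i input/SSPs/SSP1.xlsx -i input/SSPs/SSP2.xlsx -o SSPs -p SSPs"
)
# subprocess.run(cmd, check=True)  # uncomment once the package is installed
```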

How to contribute

Dependencies

  1. This project is designed to work with Python 3.9 and later versions.

  2. The project uses Hatch as a build system and package manager. Refer to the Hatch installation instructions to check how to install it on your system.

    • Hatch-specific configuration covering the development dependencies, environments, and maintenance scripts is defined in the file pyproject.toml. You can refer to Environment configuration for more details.

    • It is recommended to keep the Python environments within the project folder; Hatch can be configured to do so:

      hatch config set dirs.env.virtual .venv
    • You can check them anytime by running:

    $ hatch env show --ascii
                                              Standalone
    +---------+---------+-----------------------+----------------------+------------------------------+
    | Name    | Type    | Dependencies          | Scripts              | Description                  |
    +=========+=========+=======================+======================+==============================+
    | default | virtual | coverage[toml]>=7.5.3 | check                | Base development environment |
    |         |         | jupyterlab            | format               |                              |
    |         |         | pre-commit>=3.5.0     | lint                 |                              |
    |         |         | pytest-cov>=5.0.0     | pre-commit-install   |                              |
    |         |         | pytest>=8.2.2         | pre-commit-uninstall |                              |
    |         |         |                       | qa                   |                              |
    |         |         |                       | test                 |                              |
    |         |         |                       | test-no-cov          |                              |
    |         |         |                       | type                 |                              |
    +---------+---------+-----------------------+----------------------+------------------------------+
                                                    Matrices
    +------+---------+-------------+-----------------------+----------------------+---------------------------+
    | Name | Type    | Envs        | Dependencies          | Scripts              | Description               |
    +======+=========+=============+=======================+======================+===========================+
    | test | virtual | test.py3.9  | coverage[toml]>=7.5.3 | check                | Extended test environment |
    |      |         | test.py3.10 | jupyterlab            | extended             |                           |
    |      |         | test.py3.11 | pre-commit>=3.5.0     | format               |                           |
    |      |         | test.py3.12 | pytest-cov>=5.0.0     | lint                 |                           |
    |      |         | test.py3.13 | pytest-randomly       | pre-commit-install   |                           |
    |      |         |             | pytest-rerunfailures  | pre-commit-uninstall |                           |
    |      |         |             | pytest-xdist          | qa                   |                           |
    |      |         |             | pytest>=8.2.2         | test                 |                           |
    |      |         |             |                       | test-no-cov          |                           |
    |      |         |             |                       | type                 |                           |
    +------+---------+-------------+-----------------------+----------------------+---------------------------+
    

Enforcing Code Quality

  1. The pre-commit tool is used to enforce code quality standards. pre-commit handles the installation of ruff, mypy, codespell, and others in isolated environments. Even though it checks the changed files on every commit once installed (hatch run pre-commit-install), it is good practice to run the checks on the whole codebase occasionally (when a new hook is added or on Pull Requests). You can do so by running hatch run check <hook-id>, for instance hatch run check nbstripout. Some hooks are also available as scripts, as syntactic sugar: hatch run lint, hatch run format, and hatch run type check the whole codebase using ruff, ruff-format, and mypy, respectively.

    • The file pyproject.toml includes configuration for some of the tools, so they can be consumed from your IDE as well.
    • The file .pre-commit-config.yaml includes the configuration for the pre-commit hooks.
  2. The pytest test suite can be run from the default environment with hatch run test or hatch run test-no-cov (the latter without coverage check).

    Code examples in docstrings and documentation are tested by the doctest module (configured in the file pyproject.toml). It is integrated with pytest, so the previous test commands also run the doctests.
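A minimal docstring with a doctest that this setup would pick up looks like the following (the helper is hypothetical, not part of sdg_tool):

```python
def share_below_threshold(values, threshold):
    """Fraction of values strictly below a threshold.

    Hypothetical helper illustrating the docstring style that the
    pytest/doctest integration collects and runs.

    >>> share_below_threshold([1, 2, 3, 4], 3)
    0.5
    """
    return sum(v < threshold for v in values) / len(values)

if __name__ == "__main__":
    import doctest

    doctest.testmod()
```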

  3. To run all the quality checks, you can use the command hatch run qa.

$ hatch run qa
cmd [1] | pre-commit run  --all-files
check for added large files..............................................Passed
check for case conflicts.................................................Passed
check docstring is first.................................................Passed
check json...............................................................Passed
check for merge conflicts................................................Passed
check toml...............................................................Passed
check yaml...............................................................Passed
debug statements (python)................................................Passed
detect private key.......................................................Passed
fix end of files.........................................................Passed
mixed line ending........................................................Passed
trim trailing whitespace.................................................Passed
ruff.....................................................................Passed
ruff-format..............................................................Passed
mypy.....................................................................Passed
codespell................................................................Passed
mdformat.................................................................Passed
nbstripout...............................................................Passed
zizmor...................................................................Passed
cmd [2] | pytest --cov --cov-report=term
========================================== test session starts ==========================================
platform darwin -- Python 3.12.7, pytest-8.3.5, pluggy-1.5.0
rootdir: /Users/fschuch/allcode/GitHub/PICASSO_sdg_tool
configfile: pyproject.toml
plugins: cov-6.0.0, anyio-4.8.0
collected 63 items

tests/unit/test_data.py .                                                                         [  1%]
tests/unit/test_indicators.py .............................................................       [ 98%]
tests/unit/test_units.py .                                                                        [100%]

---------- coverage: platform darwin, python 3.12.7-final-0 ----------
Name                                Stmts   Miss Branch BrPart   Cover
----------------------------------------------------------------------
sdg_tool/__init__.py                    2      0      0      0 100.00%
sdg_tool/indicators.py                324    119     34      2  58.38%
sdg_tool/main.py                       65     65     12      0   0.00%
sdg_tool/targets.py                    82     15      0      0  81.71%
sdg_tool/utils/xarray_accessor.py      27     13      4      0  45.16%
sdg_tool/utils/xarray_engine.py        52     11     16      6  75.00%
tests/__init__.py                       0      0      0      0 100.00%
tests/integration/__init__.py           0      0      0      0 100.00%
tests/unit/__init__.py                  0      0      0      0 100.00%
tests/unit/test_data.py                21      0      0      0 100.00%
tests/unit/test_indicators.py          16      2      0      0  87.50%
tests/unit/test_units.py                7      0      0      0 100.00%
----------------------------------------------------------------------
TOTAL                                 596    225     66      8  58.16%


========================================== 63 passed in 2.43s ===========================================
cmd [3] | echo '✅ QA passed'
✅ QA passed
  4. To step up the game, an extended test environment and the command hatch run test:extended are available to verify the package on different Python versions and under different conditions, thanks to these pytest plugins:

    • pytest-randomly, which randomizes the test order;
    • pytest-rerunfailures, which re-runs failing tests to eliminate intermittent failures;
    • pytest-xdist, which parallelizes the test suite and reduces runtime, offsetting the extra workload the previous two plugins introduce;
    • The file pyproject.toml includes configuration for them.

Continuous Integration

  • The workflow ci.yaml performs the verifications on every push and pull request, and deploys the package if running from a valid tag.
  • The workflow update-pre-commits.yaml is scheduled to run weekly to ensure the pre-commit hooks are up-to-date.
  • Dependabot is enabled to keep the dependencies up-to-date (dependabot.yml).

Managing the Changelog

The project relies on Automatically generated release notes from GitHub to handle the changelog. Refer to Managing labels and add the labels from release.yml to automatically organize your entries. This is a nice fit for Managing releases in a repository.

Some may argue for the importance of Keeping a Changelog in a dedicated file, but that results in frequent conflicts and merge issues when working with feature branches in parallel. The GitHub release notes are a good compromise, as they are automatically generated and can be edited before the release. Note that good Pull Request titles and small incremental changes are key to a good changelog.

Managing the Version Number

The version of the project is set dynamically by hatch-vcs. At installation and build time, the version is recovered from the version control system and exported to the file src/sdg_tool/_version.py. In this way, there is no need to keep the version hard-coded in the codebase. You can use the command hatch version to check the current version. On the deployment workflow, the version is recovered from the tag and used to build the package.

The downside is that the new version number must be entered manually for each release, rather than bumped with the command hatch version. However, this is a reasonable trade-off, since it does not impose any restrictions on the project's versioning scheme or branching model.

Publishing

The package can be published to PyPI in a general-purpose workflow that can be used for any branch model and versioning strategy.

The workflow ci.yaml is triggered to publish the package to PyPI when any valid tag is pushed to the repository. The tag-matching pattern is set to v*.*.*, for instance v1.2.3, v0.0.1rc2, v2023.2.0, etc. Note that all previous CI steps are executed before the deployment, including static analysis and tests.
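The pattern is a glob, not a regular expression. A quick way to sanity-check which tag names it accepts, using Python's fnmatch as an approximation of the Actions glob syntax:

```python
from fnmatch import fnmatch

# Approximate the CI trigger pattern v*.*.* with a shell-style glob.
# (GitHub Actions glob semantics differ slightly, e.g. around '/'.)
tags = ["v1.2.3", "v0.0.1rc2", "v2023.2.0", "1.2.3", "v1.2"]
accepted = [t for t in tags if fnmatch(t, "v*.*.*")]
print(accepted)  # the first three match; "1.2.3" and "v1.2" do not
```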

The process can also be triggered when a Release creates a new tag on your repo, so it synergizes with the GitHub release notes cited above in the Changelog management.

Miscellaneous

VSCode Configuration

The template includes a .vscode folder with an extensions.json file that suggests the extensions to install in VSCode, in line with the quality assurance tools included in the template. It allows testing, debugging, auto-formatting, linting, and a few other functionalities to work directly in your IDE. It also includes a settings.json file that configures the Python extension to use the virtual environment created by Hatch. Remember to set Hatch to use the virtual environment within the project folder: hatch config set dirs.env.virtual .venv.

Communicate Type Annotations

PEP 561 is a guideline that explains how a Python package can advertise that it ships inline type annotations. These annotations are useful for static type checkers, tools that help catch errors in your code before it runs. Mypy checks for a file named py.typed in the root of the installed package.
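To verify that the marker actually ships with an installed distribution, a small hypothetical check using only the standard library:

```python
from importlib import resources

def has_py_typed(package: str) -> bool:
    """Return True if the installed package ships a PEP 561 py.typed marker."""
    try:
        return resources.files(package).joinpath("py.typed").is_file()
    except ModuleNotFoundError:
        return False

# Stdlib packages do not ship the marker:
print(has_py_typed("json"))  # False
```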
