diff --git a/docs/account_management/xias/pi_guest_management.md b/docs/account_management/xias/pi_guest_management.md index 50ecf1828..712b9419f 100644 --- a/docs/account_management/xias/pi_guest_management.md +++ b/docs/account_management/xias/pi_guest_management.md @@ -3,7 +3,7 @@ !!! note - These instructions are intended for use by UAB-employed PIs to organize external collaborators, also known as guests. UAB PIs: Please direct guests [here](guest_instructions.md) for instructions on creating their accounts. + These instructions are intended for use by UAB-employed PIs to organize external collaborators, also known as guests. UAB PIs: Please direct guests to our [Guest Instructions](guest_instructions.md) to create their accounts. diff --git a/docs/account_management/xias/pi_site_management.md b/docs/account_management/xias/pi_site_management.md index c3c2dea88..82a0d3b85 100644 --- a/docs/account_management/xias/pi_site_management.md +++ b/docs/account_management/xias/pi_site_management.md @@ -3,7 +3,7 @@ !!! note - These instructions are intended for use by UAB-employed PIs to organize external collaborators, also known as guests. UAB PIs: Please direct guests [here](guest_instructions.md) for instructions on creating their accounts. + These instructions are intended for use by UAB-employed PIs to organize external collaborators, also known as guests. UAB PIs: Please direct guests to our [Guest Instructions](guest_instructions.md) to create their accounts. XIAS Project/Sites, or simply sites, tie external users to specific resources at UAB. By connecting people to the resource they use, UAB can maintain security and accountability. Creating a site is the first step to giving access to external collaborators, and the process can be thought of as "create once, use many times". All sites must have an expiration date for security reasons. To create a site you'll need at least one Uniform Resource Identifier (URI) relating to resources used by the site. 
If you aren't sure what URI(s) to list for your site, please contact . diff --git a/docs/cheaha/getting_started.md b/docs/cheaha/getting_started.md index 6bde1bf0d..0c341a35f 100644 --- a/docs/cheaha/getting_started.md +++ b/docs/cheaha/getting_started.md @@ -175,10 +175,14 @@ Slurm is our job queueing software used for submitting any number of job scripts A large variety of software is available on Cheaha as modules. To view and use these modules see [the following documentation](./software/modules.md). -For new software installation, please try searching [Anaconda](../workflow_solutions/using_anaconda.md) for packages first. If you still need help, please [send a support ticket](../help/support.md) +For new software installation, please try searching for [`conda` packages](../workflow_solutions/using_conda.md#good-practice-for-finding-software-packages-on-conda) first. If you still need help, please [send a support ticket](../help/support.md) ### Conda Packages -A significant amount of open-source software is distributed as Anaconda or Python libraries. These libraries can be installed by the user without permission from Research Computing using Anaconda environments. To read more about using Anaconda virtual environments see our [Anaconda page](./software/software.md#anaconda-on-cheaha). +A significant amount of open-source research software is distributed as `conda` packages or Python libraries. These libraries can be installed by the user using `conda` environments. To read more about using `conda` environments see our [`conda` page](./software/software.md#conda-on-cheaha). -If the software installation instructions tell you to use either `conda install` or `pip install` commands, the software and its dependencies can be installed using a virtual environment. +If the software installation instructions tell you to use either `conda install` or `pip install` commands, the software and its dependencies can be installed using a Conda environment. 
+ +## How to Get Help + +For questions, you can reach out via our various [channels](../help/support.md). diff --git a/docs/cheaha/open_ondemand/ood_jupyter_notebook.md b/docs/cheaha/open_ondemand/ood_jupyter_notebook.md index 4a1b3931c..9cc102408 100644 --- a/docs/cheaha/open_ondemand/ood_jupyter_notebook.md +++ b/docs/cheaha/open_ondemand/ood_jupyter_notebook.md @@ -1,6 +1,6 @@ # Jupyter Notebook -Jupyter Notebooks and [Jupyter Lab](./ood_jupyterlab.md) are both available as standalone apps in OOD. Jupyter is commonly used with Anaconda environments. If you are unfamiliar with Anaconda environments please see the [Working with Anaconda Environments section](#working-with-anaconda-environments) below before continuing here. +Jupyter Notebooks and [Jupyter Lab](./ood_jupyterlab.md) are both available as standalone apps in OOD. Jupyter is commonly used with Conda environments. If you are unfamiliar with Conda environments, please see the [Working With Conda Environments section](#working-with-conda-environments) below before continuing here. To launch the Jupyter Notebook, select the menus 'Interactive Apps -> Jupyter Notebook'. The job creation and submission form appears: @@ -54,9 +54,9 @@ To work with other programming languages within Jupyter Notebook, you need to in Once the necessary kernels are installed, if you wish, you can write and run multiple code cells in different languages within a single notebook. Easily switch between kernels and select the preferred one for each language, and then proceed to run the code cells in their respective languages. -## Working With Anaconda Environments -By default, Jupyter Notebooks will use the base environment that comes with the Anaconda3 module. This environment contains a large number of popular packages and may useful for something quick, dirty, and simple. 
However, for any analysis needing specific package versions or special packages, you will need to create your own environment and select it from the `Kernel` menu. For information on creating and managing Anaconda environments please see our [Using Anaconda page](../../workflow_solutions/using_anaconda.md). Then please review our [Cheaha-specific Anaconda page](../software/software.md#anaconda-on-cheaha) for important tips and how to avoid common pitfalls. +By default, Jupyter Notebooks will use the base environment that comes with the Anaconda3 module. This environment contains a large number of popular packages and may be useful for something quick, dirty, and simple. However, for any analysis needing specific package versions or special packages, you will need to create your own environment and select it from the `Kernel` menu. For information on creating and managing `conda` environments please see our [Using `conda` page](../../workflow_solutions/using_conda.md). Then please review our [Cheaha-specific `conda` page](../software/software.md#conda-on-cheaha) for important tips and how to avoid common pitfalls. To change the kernel, use the `Kernel` dropdown and select `Change Kernel`. From the list, choose the kernel corresponding to your desired Anaconda environment (see below for an example). If your environment isn't appearing, you may be missing the ipykernel package. To do so, use `conda install ipykernel` to get the `ipykernel` package installed into your environment, so Jupyter can recognize your environment. @@ -71,9 +71,9 @@ We can create a new environment, that houses all of the packages, modules, and l - [OOD Terminal](./ood_layout.md#opening-a-terminal). Be sure to run the following steps in a job! - [OOD HPC Desktop Job Terminal](./hpc_desktop.md). This method will ensure terminal commands are run in a job. -1. 
[Create](../../workflow_solutions/using_anaconda.md#create-an-environment) and [activate](../../workflow_solutions/using_anaconda.md#activate-an-environment) your new environment, following the linked steps. +1. [Create](../../workflow_solutions/using_conda.md#create-an-environment) and [activate](../../workflow_solutions/using_conda.md#activate-an-environment) your new environment, following the linked steps. -1. [Install your desired packages into your activated environment](../../workflow_solutions/using_anaconda.md#install-packages). +1. [Install your desired packages into your activated environment](../../workflow_solutions/using_conda.md#install-packages). 1. Remember to install `ipykernel` in your activated environment, using `conda install ipykernel`. @@ -146,9 +146,9 @@ Replace `python3.11` in the command with the appropriate Python version. Here's an example of the correct procedure for installing `pip` packages within a `conda`: 1. Load the `Anaconda3` module using `module load Anaconda3`. -1. Create or activate the desired Anaconda environment. Please refer to the [Anaconda documentation](../../workflow_solutions/using_anaconda.md#create-an-environment) +1. Create or activate the desired Anaconda environment. Please refer to the [Anaconda documentation](../../workflow_solutions/using_conda.md#create-an-environment) 1. Install `pip` within the `conda` environment using `conda install pip` or `conda install python`. `pip` and `python` are packaged together, installing one will always install the other. -1. Use `pip` when this `conda` environment is active to install packages. Please refer to [Installing packages with `pip`](../../workflow_solutions/using_anaconda.md#installing-packages-with-pip) +1. Use `pip` when this `conda` environment is active to install packages. 
Please refer to [Installing packages with `pip`](../../workflow_solutions/using_conda.md#installing-packages-with-pip) ### Tensorflow and PyTorch GPU Issues diff --git a/docs/cheaha/open_ondemand/ood_jupyterlab.md b/docs/cheaha/open_ondemand/ood_jupyterlab.md index be8390ffb..c555ac5c9 100644 --- a/docs/cheaha/open_ondemand/ood_jupyterlab.md +++ b/docs/cheaha/open_ondemand/ood_jupyterlab.md @@ -26,7 +26,7 @@ Please follow the instructions found in our [Working With Other Programming Lang ## Working With Conda Environments Using JupyterLab -By default, JupyterLab on Cheaha will launch using the `base` conda environment. While this default environment includes a wide range of popular packages and can be helpful for quick or exploratory tasks, it is not recommended for research workflows that require specific package versions or custom dependencies. For reproducible and stable analysis, it’s best to create and use a dedicated Conda environment tailored to your project or research workflow. Once created, you can register the environment as a Jupyter kernel and select it directly from within JupyterLab. For information on creating and managing Conda environments please see our [Using Anaconda page](../../workflow_solutions/using_anaconda.md). Then please review our [Cheaha-specific Anaconda page](../software/software.md#anaconda-on-cheaha) for important tips and how to avoid common pitfalls. +By default, JupyterLab on Cheaha will launch using the `base` conda environment. While this default environment includes a wide range of popular packages and can be helpful for quick or exploratory tasks, it is not recommended for research workflows that require specific package versions or custom dependencies. For reproducible and stable analysis, it’s best to create and use a dedicated Conda environment tailored to your project or research workflow. Once created, you can register the environment as a Jupyter kernel and select it directly from within JupyterLab. 
For information on creating and managing Conda environments please see our [Using `conda` page](../../workflow_solutions/using_conda.md). Then please review our [Cheaha-specific `conda` page](../software/software.md#conda-on-cheaha) for important tips and how to avoid common pitfalls. ![! Landing page of JupyterLab when you launch the Interactive session](images/ood_jupyterlab_landingpage.png) diff --git a/docs/cheaha/open_ondemand/ood_matlab.md b/docs/cheaha/open_ondemand/ood_matlab.md index 87888d653..b5991a03d 100644 --- a/docs/cheaha/open_ondemand/ood_matlab.md +++ b/docs/cheaha/open_ondemand/ood_matlab.md @@ -10,17 +10,17 @@ Matlab is available for use graphically in your browser via OOD. As with other s Matlab tends to consume substantial memory at startup. You may experience difficulty with job errors below 20 GB of total memory. -## Using Anaconda Python From Within MATLAB +## Using Conda Python From Within MATLAB Matlab has the ability to interoperate with Python from within Matlab. The official documentation for this feature may be found at . -This section is dedicated to using this feature with Anaconda on Cheaha. To use Python contained in an Anaconda Environment within Matlab, please use the following steps. +This section is dedicated to using this feature with `conda` on Cheaha. To use Python contained in a `conda` environment within Matlab, please use the following steps. 1. Create an [HPC Interactive Desktop Job](hpc_desktop.md). 1. Open a terminal in that job. The following steps should all be run in this terminal unless otherwise specified. -1. [Load the Anaconda Module](../software/software.md#loading-anaconda). -1. [Create an Environment](../../workflow_solutions/using_anaconda.md#create-an-environment) in Anaconda with the packages needed. -1. [Activate the Environment](../../workflow_solutions/using_anaconda.md#activate-an-environment), +1. [Load the `conda` Module](../software/software.md#loading-conda). +1. 
[Create an Environment](../../workflow_solutions/using_conda.md#create-an-environment) in `conda` with the packages needed. +1. [Activate the Environment](../../workflow_solutions/using_conda.md#activate-an-environment). 1. Load the Matlab [Module](../software/modules.md). 1. Start Matlab by entering the command `matlab`. 1. Verify success by entering `pyenv` at the Matlab prompt (not the terminal window). Multiple lines of text will be returned at the prompt. Among them you should see a line like the following, with your environment name in place of ``. diff --git a/docs/cheaha/open_ondemand/ood_rstudio.md b/docs/cheaha/open_ondemand/ood_rstudio.md index a688da68c..c9d4dca3e 100644 --- a/docs/cheaha/open_ondemand/ood_rstudio.md +++ b/docs/cheaha/open_ondemand/ood_rstudio.md @@ -12,16 +12,16 @@ RStudio is available for use graphically in your browser via OOD. As with other ## RStudio and Python -If you have a workflow that uses both R and Python, it is strongly recommended to use the [reticulate](https://rstudio.github.io/reticulate/) package along with Anaconda environments. Reticulate allows researchers to load Python packages into a native R session as objects. For instance, if someone prefer some functionality of the `pandas` package but has other code already written in R, they can import `pandas` to R and use both simultaneously. +If you have a workflow that uses both R and Python, it is strongly recommended to use the [reticulate](https://rstudio.github.io/reticulate/) package along with `conda` environments. Reticulate allows researchers to load Python packages into a native R session as objects. For instance, if someone prefers some functionality of the `pandas` package but has other code already written in R, they can import `pandas` to R and use both simultaneously. -This also allows researchers to download precompiled command line binaries into an Anaconda environment and easliy use them in their R scripts. 
+This also allows researchers to download precompiled command line binaries into a `conda` environment and easily use them in their R scripts. For setup, use the following steps: 1. In a terminal on a compute node, either in an HPC Desktop job or by clicking the blue Host button on any job card: - 1. Load the `Anaconda3` module - 1. Create an Anaconda environment. More information about how to create Anaconda environments can be found [in our documentation](../../workflow_solutions/using_anaconda.md). + 1. Load the `Miniforge3` module + 1. Create a `conda` environment. More information about how to create `conda` environments can be found [in our documentation](../../workflow_solutions/using_conda.md). - 1. Activate your environment and install your requuired python packages using either `pip install` or `conda install` depending on the package source. + 1. Activate your environment and install your required Python packages using either `pip install` or `conda install` depending on the package source. @@ -32,16 +32,16 @@ For setup, use the following steps: 1. In RStudio: - 1. Add the command `module load Anaconda3` to the Environment Setup window when requesting the RStudio job. + 1. Add the command `module load Miniforge3` to the Environment Setup window when requesting the RStudio job. 1. If not already installed, install the `reticulate` package using either `install.packages` or the [renv](#rstudio-projects-and-renv) package. - 1. Use `reticulate::use_condaenv('env_name')` to load your conda environment. - 1. From here, you will be able to interact with all of the python packages and non-python precompiled binaries in your Anaconda environment using R and RStudio. Please read more about how to do that in [reticulate's documentation](https://rstudio.github.io/reticulate/#importing-python-modules). + 1. Use `reticulate::use_condaenv('ENVIRONMENT')` to load your `conda` environment named `ENVIRONMENT`. + 1. From here, you will be able to interact with all of the python packages and non-python precompiled binaries in your `conda` environment using R and RStudio. 
Please read more about how to do that in [reticulate's documentation](https://rstudio.github.io/reticulate/#importing-python-modules). -For cases where your R code only needs access to precompiled binaries or libraries and does not need to import any Python libraries, you can instead create your Anaconda environment and add the following lines into the Environment Setup window: +For cases where your R code only needs access to precompiled binaries or libraries and does not need to import any Python libraries, you can instead create your `conda` environment and add the following lines into the Environment Setup window: ``` bash -module load Anaconda3 -conda activate +module load Miniforge3 +conda activate ENVIRONMENT ``` This will add those binaries and libraries to your environment `$PATH` which RStudio will inherit. diff --git a/docs/cheaha/slurm/gpu.md b/docs/cheaha/slurm/gpu.md index 512d19f3f..66f4c7259 100644 --- a/docs/cheaha/slurm/gpu.md +++ b/docs/cheaha/slurm/gpu.md @@ -128,13 +128,13 @@ You will need to load a CUDA module to make use of GPUs on Cheaha. Depending on module -r spider 'CUDA/*' ``` -As of 2025-02-25, we offer CUDA modules up to version `12.6.0`. If you need a newer version, please use [Conda](../software/software.md#anaconda-on-cheaha) to install CUDA software from the `conda-forge` channel. The packages and relevant commands to install into your newly created [environment](../../workflow_solutions/using_anaconda.md#create-an-environment) are available at the URLs in the following table. Note that you should use different packages depending on the CUDA version you need for your software! +As of 2025-02-25, we offer CUDA modules up to version `12.6.0`. If you need a newer version, please use [Conda](../software/software.md#conda-on-cheaha) to install CUDA software from the `conda-forge` channel. 
The packages and relevant commands to install into your newly created [environment](../../workflow_solutions/using_conda.md#create-an-environment) are available at the URLs in the following table. Note that you should use different packages depending on the CUDA version you need for your software! {{ read_csv('cheaha/slurm/res/cuda_conda_package_versions.csv', keep_default_na=False) }} If you are working with deep neural networks (DNNs, CNNs, LSTMs, LLMs, AI, etc.), you will also need to load a `cuDNN`. The `cuDNN` modules are built to be compatible with a sibling `CUDA` module and are named with the corresponding `CUDA` version. For example, if you are loading `CUDA/12.2.0`, you will also need to load `cuDNN/8.9.2.26-CUDA-12.2.0`. Note the trailing `12.2.0`. -As of 2025-02-25, we offer cuDNN modules for CUDA up to `12.3.0`. If you need a newer version, please use [Conda](../software/software.md#anaconda-on-cheaha) to install cuDNN software from the `conda-forge` channel. More details on how to install `CUDA` and `cuDNN` into your [environment](../../workflow_solutions/using_anaconda.md#create-an-environment) are available at . +As of 2025-02-25, we offer cuDNN modules for CUDA up to `12.3.0`. If you need a newer version, please use [Conda](../software/software.md#conda-on-cheaha) to install cuDNN software from the `conda-forge` channel. More details on how to install `CUDA` and `cuDNN` into your [environment](../../workflow_solutions/using_conda.md#create-an-environment) are available at . ### CUDA Compute Capability and Known Issues @@ -160,7 +160,7 @@ To check which CUDA Module version is required for your version of Tensorflow, s PyTorch does not maintain a simple compatibility table for CUDA versions. 
Instead, please manually check their ["get started" page](https://pytorch.org/get-started/locally/#start-locally) for the latest PyTorch version compatibility, and their ["previous versions" page](https://pytorch.org/get-started/previous-versions/) for older PyTorch version compatibility. Assume that a CUDA version is not compatible if it is not listed for a specific PyTorch version. -To use GPUs prior to PyTorch version 1.13 you _must_ select a `cudatoolkit` version from the PyTorch channel when you install PyTorch using Anaconda. It is how PyTorch knows to install a GPU compatible flavor, as opposed to the CPU only flavor. See below for templates of CPU and GPU installs for PyTorch versions prior to 1.13. Be sure to check the compatibility links above for your selected version. Note `torchaudio` is also available for signal processing. +To use GPUs prior to PyTorch version 1.13 you _must_ select a `cudatoolkit` version from the PyTorch channel when you install PyTorch using `conda`. It is how PyTorch knows to install a GPU compatible flavor, as opposed to the CPU only flavor. See below for templates of CPU and GPU installs for PyTorch versions prior to 1.13. Be sure to check the compatibility links above for your selected version. Note `torchaudio` is also available for signal processing. - CPU Version: `conda install pytorch==... torchvision==... -c pytorch` - GPU Version: `conda install pytorch==... torchvision==... cudatoolkit=... 
-c pytorch` diff --git a/docs/cheaha/slurm/slurm_tutorial.md b/docs/cheaha/slurm/slurm_tutorial.md index 7b9e5c175..08650592d 100644 --- a/docs/cheaha/slurm/slurm_tutorial.md +++ b/docs/cheaha/slurm/slurm_tutorial.md @@ -131,15 +131,15 @@ This example illustrate a Slurm job that runs a Python script involving [NumPy]( #SBATCH --output=%x_%j.out ### Slurm Output file, %x is job name, %j is job id #SBATCH --error=%x_%j.err ### Slurm Error file, %x is job name, %j is job id -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Run the script `python_test.py` python python_test.py ``` -The batch job requires an input file `python_test.py` (line 17) for execution. Copy the input file from the [Containers page](../../workflow_solutions/getting_containers.md#create-your-own-docker-container). Place this file in the same folder as the `numpy.job`. This python script performs numerical integration and data visualization tasks, and it relies on the following packages: numpy, matplotlib, scipy for successful execution. These dependencies can be installed using [Anaconda](../../workflow_solutions/using_anaconda.md) within a `conda` environment named `pytools-env`. Prior to running the script, load the `Anaconda3` module and activate the `pytools-env` environment (line 13 and 14). Once job is successfully completed, check the slurm output file for results. Additionally, a plot named `testing.png` will be generated. +The batch job requires an input file `python_test.py` (line 17) for execution. Copy the input file from the [Containers page](../../workflow_solutions/getting_containers.md#create-your-own-docker-container). Place this file in the same folder as the `numpy.job`. 
This Python script performs numerical integration and data visualization tasks, and it relies on the following packages: numpy, matplotlib, and scipy for successful execution. These dependencies can be installed using [`conda`](../../workflow_solutions/using_conda.md) within a `conda` environment named `pytools-env`. Prior to running the script, load the `Miniforge3` module and activate the `pytools-env` environment (lines 13 and 14). Once the job is successfully completed, check the Slurm output file for results. Additionally, a plot named `testing.png` will be generated. ```bash $ ls @@ -187,8 +187,8 @@ Multiple jobs or tasks can be executed simultaneously using `srun` within a sing #SBATCH --output=%x_%j.out ### Slurm Output file, %x is job name, %j is job id #SBATCH --error=%x_%j.err ### Slurm Error file, %x is job name, %j is job id -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Runs the script `python_test.py` in parallel with distinct inputs and ensures synchronization @@ -330,8 +330,8 @@ The python script (line 24) runs individual array task concurrently on respectiv #SBATCH --error=logs/%x_%A_%a.err #SBATCH --array=0-2 ### Array job with 3 tasks (indexed from 0 to 2) -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Calculate the input range for this task @@ -765,7 +765,7 @@ if gpus: print(matmul_sum) ``` -We will also need to set up a [Conda environment](../software/software.md#anaconda-on-cheaha) suitable for executing this Tensorflow-based code. Please do not try to install Pip packages outside of a Conda environment, as it can result in [hard-to-diagnose errors](../../workflow_solutions/using_anaconda.md#). 
Copy the following into a file `environment.yml`. +We will also need to set up a [Conda environment](../software/software.md#conda-on-cheaha) suitable for executing this Tensorflow-based code. Please do not try to install Pip packages outside of a Conda environment, as it can result in [hard-to-diagnose errors](../../workflow_solutions/using_conda.md). Copy the following into a file `environment.yml`. ```yaml name: tensorflow @@ -776,7 +776,7 @@ dependencies: - tensorflow==2.15.0 ``` -To create the environment, run the following commands. This is a one-time setup for this tutorial. Please see our [Module page](../software/modules.md) and our [Conda page](../software/software.md#anaconda-on-cheaha) for more information about each. +To create the environment, run the following commands. This is a one-time setup for this tutorial. Please see our [Module page](../software/modules.md) and our [Conda page](../software/software.md#conda-on-cheaha) for more information about each. ```bash -module load Anaconda3 +module load Miniforge3 diff --git a/docs/cheaha/slurm/submitting_jobs.md b/docs/cheaha/slurm/submitting_jobs.md index e469ee70a..3dd158d12 100644 --- a/docs/cheaha/slurm/submitting_jobs.md +++ b/docs/cheaha/slurm/submitting_jobs.md @@ -163,7 +163,7 @@ For a practical example with dynamic indices, please visit our [Practical `sbatc Jobs should be submitted to the Slurm job scheduler either using a [batch job](#batch-jobs-with-sbatch) or an [Open OnDemand (OOD) interactive job](../open_ondemand/index.md). -You can use `srun` for working on short interactive tasks such as [creating an Anaconda environment](../../workflow_solutions/using_anaconda.md) and running [parallel tasks](#srun-for-running-parallel-jobs) within an sbatch script. +You can use `srun` for working on short interactive tasks such as [creating a `conda` environment](../../workflow_solutions/using_conda.md) and running [parallel tasks](#srun-for-running-parallel-jobs) within an sbatch script. !!! 
warning diff --git a/docs/cheaha/software/res/common_software.csv b/docs/cheaha/software/res/common_software.csv index d5989237b..b23f84b4a 100644 --- a/docs/cheaha/software/res/common_software.csv +++ b/docs/cheaha/software/res/common_software.csv @@ -1,5 +1,5 @@ Name ,Description -Anaconda3 ,"Software that can install the Python language, Python packages, and other research software. Learn more about using Anaconda at our [Anaconda on Cheaha page](software.md#anaconda-on-cheaha). You may be interested in our [OpenOnDemand Jupyter Notebook interactive app](../open_ondemand/ood_jupyter_notebook.md)." +Miniforge3 ,"Software with `conda` that can install the Python language, Python packages, and other research software. Learn more about using `conda` at our [`conda` on Cheaha page](software.md#conda-on-cheaha). You may be interested in our [OpenOnDemand Jupyter Notebook interactive app](../open_ondemand/ood_jupyter_notebook.md)." "CUDA, cuDNN" ,Libraries for developing and using deep learning and AI models with NVidia GPUs. Commonly used with TensorFlow and PyTorch. See our [GPU page](../slurm/gpu.md) for more information. "Mathematica" ,Mathematical CAS and numerical computing software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). Matlab ,Matlab language and development environment. We recommend using our [Open OnDemand Matlab interactive app](../open_ondemand/ood_matlab.md). diff --git a/docs/cheaha/software/software.md b/docs/cheaha/software/software.md index 060a4366e..b760c736a 100644 --- a/docs/cheaha/software/software.md +++ b/docs/cheaha/software/software.md @@ -1,14 +1,14 @@ # Software Installation -## Anaconda on Cheaha +## Conda on Cheaha -For additional general information on using Anaconda please see [Anaconda Environments](../../workflow_solutions/using_anaconda.md). +For additional general information on using `conda` please see our [Using `conda` page](../../workflow_solutions/using_conda.md). 
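The load-and-create workflow this section describes could be sketched end-to-end as a single batch job. This is a hedged sketch, not part of the official docs: the resource requests, the environment name `myenv`, and the package list are illustrative assumptions.

```bash
#!/bin/bash
#SBATCH --job-name=env-setup   ### illustrative job name
#SBATCH --ntasks=1
#SBATCH --mem=8G               ### example resource request
#SBATCH --time=01:00:00

### Load the conda module; never run `conda init` on Cheaha
module load Miniforge3

### One-time creation of a personal environment (lands in /home/$USER/.conda/)
conda create --name myenv --yes python=3.11

### Activate it and install packages inside the environment
conda activate myenv
conda install --yes numpy
```

Submit with `sbatch`, or run the same commands in an interactive job terminal; run them only on compute nodes, never on the login node.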
-If you are using Jupyter Notebook, please see our section on [Packages for Jupyter](../../workflow_solutions/using_anaconda.md#jupyter-package-management). +If you are using Jupyter Notebook, please see our section on [Packages for Jupyter](../../workflow_solutions/using_conda.md#packages-for-jupyter). -### Loading Anaconda +### Loading Conda -Anaconda is installed on Cheaha as a family of modules, and does not need to be installed by Researchers. Instead, the most recent version of Anaconda installed on Cheaha may be loaded using the command `module load Anaconda3`. Other versions may be discovered using the command `module avail Anaconda`. We recommend always using the latest version. +`conda` is installed on Cheaha as a family of modules, and does not need to be installed by Researchers. Instead, the most recent version of `conda` installed on Cheaha may be loaded using the command `module load Miniforge3`. !!! note @@ -16,37 +16,66 @@ Anaconda is installed on Cheaha as a family of modules, and does not need to be If you are using [Open OnDemand Jupyter Notebook](../open_ondemand/ood_jupyter_notebook.md) you do not need to use the `module load` command as part of creating the job. -### Using Anaconda +### Using Conda -Anaconda on Cheaha works like it does on any other system, once the module has been loaded, with a couple of important differences in the callouts below. +Once you have loaded the Miniforge module, `conda` on Cheaha works similarly to how it does on other computers. There are a couple of important differences in the callouts below. !!! note - The `base` environment is installed in a shared location and cannot be modified by researchers. Other environments are installed in your home directory by default. + The `base` environment is installed in a shared location and cannot be modified by researchers. Other environments are installed in your home directory by default at `/home/$USER/.conda/`. !!! 
important - Only create environments on compute nodes. Anaconda environment creation consumes substantial resources and should not be run on the login node. + Only create `conda` environments on compute nodes. Environment creation consumes substantial resources and should not be run on the login node. !!! warning - The Cheaha operating system has a version of Python installed. This version is used by `python` calls when Anaconda has not been loaded. This can cause unexpected errors. Be sure you've loaded the Anaconda environment you need before using Python. + The Cheaha operating system has a built-in Python version installed. This version is used by `python` calls when Miniforge has not been loaded. This can cause unexpected errors. Be sure you've loaded the Miniforge module before using Python. !!! danger - Do not use `conda init` on Cheaha! Anaconda is managed as a [module](./modules.md), including script setup. Using `conda init` at any point can cause hard-to-diagnose issues with [Open OnDemand Interactive Jobs](../open_ondemand/ood_layout.md#interactive-apps). Please see [this ask.ci FAQ](https://ask.cyberinfrastructure.org/t/why-do-i-get-an-error-when-launching-an-open-ondemand-hpc-interactive-session/2496/2) for how to undo what `conda init` does. + Do not use `conda init` on Cheaha, even if prompted to do so! + + `conda` is managed on Cheaha via the [module](./modules.md) `Miniforge3`, including script setup. Using `conda init` at any point can cause hard-to-diagnose issues with [Open OnDemand Interactive Jobs](../open_ondemand/ood_layout.md#interactive-apps). Please see [this ask.ci FAQ](https://ask.cyberinfrastructure.org/t/why-do-i-get-an-error-when-launching-an-open-ondemand-hpc-interactive-session/2496/2) for how to undo what `conda init` does. + + If the `conda` software instructs you to use `conda init` while on Cheaha, please ignore it to avoid future issues with [Open OnDemand](../open_ondemand/index.md). 
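If you suspect `conda init` has already been run, a quick check like the following sketch can tell you whether its block is present in `~/.bashrc` (the `>>> conda initialize >>>` marker is what `conda init` writes; the echoed messages here are illustrative, not from an official tool):

```bash
# Check whether a previous `conda init` has modified ~/.bashrc.
# `conda init` fences its changes with ">>> conda initialize >>>" markers.
if grep -q '>>> conda initialize >>>' "$HOME/.bashrc" 2>/dev/null; then
    echo "conda init block found in ~/.bashrc; see the ask.ci FAQ above to remove it"
else
    echo "no conda init block found in ~/.bashrc"
fi
```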
+ - If the Anaconda software instructs you to use `conda init` while on Cheaha, please ignore it to avoid future issues with [Open OnDemand](../open_ondemand/index.md). +!!! danger + + Using `pip install` without loading Miniforge3 will cause hard-to-diagnose errors and broken workflows. + + Using `pip install` in the `base` environment will cause the same hard-to-diagnose errors and broken workflows. + + Read more about this issue, and how to resolve it, at our [Installing pip Packages Section](#installing-pip-packages-outside-of-your-environments). + + +For more information on usage with examples, see [Conda Environments](../../workflow_solutions/using_conda.md). Need some hands-on experience? You can find instructions on how to install PyTorch and TensorFlow using Conda in this [tutorial](../tutorial/pytorch_tensorflow.md). + +### Installing Pip Packages Outside of Your Environments + +When installing packages within a `conda` environment using `pip`, it's crucial to install `pip` within that same `conda` environment and to use the copy of `pip` from that environment. If `pip` is used outside of `conda`, or within an environment that does not have `pip` installed, the packages are installed to `~/.local`. This can lead to unexpected package conflicts, as Python loads packages from `~/.local` before loading from `conda` environments, and `pip` shows messages like the following, + +```bash +Requirement already satisfied: numpy in /home/$USER/.local/lib/python3.11/site-packages (1.26.3) +``` + +In the above case, resolving the conflict involves deleting the `~/.local` directory. + +Here's an example of the correct procedure for installing `pip` packages within a `conda` environment: -For more information on usage with examples, see [Anaconda Environments](../../workflow_solutions/using_anaconda.md). Need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../tutorial/pytorch_tensorflow.md). +1. 
Load the `Miniforge3` module using `module load Miniforge3`. +1. Create or activate the desired `conda` environment. Please refer to the [`conda` documentation](../../workflow_solutions/using_conda.md#create-an-environment). +1. Install `pip` within the `conda` environment using `conda install pip` or `conda install python`. `pip` and `python` are packaged together; installing one will always install the other. +1. Use `pip` while this `conda` environment is active to install packages. Please refer to [Installing packages with `pip`](../../workflow_solutions/using_conda.md#installing-packages-with-pip). ### Obtaining the Latest CUDA and cuDNN Modules diff --git a/docs/cheaha/tutorial/index.md b/docs/cheaha/tutorial/index.md index 13b618af9..4e178d1b9 100644 --- a/docs/cheaha/tutorial/index.md +++ b/docs/cheaha/tutorial/index.md @@ -1,9 +1,16 @@ -# Getting Started With Using Anaconda on Cheaha +# Cheaha Tutorials -Python is a high level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called Anaconda. Anaconda is a widely used Python package manager for scientific research. Consequently Anaconda is used on Cheaha for managing environments and packages. +## Conda on Cheaha -Have you encountered problems while using Anaconda on Cheaha? We have provided this page to curate a number of walkthroughs on how you can address majority of the needs you may have or challenges you may experience using Anaconda on Cheaha. +Python is a high-level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called [`conda`](../../workflow_solutions/using_conda.md). `conda` is a widely used Python package manager for scientific research. Consequently, `conda` is used on Cheaha for managing environments and packages. 
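The `~/.local` shadowing issue described above can be checked for after the fact; this short sketch (echoed messages are illustrative, not from an official tool) looks for stray user-site packages:

```bash
# Look for stray user-site packages under ~/.local, which Python searches
# before the active conda environment and which can shadow its packages.
if ls "$HOME"/.local/lib/python*/site-packages/ >/dev/null 2>&1; then
    echo "packages found under ~/.local; these may shadow your conda environments"
else
    echo "no user-site packages found under ~/.local"
fi
```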
-Below is a list of Tutorials we currently have Using Anaconda on Cheaha; +Have you encountered problems while using `conda` on Cheaha? We have provided this page to curate a number of walkthroughs on how you can address the majority of the needs you may have or challenges you may experience using `conda` on [Cheaha](../getting_started.md). -1. [Using PyTorch and TensorFlow with Anaconda on Cheaha](../tutorial/pytorch_tensorflow.md) +1. [Using Conda to install and run PyTorch and TensorFlow](../tutorial/pytorch_tensorflow.md). +1. [Using Conda as part of parallel Slurm workflows](../slurm/slurm_tutorial.md). + +## Using Slurm + +[Slurm](../slurm/introduction.md) is the job scheduler used on [Cheaha](../getting_started.md) that manages which work runs on which resources. Jobs are created when researchers interact with Slurm to request resources on which to run their research software. + +1. [Parallel Slurm Workflows](../slurm/slurm_tutorial.md). diff --git a/docs/cheaha/tutorial/pytorch_tensorflow.md b/docs/cheaha/tutorial/pytorch_tensorflow.md index db1051e73..9588cd838 100644 --- a/docs/cheaha/tutorial/pytorch_tensorflow.md +++ b/docs/cheaha/tutorial/pytorch_tensorflow.md @@ -1,6 +1,6 @@ -# Anaconda Environment Tutorial for PyTorch and TensorFlow -The below tutorial would show you steps on how to create an Anaconda environment, activate, and install libraries/packages for machine and deep learning (PyTorch and Tensorflow) using an Anaconda environment on Cheaha. There are also steps on how to access the terminal, as well as using Jupyter Notebook's Graphical User Interface (GUI) to work with these Anaconda environments. 
There are detailed steps here to guide your creation of a [Jupyter Notebook job.](../open_ondemand/ood_layout.md#interactive-apps) +This tutorial will show you how to create and activate a `conda` environment, and how to install machine and deep learning libraries (PyTorch and TensorFlow) into that environment on Cheaha. There are also steps on how to access the terminal, as well as how to use Jupyter Notebook's Graphical User Interface (GUI) to work with `conda` environments. There are detailed steps here to guide your creation of a [Jupyter Notebook job.](../open_ondemand/ood_layout.md#interactive-apps) !!! note @@ -14,7 +14,7 @@ The below tutorial would show you steps on how to create an Anaconda environment Be mindful that there are special considerations when submitting GPU jobs to maximize performance. See [Making the Most of GPUs](../slurm/gpu.md#making-the-most-of-gpus) for more information. This is not necessary for the tutorial in this page, but may benefit your research computation. -## Installing Anaconda Environments Using the Terminal +## Installing Conda Environments Using the Terminal To access the terminal (shell), please do the following. @@ -50,7 +50,18 @@ Please see our [Training Resources page](../../education/training_resources.md#t There are two instances of PyTorch that can be installed, one requiring GPUs, and another utilizing only CPUs. The use of GPUs improves compute speeds and is preferred. For both instances of PyTorch, please follow these steps; -1. For a correct installation of pytorch using GPUs, we have to ensure some conditions are met (Request a partition that has GPUs, and then load the requisite `CUDA` module). You can see our [documentation](../hardware.md#details) for details on our Partition offerings. The command below shows how to load the CUDA toolkit in your environment setup form (see image below) when requesting for resources. +1. 
[Create](../../workflow_solutions/using_conda.md#create-an-environment) and [activate](../../workflow_solutions/using_conda.md#activate-an-environment) an environment, following the linked instructions. + +1. Access the terminal following the steps at our [Installing Conda Environments Using the Terminal section](#installing-conda-environments-using-the-terminal). + + +!!! note + + When installing packages, modules, and libraries into environments, remember to also install `ipykernel` using `conda install ipykernel`. This way your activated environment will appear in the list of kernels in your Jupyter Notebook. + + + +For a correct installation of PyTorch, we have to ensure some conditions are met. See the partition [docs](../hardware.md#details) for a guide. One such condition is loading the CUDA toolkit, using the command below in your environment setup form (see image below). ```bash module load CUDA/11.8.0 ``` @@ -66,16 +77,16 @@ module load CUDA/11.8.0 ![!nvidia-smi output](images/CudaVersion.png) -1. [Access the terminal](#installing-anaconda-environments-using-the-terminal). -1. [Create an Environment](../../workflow_solutions/using_anaconda.md#create-an-environment) and [Activate the Environment](../../workflow_solutions/using_anaconda.md#activate-an-environment). +1. [Access the terminal](#installing-conda-environments-using-the-terminal). +1. [Create an Environment](../../workflow_solutions/using_conda.md#create-an-environment) and [Activate the Environment](../../workflow_solutions/using_conda.md#activate-an-environment). !!! note When installing packages, modules, and libraries into environments, remember to also install `ipykernel` using `conda install ipykernel`. The `ipykernel` package must be installed for Jupyter to find it. 
- See our [Working With Anaconda Enironments section](../../cheaha/open_ondemand/ood_jupyter_notebook.md#working-with-anaconda-environments) for how to switch environments in [Jupyter Noteooks](../../cheaha/open_ondemand/ood_jupyter_notebook.md). + See our [Working With Conda Environments section](../../cheaha/open_ondemand/ood_jupyter_notebook.md#working-with-conda-environments) for how to switch environments in [Jupyter Notebooks](../../cheaha/open_ondemand/ood_jupyter_notebook.md). @@ -161,4 +172,4 @@ The image below shows an output that the TensorFlow library will utilize the ava The information (I) and warning (W) outputs notify you of the installed TensorFlow binary and how it will function. The I output informs you that the installed TensorFlow library will utilize your CPU for additional speed when GPUs are not the most efficient way to do processing for these operations. The W output tells you TensorRT is not available; please note TensorRT is not currently supported on our systems. -Now that you have completed the tutorial, you can find more Anaconda information here, [Using Anaconda page](../../workflow_solutions/using_anaconda.md#anaconda). +Now that you have completed the tutorial, you can find more `conda` information at our [Using `conda` page](../../workflow_solutions/using_conda.md#why-use-conda). diff --git a/docs/contributing/contributor_guide.md b/docs/contributing/contributor_guide.md index 10127f791..79aae0641 100644 --- a/docs/contributing/contributor_guide.md +++ b/docs/contributing/contributor_guide.md @@ -38,7 +38,7 @@ We are using Visual Studio Code (VSCode) for development with several extensions VSCode may be obtained from [Visual Studio Code](https://code.visualstudio.com/) and documentation is available at [VSCode: Docs](https://code.visualstudio.com/docs). 
The extensions should automatically show up as recommendations when opening the repo, or they can be downloaded using the VSCode Extensions menu (++ctrl+shift+x++ on Windows or ++command+shift+x++ on Mac). -We assume you have a `conda` distribution on your local machine. If you are affiliated with UAB, please install [Miniforge](https://conda-forge.org/miniforge/). For detailed installation instructions, see here: . For more information on using `conda`, see our [Anaconda page](../workflow_solutions/using_anaconda.md). +We assume you have a `conda` distribution on your local machine. If you are affiliated with UAB, please install [Miniforge](https://conda-forge.org/miniforge/) and _do not_ install Anaconda or Miniconda. For more information on why, please see our [Conda Migration FAQ](../workflow_solutions/conda_migration_faq.md#why-do-i-need-to-stop-using-anaconda). For detailed instructions on installing Miniforge, see here: . For more information on using `conda`, see our [`conda` page](../workflow_solutions/using_conda.md). ### Style Guide @@ -227,7 +227,7 @@ You'll need to add, remove or otherwise modify files as appropriate to implement ##### Verify Your Changes -1. [Activate](../workflow_solutions/using_anaconda.md#activate-an-environment) your conda environment. +1. [Activate](../workflow_solutions/using_conda.md#activate-an-environment) your conda environment. 1. Open the file `test.py` in the repository to start the Python extension. 1. Select the interpreter using 1. Open a VSCode terminal using ++ctrl+shift+grave++. @@ -284,7 +284,7 @@ We will do our best to check information for accuracy, as well as proofread the - Main headings are based on [UAB Research Computing services](https://www.uab.edu/it/home/research-computing/research-digital-marketplace) - Favor placing new pages and information into an existing navigation section over creating a new section. - Approach documentation from a problem solving angle rather than a technology. 
Examples: - - Section title "Installing Software Yourself with Anaconda" vs "Anaconda" + - Section title "Installing Software Yourself with `conda`" vs "`conda`" - Section title "Running Analysis Jobs" vs "Slurm" - Add redirects for any pages that move, in case someone has bookmarked a page, see [redirects](#redirects) - Images for a given page must be placed in a directory called `images/` at the same level as the page itself. diff --git a/docs/data_management/lts/iam_and_policies.md b/docs/data_management/lts/iam_and_policies.md index 7dcbf2c26..411787084 100644 --- a/docs/data_management/lts/iam_and_policies.md +++ b/docs/data_management/lts/iam_and_policies.md @@ -78,7 +78,7 @@ Owners can assign stewards either when requesting LTS allocation creation or at !!! important - Do not share your access and secret keys with anyone! See [here](#key-handling-and-ownership) for more information. + Do not share your access and secret keys with anyone! See [Key Handling and Ownership](#key-handling-and-ownership) for more information. There are multiple ways to share data with LTS: diff --git a/docs/data_management/lts/interfaces.md b/docs/data_management/lts/interfaces.md index a73ce1c5e..906860e28 100644 --- a/docs/data_management/lts/interfaces.md +++ b/docs/data_management/lts/interfaces.md @@ -35,16 +35,16 @@ While globus is the recommended tool for most data transfers, command line tools ### Installation of `s3cmd` and `s5cmd` on Cheaha -To install the tools on Cheaha, you can request a compute node through Cheaha's [Open OnDemand web portal](../../cheaha/open_ondemand/ood_layout.md#creating-an-interactive-job).Once your job is launched, open a terminal to execute the commands listed below. You do not need to install both tools if they aren't necessary. Both are available to install into [Anaconda](../../workflow_solutions/using_anaconda.md) environments. 
It's suggested to create a single environment named `s3` and install both s3cmd and s5cmd into it for easy access to both tools. Specific install and usage commands for each are given in their respective sections. You can create the general environment using the following commands: +To install the tools on Cheaha, you can request a compute node through Cheaha's [Open OnDemand web portal](../../cheaha/open_ondemand/ood_layout.md#creating-an-interactive-job). Once your job is launched, open a terminal to execute the commands listed below. You do not need to install both tools if they aren't necessary. Both are available to install into [`conda`](../../workflow_solutions/using_conda.md) environments. It's suggested to create a single environment named `s3` and install both s3cmd and s5cmd into it for easy access to both tools. Specific install and usage commands for each are given in their respective sections. You can create the general environment using the following commands: ``` bash -module load Anaconda3 +module load Miniforge3 conda create -n s3 -c conda-forge pip s5cmd conda activate s3 pip install s3cmd ``` -Please note that the instructions mentioned above are specific to the Cheaha system. To transfer data between your individual computer and LTS, you will need to install `s3cmd` or `s5cmd` on your machine. Please refer to this [section](#installation-of-s3cmd-and-s5cmd-on-individual-systems-without-anaconda) for installation instructions specific to your operating system. +Please note that the instructions mentioned above are specific to the Cheaha system. To transfer data between your individual computer and LTS, you will need to install `s3cmd` or `s5cmd` on your machine. Please refer to this [section](#installation-of-s3cmd-and-s5cmd-on-individual-systems-without-conda) for installation instructions specific to your operating system. !!! 
note @@ -54,7 +54,7 @@ Please note that the instructions mentioned above are specific to the Cheaha sys ### `s3cmd` -s3cmd is a tool used for managing buckets and objects in Amazon S3 (Simple Storage Service). s3cmd is our suggested tool for operations such as listing buckets, managing bucket permissions, synchronizing directories with s3 buckets, and for small periodic file transfers. If high-speed transfer of a large files is required, we recommend using [s5cmd](#s5cmd). See the [preceding section](#command-line) for instructions on how to install both it and s5cmd into an Anaconda environment. +s3cmd is a tool used for managing buckets and objects in Amazon S3 (Simple Storage Service). s3cmd is our suggested tool for operations such as listing buckets, managing bucket permissions, synchronizing directories with s3 buckets, and for small periodic file transfers. If high-speed transfer of large files is required, we recommend using [s5cmd](#s5cmd). See the [preceding section](#command-line) for instructions on how to install both it and s5cmd into a `conda` environment. #### Configuring `s3cmd` @@ -178,7 +178,7 @@ s3cmd info s3:// ### `s5cmd` -s5cmd is a parallel transfer tool suggested for period transfers of large and/or many files at a time. It has options for customizing how many processors are available for transferring data as well as how many chunks files can be broken into during transfer to minimize transfer time. See the [preceding section](#command-line) for instructions on how to install both it and s3cmd into an Anaconda environment +s5cmd is a parallel transfer tool suggested for periodic transfers of large and/or many files at a time. It has options for customizing how many processors are available for transferring data as well as how many chunks files can be broken into during transfer to minimize transfer time. See the [preceding section](#command-line) for instructions on how to install both it and s3cmd into a `conda` environment. 
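For reference, `s5cmd` reads AWS-style credential files; a minimal `~/.aws/credentials` sketch with placeholder values (substitute your own keys, and never share or commit real keys) might look like:

```ini
# ~/.aws/credentials -- placeholder values only
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
```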
#### Configuring `s5cmd` @@ -259,9 +259,9 @@ By default, `s5cmd` uses the `[default]` profile. To use a different profile, sp Replace `` with the profile name defined for your shared LTS allocation in your `~/.aws/credentials` file. -### Installation of `s3cmd` and `s5cmd` on Individual Systems Without Anaconda +### Installation of `s3cmd` and `s5cmd` on Individual Systems Without Conda -The installation instructions and software dependencies may differ depending on the operating system being used. Following are the installation instructions tested for different operating systems. You may also use [Anaconda](../../workflow_solutions/using_anaconda.md) to install either or both packages. +The installation instructions and software dependencies may differ depending on the operating system being used. Following are the installation instructions tested for different operating systems. You may also use [`conda`](../../workflow_solutions/using_conda.md) to install either or both packages. #### Ubuntu diff --git a/docs/data_management/lts/tutorial/individual_lts_tutorial.md b/docs/data_management/lts/tutorial/individual_lts_tutorial.md index 3bb2ce38b..010721ef4 100644 --- a/docs/data_management/lts/tutorial/individual_lts_tutorial.md +++ b/docs/data_management/lts/tutorial/individual_lts_tutorial.md @@ -15,13 +15,13 @@ You will also need an individual LTS allocation created by our team. If you beli ### Install `s3cmd` Within Conda Environment on Cheaha -To interact with LTS (Long-Term Storage) using [S3 (Simple Storage Service)](https://aws.amazon.com/s3/), you need the `s3cmd` tool installed.[`s3cmd`](https://s3tools.org/s3cmd) is a command-line tool for managing files in cloud storage systems like S3. 
It's recommended to install it using `pip`, the standard package installer for Python, which allows you to install packages from the [Python Package Index (PyPI)](https://pypi.org/), within a [Conda environment](../../../workflow_solutions/using_anaconda.md#create-an-environment) on Cheaha. +To interact with LTS (Long-Term Storage) using [S3 (Simple Storage Service)](https://aws.amazon.com/s3/), you need the `s3cmd` tool installed. [`s3cmd`](https://s3tools.org/s3cmd) is a command-line tool for managing files in cloud storage systems like S3. It's recommended to install it using `pip`, the standard package installer for Python, which allows you to install packages from the [Python Package Index (PyPI)](https://pypi.org/), within a [`conda` environment](../../../workflow_solutions/using_conda.md#create-an-environment) on Cheaha. Please avoid using `conda install s3cmd`, as that version will not work as expected. Instead, follow the steps below to install `s3cmd` using `pip` within your Conda environment. -First, access our interactive Open OnDemand (OOD) portal at and create a job on Cheaha using one of our interactive applications. For guidance, refer to our tutorial on [installing and setting Conda environment](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-anaconda-environments-using-the-terminal). +First, access our interactive Open OnDemand (OOD) portal at and create a job on Cheaha using one of our interactive applications. For guidance, refer to our tutorial on [installing and setting up a Conda environment](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal). -Once your interactive apps session is launched, open the terminal as described in [step 5 of the Anaconda tutorial page](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal) and run the below commands. 
+Once your interactive apps session is launched, open the terminal as described in [step 5 of the Conda tutorial page](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal) and run the below commands. ```bash module load Miniforge3 @@ -36,7 +36,7 @@ Once these steps are completed, verify the installation by running `pip list | g ### Install `s3cmd` on Your Local Systems -To install s3cmd on your local machine, please follow the instructions provided in [our s3cmd documentation for local installation](../../../data_management/lts/interfaces.md#installation-of-s3cmd-and-s5cmd-on-individual-systems-without-anaconda). +To install s3cmd on your local machine, please follow the instructions provided in [our s3cmd documentation for local installation](../../../data_management/lts/interfaces.md#installation-of-s3cmd-and-s5cmd-on-individual-systems-without-conda). ### Configuring `s3cmd` for LTS Buckets diff --git a/docs/education/case_studies.md b/docs/education/case_studies.md index ed19cbb77..d85b629a0 100644 --- a/docs/education/case_studies.md +++ b/docs/education/case_studies.md @@ -60,7 +60,7 @@ To install Parabricks using Singularity, load the `Singularity 3.x` module from module load Singularity/3.5.2-GCC-5.4.0-2.26 ``` -Go to the NGC catalog page and copy the image path to pull the desired containers of Parabricks using Singularity. Here, the generic container is pulled using Singularity. The image path is in “nvcr.io/nvidia/clara/clara-parabricks" and the tag is 4.2.0-1. The container image name `parabricks-4.2.0-1.sif` is an user-derived name. +Go to the NGC catalog page and copy the image path to pull the desired containers of Parabricks using Singularity. Here, the generic container is pulled using Singularity. The image path is "nvcr.io/nvidia/clara/clara-parabricks" and the tag is 4.2.0-1. The container image name `parabricks-4.2.0-1.sif` is a user-derived name. 
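Assembled from the image path and tag given above, the pull command would look like the following sketch (the `.sif` file name is the user-derived name mentioned in the text):

```bash
# Sketch: pull the Parabricks container from NGC using the image path and tag above.
# Assumes the Singularity module from the previous step is loaded.
singularity pull parabricks-4.2.0-1.sif docker://nvcr.io/nvidia/clara/clara-parabricks:4.2.0-1
```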
![!Parabricks container.](./images/parabricks_container.png) diff --git a/docs/grants/res/uab-rc-facilities.txt b/docs/grants/res/uab-rc-facilities.txt new file mode 100644 index 000000000..b304f8129 --- /dev/null +++ b/docs/grants/res/uab-rc-facilities.txt @@ -0,0 +1,100 @@ +UAB IT Research Computing (UABRC) Resources and Cybersecurity Facilities Document +Last Updated 2023-10-01 + + +INTRODUCTION + +UAB IT Research Computing (UABRC) maintains the Research Computing System (RCS), an integrated computing platform that provides access to enhanced compute, storage, and network capacity for UAB investigators and their collaborators. The RCS compute resources include a high-performance compute (HPC) cluster for large scale modeling and analysis workloads, an on-site cloud platform for highly customizable virtual machine (VM) based workloads, and a container orchestration platform for cloud-native workloads. The RCS storage resources include a high-speed GPFS file system attached to the HPC cluster to support data analysis workloads and large-capacity block and object storage to support the cloud and container workloads. The RCS networking infrastructure connects the computing and storage systems via 200Gbps Ethernet interconnect that provides ample capacity for data access and movement between the compute and storage resources. RCS networking also includes connectivity to the UAB Campus Network, a dedicated EDR/HDR Infiniband fabric for low-latency data exchange on the HPC cluster, and a 40Gbps ScienceDMZ for high-speed data transfers with national research and education networks. + +These RCS resources combine to provide a low-friction application hosting platform that enables research teams to build and deploy preferred tools without enforced refactoring to adopt applications to campus resources. The RCS resources are deployed spanning two data centers. The on-campus data center is in the recently constructed Technology Innovation Center (TIC). 
The off-campus data center is located at a nearby (less than 10km) commercial facility operated by DC BLOX, a regional data center provider offering a Tier III colocation facility in Birmingham with adequate power to support the high-density power requirements for expanding the compute capacity of the RCS and a resilient physical infrastructure designed to withstand natural disasters like tornados, which are common in the region. The commercial facility is connected to the campus data center via a dedicated, diverse fiber path lit with dual 100Gbps optics that leverages the University of Alabama System Regional Optical Network (UASRON). UABRC designs and maintains the RCS resources in open collaboration with the campus research community to ensure that the system addresses research needs and has the capacity to meet research demand. To support the growth represented by on-going research activities, major updates to the RCS platform are planned over 2023 and 2024 that include upgrades to the GPU resources on the HPC cluster, upgrades to the inter-data center network to dual 400Gbps optics, the addition of four 34kW rack leases at DC BLOX, and upgrades to the HPC compute resources and GPFS file system. + + + +HIGH-PERFORMANCE COMPUTING (Cheaha) RESOURCES + +Cheaha is a campus high-performance computing (HPC) resource dedicated to enhancing research computing productivity at UAB. Cheaha is managed by UAB IT Research Computing (UABRC) and is available to members of the UAB community in need of increased computational capacity. Cheaha supports both high-performance computing (HPC) and high-throughput computing (HTC) paradigms. Cheaha provides 10752 CPU cores, 112 GPUs, and 88 TB of memory across 159 compute nodes interconnected via an EDR/HDR InfiniBand network, providing over 1.1 PFLOP/s of aggregate theoretical peak performance. 
A high-performance, 7 PB GPFS parallel filesystem running on DDN SFA14KX hardware is connected to these HPC compute nodes via the InfiniBand fabric. + +Cheaha provides researchers with both a web-based interface, via open OnDemand, and a traditional command-line interactive environment, via SSH. These interfaces provide access to many scientific tools that can leverage a dedicated pool of on-premises compute resources via the SLURM batch scheduler. The on-premises compute pool provides access to four recent generations of hardware based on the x86 64-bit architecture. Gen7 (installed 2017) is composed of 18 nodes: 2x14 core (504 cores total) 2.4 GHz Intel Xeon E5-2680 v4 compute nodes with 256 GB RAM, four NVIDIA Tesla P100 16 GB GPUs per node, and an EDR InfiniBand interconnect. Gen8 (2019) is composed of 35 nodes with EDR InfiniBand interconnect: 2x12 core (840 cores total) 2.60 GHz Intel Xeon Gold 6126 compute nodes with 21 compute nodes at 192 GB RAM, 10 nodes at 768 GB RAM and 4 nodes at 1.5 TB RAM. Gen9 (Q2 2021) is composed of 52 nodes with EDR InfiniBand interconnect: 2x24 core (2496 cores total) 3.0 GHz Intel Xeon Gold 6248R compute nodes each with 578 GB RAM. Gen10 (Q4 2021) is composed of 34 nodes with EDR InfiniBand interconnect: 2x64 core (4352 cores total) 2.0 GHz AMD Epyc 7713 Milan compute nodes each with 512 GB RAM. Gen11 (Q4 2023) is composed of 20 nodes with EDR InfiniBand interconnect: 2x64 core (2560 cores total) 2.0 GHz AMD Epyc 7713 Milan compute nodes each with 512 GB RAM and 2 A100 80 GB GPUs. The compute nodes combine to provide over 1.1 PFLOP/s of dedicated computing power. In Q1 2024 the GPFS file system will be upgraded from version 4 to version 5 and migrated to a new hardware platform with expanded SSD capacity to improve data loading performance and enable expanded capacity through policy-based integration of object stores. 
+ + + +HIGH-PERFORMANCE COMPUTING (Cheaha) SOFTWARE TOOLS + +General research computing and scientific programming software are available on Cheaha, including conda, R and RStudio, and MATLAB through the Lmod environment module system. RStudio, MATLAB, Jupyter Notebook server, and JupyterLab are all available on our Open OnDemand web portal as interactive applications, along with a general-use desktop environment via noVNC, directly in the browser. Researchers can develop and share their own custom interactive applications through a sandbox application feature within Open OnDemand. + +The UAB Center for Clinical and Translational Science (CCTS) Informatics group has installed and supports a variety of bioinformatics tools that are available to be run from Cheaha. Standalone packages are available for quality control (FastQC, Picard Tools), alignment (ABySS, Velvet, BWA, Bowtie), visualization (IGV), variant calling (GATK, SnpEff, ANNOVAR), RNA-seq (Cufflinks, Cuffdiff, TopHat), and microbiome and metagenomic analysis (QIIME, HUMAnN, MEGAN). + +Additional scientific domain-specific software is also available, including RELION for cryo-electron microscopy analysis, AFNI for fMRI analysis, and ANSYS for simulations supporting research efforts of the UAB School of Engineering. Many other software packages are installed and maintained, and we encourage and facilitate researchers installing their own additional software using conda, R, and MATLAB package management where possible. + + + +ON-PREMISES CLOUD (cloud.rc) RESOURCES + +UABRC operates a production private cloud called cloud.rc based on OpenStack, which echoes the research workload support goals of the NSF’s Jetstream2 resource, part of the ACCESS network. The cloud.rc platform has been used to support application development and DevOps processes for research labs across campus and is increasingly being leveraged to support many aspects of research IT operations.
This fabric is composed of five Dell R640 compute nodes, each with 48 cores and 192 GB RAM, for 240 cores and 960 GB of standard cloud compute resources. In addition, the fabric features four NVIDIA DGX A100 nodes that include 8 A100 GPUs and 1 TB of RAM each. This resource pool will be expanded with additional capacity via a backfill of the oldest generation of HPC nodes (Gen7). These resources are available to the research community for provisioning on demand via the OpenStack services (Ussuri release). The production implementation further supports researchers by making their hosted services available beyond campus, while adhering to standard UAB Campus Network security practices. Scientific software developers have access to the full stack for limitless development opportunities, with a frictionless migration path to public cloud providers as needed for specific research projects. A Kubernetes environment was deployed in Q3 2022 to allow for development workflows using containers. The compute resources of the Kubernetes environment are a duplicate of the cloud resources. The OpenStack and Kubernetes resources are deployed via Canonical’s Charmed operations stack, enabling node migration between platforms in response to capacity demand. + + + +STORAGE RESOURCES + +The compute nodes on Cheaha are backed by high-performance, 7 PB GPFS storage on DDN SFA14KX hardware connected via an EDR/FDR InfiniBand fabric. Two additional storage systems are available to support research operations and application design. They are based on the Ceph storage technology and provide different hardware configurations to address different usage scenarios. The fabrics include a 6.9 PB archive storage fabric for long-term storage (LTS) built using 12 Dell DSS7500 nodes, an 11 PB nearline storage fabric built with 14 Dell R740xd2 nodes, and a 248 TB SSD cache storage fabric (Q3 2023) built with 8 Dell 840 nodes.
These storage systems provide block and object storage services to the OpenStack and Kubernetes platforms. Additionally, the object storage services are empowering research applications with cloud-native data management and availability capabilities. + + + +NETWORK RESOURCES + +The RCS networking infrastructure connects the on- and off-campus computing and storage systems via a 200Gbps Ethernet interconnect that provides capacity for data access and movement between the compute and storage resources. RCS networking also includes a dedicated EDR/HDR InfiniBand fabric for the HPC platform and a 40Gbps Science DMZ for high-speed data transfers with national research and education networks. The Science DMZ supports direct connection to campus and high-bandwidth regional networks via 40Gbps Globus Data Transfer Nodes (DTNs), providing the capability to connect data-intensive research facilities directly with the high-performance computing and storage services of the RCS. This network supports very high-speed, secure connectivity between its nodes for transfers of very large data sets without interfering with other traffic on the campus backbone, ensuring predictable latencies. The Science DMZ and its DTNs include perfSONAR measurement nodes and a Bro security node connected directly to the border router, providing a "friction-free" pathway to access external data repositories as well as computational resources. + +The UAB Campus Network backbone is based on a 40Gbps redundant Ethernet network with 480Gbps backplanes on the core L2/L3 switch/routers. For efficient management, a collapsed backbone design is used. Each campus building is connected using 10Gbps Ethernet links over single-mode optical fiber. Desktops are connected at 1Gbps. The campus wireless network blankets classrooms, common areas, and most academic office buildings.
UAB connects to the Internet2 high-speed research network via the University of Alabama System Regional Optical Network (UASRON), a University of Alabama System-owned and -operated DWDM network offering 100Gbps Ethernet to the Southern Light Rail (SLR)/Southern Crossroads (SoX) in Atlanta, GA. The UASRON also connects UAB to UA and UAH, the other two University of Alabama System institutions, and the Alabama Supercomputer Center. UAB is also connected to other universities and schools through the Alabama Research and Education Network (AREN). + + + +DATA MANAGEMENT AND TRANSFER RESOURCES + +A traditional POSIX filesystem is available on all Cheaha HPC nodes through GPFS for data requiring computational analysis, with separate data, scratch, and shared storage. Object storage is available via our Long-Term Storage (LTS) S3 interface. A REST endpoint is provided for LTS and exposed to the Internet to facilitate hosting research data products for external use. Block storage is available to support storage needs for our cloud and Kubernetes fabrics. + +All faculty, staff, and students who create a Research Computing account have immediate access to 5 TB of personal GPFS storage and may request an additional 5 TB of LTS storage. Research PI groups, core facilities, and other research groups at UAB may request up to 25 TB of GPFS storage and 75 TB of LTS storage for shared collaboration spaces. + +Globus High-Assurance (HA) endpoints are maintained on the RCS platform to facilitate internal and external data transfers. Connectors to our enterprise Box.com instance and our LTS S3 interface are made available as part of our Globus subscription. A controlled-access Science DMZ partition of our hardware is available to facilitate high-throughput, parallel batch data transfers over the available 40Gbps connection to the external Internet.
Standard data transfer software such as Rclone, the AWS CLI, and s3cmd, along with UAB-specific documentation and support, are provided to researchers to facilitate data transfers whenever Globus is infeasible. + + + +FACILITATION, OUTREACH, DOCUMENTATION, AND SUPPORT + +UABRC provides research computation facilitation services for researchers using RCS. These services exist to reduce friction for researchers seeking to scale workflows from desktop and workstation scale up to HPC scale. Facilitation services also include computational outreach efforts within UAB, including facilitating lesson design for courses making use of our platform, teaching a Data Science Journal Club course, providing how-to-use-HPC lessons at University events, and proactively identifying opportunities for education and efficiency improvements using our internal observability stack. + +Extensive documentation of computational capabilities, good practices for system use, references, and tutorials are all available on our documentation website, publicly available on the Internet. Links to common, standard external educational resources are provided and encouraged where applicable, including the Software Carpentry and Data Carpentry lessons, and GitHub and GitLab version control documentation. + +We provide coverage for support requests during standard business hours, and extended coverage for outages and security incidents. Tickets are tracked using ServiceNow software, and most requests are addressed within 1-2 business days, with faster responses for critical incidents. Support requests covered include software installation and update requests, new researcher training, hardware and software errors, data transfer and shared storage requests, and facilitation of collaborative, grant-funded research computation projects ranging from individuals to labs to core facilities.
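As an illustration of the standalone transfer tools named above, an Rclone remote for an S3-compatible endpoint such as LTS can be defined in a short configuration file. This is a minimal sketch: the remote name, endpoint URL, and credentials below are illustrative placeholders, not actual UAB values; `provider = Ceph` reflects the Ceph-based LTS fabric described earlier.

```ini
# ~/.config/rclone/rclone.conf -- hypothetical remote; all values are placeholders
[lts]
type = s3
provider = Ceph
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = https://example-lts-endpoint.uab.edu
```

With such a remote configured, a researcher could copy a results directory using `rclone copy ./results lts:my-bucket/results`.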
+ + + +RESEARCH COMPUTING PERSONNEL + +UAB IT Research Computing currently maintains a support staff of 13, led by the Assistant Vice President for Research Computing and including one HPC Architect-Manager, one Research Facilitation and Data Manager, four software developers, two research facilitation scientists, three system administrators, and a project coordinator. + + + +UAB CYBERSECURITY POLICIES AND PRACTICES + +UAB IT maintains a unified and comprehensive privacy and information security program that preserves and protects the confidentiality, availability, and integrity of all information assets, including patient, research, customer, and business data. The integrated security program upholds values and provides high standards of service, trust, confidentiality, and responsiveness to patients, customers, employees, and business associates. The IT Security program incorporates regulatory requirements into a practical approach to preserving and protecting these information assets. + +The UAB IT security program includes the following: + +- IT security policies designed to ensure a secure state of operations and information management. +- Technical security standards that document baseline security requirements for key technologies and platforms such as major operating systems, databases, network device operating systems, firewalls, web-server security, email, encryption, secure file transfer protocols, virus defense, media reuse, and media disposal. +- A comprehensive qualitative risk management program. +- A computer security incident response plan that is supported by cross-functional response and recovery teams. +- User system access that is tightly controlled and meets standards required by various regulations and accrediting agencies, including HIPAA, JCAHO, and CAP. Two-factor authentication is utilized for access to all shared systems.
Users must agree to maintain password confidentiality, log off terminals at the end of each user session, and alert management when security violations become known. We also routinely demonstrate compliance with the security requirements of Federal granting agencies such as the NIH and the VA, including FISMA. +- An institutional firewall for perimeter and layered protection. +- Network Intrusion Detection Systems (NIDS), strategically deployed to continuously monitor Internet, extranet, and internal communications. +- Centralized zero-day Microsoft patch update services for desktops and servers. +- A centralized, firewall-protected Microsoft Exchange email system with spam scoring and virus scanning. +- 168-bit 3DES-encrypted IPsec tunnels for business associate, staff remote access, or partner VPN connectivity. +- Capability to support encrypted secure file transfers. +- Virus protection agents and comprehensive patch management programs installed on all computer workstations and servers to protect against malware infections. +- PGP whole disk encryption software, required for all portable and high-risk devices. +- In-depth security training, provided for all faculty, staff, and students. + +UAB has an extensive infrastructure to secure HIPAA-defined Electronic Protected Health Information (ePHI) from its creation and throughout its lifecycle. Secure web portals are utilized to make the required information accessible only to those who need access. The existing wireless infrastructure and secure VLAN architecture make the required ePHI portable but secure. Portable devices do not cache data locally, and transmissions are encrypted. + +UAB applications are designed and developed using a comprehensive set of security standards.
Areas addressed within application security standards include password construction, strength, and control; browser technologies; authentication and access control; security administration; and logging, auditing, and monitoring. + +Internet applications mandate TLS encryption with strong cipher suites for the transmission of any sensitive data. Before going into production, all new Internet applications must be submitted for security testing. All identified security issues that could impact the confidentiality or integrity of our data must be corrected prior to production release. Applications are retested on a regular schedule that coincides with major release cycles. A comprehensive change management system is utilized for updates, production changes, quality control, and revision management. diff --git a/docs/help/support.md index 6c5bc0743..eb789a5d8 100644 --- a/docs/help/support.md +++ b/docs/help/support.md @@ -86,7 +86,7 @@ Please see our [Storage page](../data_management/index.md) for more information. ## How Do I Request New Software Installed? -Before making a request for new software on Cheaha, please try searching our [modules](../cheaha/software/modules.md) or searching for packages on [Anaconda](../workflow_solutions/using_anaconda.md). +Before making a request for new software on Cheaha, please try searching our [modules](../cheaha/software/modules.md) or searching for packages on [`conda`](../workflow_solutions/using_conda.md). If you are not able to find a suitable module or package and would like software installed on Cheaha, please [create a ticket](#how-do-i-create-a-support-ticket) with the name of the software, the version number, and a link to the installation instructions. diff --git a/docs/index.md index 4ef167124..edb0a41ce 100644 --- a/docs/index.md +++ b/docs/index.md @@ -5,9 +5,11 @@ The Research Computing System (RCS) provides a framework for sharing research da ## News -!!! announcement +!!! 
important - Research Computing will be performing a data migration for Cheaha beginning Saturday, October 11 and lasting through the end of the month. Please see our [migration overview](./news/posts/2025-10-07-migration-overview.md) for more information on what to expect during this time. + Anaconda has changed its [Terms of Service]. As a consequence, UAB employees are no longer allowed to use the Anaconda Distribution and Anaconda channels within the `conda` software. We are in the process of replacing Anaconda on Cheaha with Miniforge. + + Read more about how this may affect you at our [Conda Migration FAQ](workflow_solutions/conda_migration_faq.md). **Check our [News page](./news/index.md) for recent developments.** diff --git a/docs/uab_cloud/installing_software.md b/docs/uab_cloud/installing_software.md index 88091a44d..b61818427 100644 --- a/docs/uab_cloud/installing_software.md +++ b/docs/uab_cloud/installing_software.md @@ -42,7 +42,7 @@ Most common software packages and NVIDIA drivers are available as `apt` packages If the software is available via `apt` then use `sudo apt install `. An example would be `sudo apt install git` to install git software. -If the software uses a custom installer, then follow the instructions provided by the software's documentation. An example would be [Miniconda](#installing-miniconda), where a shell script is downloaded and then executed using `bash installer.sh`. +If the software uses a custom installer, then follow the instructions provided by the software's documentation. An example would be [Miniforge](#installing-conda-via-miniforge), where a shell script is downloaded and then executed using `bash installer.sh`. ### Installing Server Software @@ -137,13 +137,17 @@ Below are a few examples of installing certain common softwares that may be usef 1. Find the line with "recommended" and install the package on that line with `sudo apt install nvidia-driver-###` 1. 
Reboot the instance -#### Installing Miniconda +#### Installing Conda via Miniforge -Miniconda is a lightweight version of Anaconda. While Anaconda's base environment comes with Python, the Scipy stack, and other common packages pre-installed, Miniconda comes with no packages installed. This is an excellent alternative to the full Anaconda installation for environments where minimal space is available or where setup time is important. We recommend installing [Miniconda](https://docs.conda.io/en/latest/miniconda.html) on cloud.rc instances, as opposed to Anaconda, to conserve storage space. For more information on how to use Anaconda see the [Using Anaconda](../workflow_solutions/using_anaconda.md#using-anaconda). Need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). +Miniforge is a free and open-source (FOSS) alternative to Anaconda that uses the community-maintained `conda-forge` channel by default. If you are a UAB employee, do not use Anaconda or Miniconda. See our [Conda Migration FAQ](../workflow_solutions/conda_migration_faq.md) to understand why. + +For more information on how to use `conda` see the [Using `conda` page](../workflow_solutions/using_conda.md#using-conda). If you need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using `conda` in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). + +To install Miniforge in your [instance](tutorial/instances.md): 1. Run the commands in [Before Installing Software](#before-installing-software). -1. `wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh` -1. `bash Miniconda3-latest-Linux-x86_64.sh` +1. `wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh` +1. `bash Miniforge3-Linux-x86_64.sh` #### Installing Singularity @@ -207,9 +211,9 @@ To install, you will need the following pre-requisites. If you are unfamiliar wi 1. 
Run the commands in [Before Installing Software](#before-installing-software). 1. A [Cloud Instance](tutorial/instances.md) with attached [Floating IP](network_setup_basic.md#floating-ips). 1. A [Security Group](tutorial/security.md#creating-a-security-group) for the intended Jupyter Server port. For the purposes of this tutorial, the port will be set to `9999`. -1. [Miniconda installed](#installing-miniconda) on the instance. Miniconda is a lightweight version of Anaconda. +1. [`conda` installed](#installing-conda-via-miniforge) on the instance. -Once the prerequisites are complete, the following steps must be performed to install and setup Jupyter Notebook Server. It is highly recommended to build an [Anaconda Environment](../workflow_solutions/using_anaconda.md#create-an-environment) using a reproducible [Environment File](../workflow_solutions/using_anaconda.md#creating-an-environment-from-a-yaml-file). The steps below belong to the official Jupyter documentation available at . +Once the prerequisites are complete, the following steps must be performed to install and set up Jupyter Notebook Server. It is highly recommended to build a [`conda` Environment](../workflow_solutions/using_conda.md#create-an-environment) using a reproducible [Environment File](../workflow_solutions/using_conda.md#creating-an-environment-from-a-yaml-file). The steps below belong to the official Jupyter documentation available at . !!! warning @@ -217,16 +221,13 @@ Once the prerequisites are complete, the following steps must be performed to in Leaving your Jupyter Notebook Server unsecured may mean that other people on the UAB Campus Network are able to access your notebooks and other files stored on that cloud instance. -1. [Install](../workflow_solutions/using_anaconda.md#install-packages) Jupyter Notebook Server using [Miniconda](../workflow_solutions/using_anaconda.md). You will need the following packages. +1. 
[Install](../workflow_solutions/using_conda.md#install-packages) Jupyter Notebook Server using [`conda`](../workflow_solutions/using_conda.md). You will need the following packages. - - `conda-forge` channel - - `notebook` - - `nb_conda_kernels` - - [Optional] `jupyter_contrib_nbextensions` - - `anaconda` channel - - `ipykernel` for python users - - `r-irkernel` for R users - - [Optional] `pip` + - `notebook` + - `nb_conda_kernels` + - `ipykernel` for Python users + - `r-irkernel` for R users + - [Optional] `jupyter_contrib_nbextensions` 1. Because floating IPs are, by default, reachable by anyone on the UAB Campus Network, you'll need to secure the server using the steps below. 1. Generate a notebook config file using `jupyter notebook --generate-config`. [[official docs](https://jupyter-server.readthedocs.io/en/stable/operators/public-server.html#prerequisite-a-jupyter-server-configuration-file)] diff --git a/docs/workflow_solutions/conda_migration_faq.md new file mode 100644 index 000000000..3f23e11d8 --- /dev/null +++ b/docs/workflow_solutions/conda_migration_faq.md @@ -0,0 +1,83 @@ +# Conda Migration FAQ + +## Why Do I Need to Stop Using Anaconda? + +In April 2020, Anaconda changed from a free-for-everyone licensing model to a free-for-some licensing model. At that time, Anaconda was free to use by individuals for personal use, non-profit organizations of any size (including UAB), and for-profit organizations up to 200 employees. + +In March 2024, Anaconda further restricted its licensing model. Anaconda is now free to use only for individuals for personal use, organizations up to 200 employees, and non-profit educational organizations when used by instructors and students in a curriculum-based course. + +Use of Anaconda by UAB employees for research purposes violates the Anaconda Terms of Service. + +## What Counts as "Use of Anaconda"? 
+ +- Downloading and installing Anaconda Software Distributions, including `anaconda` and `miniconda`. +- Using the `defaults`, `anaconda`, and `r` channels for packages. +- Using Anaconda Navigator. + +Using the `conda` executable does not violate the terms of service, provided it is not used to access the channels listed above. + +## What Is Changing on Cheaha? + +We have installed Miniforge as a module. To use it, **run `module load Miniforge3`** wherever you would have used `module load Anaconda3`. At a future date, we plan to archive old `Anaconda3` modules and alias the most recent one to the `Miniforge3` module. When that has been completed, `module load Anaconda3` will emit a warning and then load the `Miniforge3` module instead. There will be ample notice as we roll out this change. + +## Do I Need to Learn Any New Technologies? + +No. However, to avoid violating the Anaconda Terms of Service, there are some actions you will need to take. + +## Does This Impact My UAB-Owned Laptop, Desktop, Workstation, or Server? + +Yes. If you are currently using Anaconda channels or any part of the Anaconda Distribution for work purposes as an employee of UAB, then that use is in violation of the Anaconda Terms of Service, regardless of the device or computer. + +To remedy this situation, you will need to transition from Anaconda to Miniforge on the affected machines. For UAB-managed machines, please contact your IT representatives for assistance with this process. + +## What Do I Need to Do to Avoid Violating the Terms of Service on Cheaha? + +- Replace `module load Anaconda3` with `module load Miniforge3` in your current projects. +- Remove `defaults`, `anaconda`, and `r` from your channel lists in environment YAML definition files. +- Stop using the `defaults`, `anaconda`, and `r` channels in `conda` commands. + - Avoid `-c defaults`, `-c anaconda`, and `-c r` as part of `conda install` commands. + - Avoid `conda install defaults::$package`, `... 
r::$package`. +- If you encounter any errors building environments, please contact support. + +## How Can I Migrate My Existing Environments? + +- Export existing environments using `conda env export --name $env_name > $env_name.yml` to produce a written record of the environment packages. +- Open the `$env_name.yml` file in a text editor. +- Using the text editor, remove the lines under `channels:` that read `- anaconda`, `- defaults`, and `- r`. +- Save the file. +- Reinstall the environment with Miniforge using the command `conda env create --file $env_name.yml`. + +If you encounter any errors, please contact support. + +## How Can I Install a New Environment From a File? + + +!!! danger + + Only install environments from files coming from sources you trust. + + +- Obtain a copy of the file from its original source. +- Open the `$env_name.yml` file in a text editor. +- Using the text editor, remove the lines under `channels:` that read `- anaconda`, `- defaults`, and `- r`. +- Save the file. +- Install the environment with Miniforge using the command `conda env create --file $env_name.yml`. + +If you encounter any errors, please contact support. + +## What Are Good Practices to Minimize Impacts in the Future? + +- Record your packages and versions in environment YAML files to make your environments reproducible. `` +- Store your environment YAML files in a git repository on GitHub or GitLab to make your environments shareable and collaborative. `` + +## What Do I Do if I Use Anaconda Navigator to Build Environments? + +At this time there does not appear to be a free-to-use alternative to Anaconda Navigator. You will need to use the terminal to create and manage environments. We have a tutorial and ample documentation covering this ``. If you would like further assistance, please contact support. + +## What Do All of the Terms Relating to Conda Mean? 
+ +- **Anaconda** - An ambiguous term that may refer to the company, its package distribution channels, or its software distribution. Sometimes used to reference the package management software `conda`, though this is not correct. +- **Anaconda Inc.** - The for-profit company that created the well-known ecosystem for scientific Python packages. Website: +- **Anaconda Distribution** - The system owned and maintained by Anaconda Inc. that distributes software packages through the `conda` software. +- **`anaconda` channel** - A channel for delivering packages owned and maintained by Anaconda Inc. that is subject to the Anaconda Terms of Service. +- **`conda`** - The software used to manage environments and install packages from channels, including Anaconda-owned channels and community channels such as `conda-forge`. diff --git a/docs/workflow_solutions/creating_sandbox_apps.md index 95c575f71..ef4574bf4 100644 --- a/docs/workflow_solutions/creating_sandbox_apps.md +++ b/docs/workflow_solutions/creating_sandbox_apps.md @@ -23,7 +23,7 @@ Create a dev folder in your `$USER/ondemand` folder by following these steps: 1. Create an HPC Desktop Interactive Job on Cheaha. [Please see our detailed guide](../cheaha/open_ondemand/ood_layout.md#creating-an-interactive-job) for details on how to do this. -1. Access the terminal within your HPC Desktop Job. [Please see our guide on how to access the terminal](../cheaha/tutorial/pytorch_tensorflow.md#installing-anaconda-environments-using-the-terminal) for details on how to do this. +1. Access the terminal within your HPC Desktop Job. [Please see our guide on how to access the terminal](../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal) for details on how to do this. 
![Terminal button from Interactive sessions page on Cheaha](images/cheaha_sandbox_shell_button.png) diff --git a/docs/workflow_solutions/getting_containers.md b/docs/workflow_solutions/getting_containers.md index 6ecc2f98a..f757b256d 100644 --- a/docs/workflow_solutions/getting_containers.md +++ b/docs/workflow_solutions/getting_containers.md @@ -214,7 +214,7 @@ We require numpy, scipy, and matplotlib libraries to execute the above Python sc You may specify the required version from the `Tag` list. Here the tag/version is `4.12.0`. Also its a very good practice to specify the version of packages for numpy, scipy, and matplotlib for better reproducibility. !!! note "Containers and Reproducibiliy" - Always include version numbers for Anaconda, package managers, software you are installing, and the dependencies for those software. Containers are not by nature scientifically reproducible, but if you include versions for as much software in the container as possible, they can be reproducible years later. + Always include version numbers for `conda`, package managers, software you are installing, and the dependencies for those software. Containers are not inherently scientifically reproducible, but they can be made reproducible for years if you include versions for as much software in the container as possible. 1. To build your container, change the directory to `miniconda` and use the below syntax to build the `Dockerfile`. Here we use `.` to say "current directory." This will only work if you are in the directory with the `Dockerfile`. diff --git a/docs/workflow_solutions/r_environments.md b/docs/workflow_solutions/r_environments.md index 29a1a87c3..e676dfb59 100644 --- a/docs/workflow_solutions/r_environments.md +++ b/docs/workflow_solutions/r_environments.md @@ -1,6 +1,6 @@ # R Projects and Environments -When working on multiple projects, it's likely that different sets of external analysis packages and their dependencies will be needed for each project. 
Managing these different projects is simple in something like [Anaconda](using_anaconda.md) by creating a different virtual environment for each project, but this functionality is not fully built into RStudio by default. +When working on multiple projects, it's likely that different sets of external analysis packages and their dependencies will be needed for each project. Managing these different projects is simple in something like [`conda`](using_conda.md) by creating a different virtual environment for each project, but this functionality is not fully built into RStudio by default. Instead, we suggest to take advantage of [R Projects](https://support.posit.co/hc/en-us/articles/200526207-Using-RStudio-Projects) and the [renv](https://rstudio.github.io/renv/articles/renv.html) package to keep environments separate for each project you start. diff --git a/docs/workflow_solutions/shell.md b/docs/workflow_solutions/shell.md index c8fcfae2c..47ad177ad 100644 --- a/docs/workflow_solutions/shell.md +++ b/docs/workflow_solutions/shell.md @@ -21,7 +21,7 @@ The internet has thousands of guides for using the shell. Rather than devise our There are also additional resources that aid in learning and verifying shell commands and scripts: - [Explain Shell](https://explainshell.com/): An educational tool providing detailed explanations of individual commands in relatively reasonably-plain English. This tool doesn't explain what a command does at a high level nor its purpose or intent, only the details of the parts making up the command. -- [ShellCheck](https://www.shellcheck.net/): An online tool for conducting static analysis checks on shell scripts. The Git repository for this tool can be found at and it can also be installed via [Anaconda]( https://anaconda.org/conda-forge/shellcheck). +- [ShellCheck](https://www.shellcheck.net/): An online tool for conducting static analysis checks on shell scripts. 
The Git repository for ShellCheck can be found at [GitHub](https://github.com/koalaman/shellcheck) and it can also be installed via [Conda](https://anaconda.org/conda-forge/shellcheck). At the shell prompt, you can also use the command `curl cheat.sh/<command>` to get a simple-to-understand explanation of what the command does and how to use it (see [curl](#download-files-from-internet-sources-curl)). Below is an example for the [pwd command](#show-working-directory-pwd). diff --git a/docs/workflow_solutions/using_anaconda.md b/docs/workflow_solutions/using_anaconda.md deleted file mode 100644 index b62c2ac18..000000000 --- a/docs/workflow_solutions/using_anaconda.md +++ /dev/null @@ -1,408 +0,0 @@ -# Anaconda - -Python is a high-level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called Anaconda. Anaconda is the standard in Python package management for scientific research. - -Benefits of Anaconda: - -- Shareability: environments can be shared via human-readable text-based YAML files. -- Maintainability: the same YAML files can be version controlled using git. -- Repeatability: environments can be rebuilt using those same YAML files. -- Simplicity: dependency matrices are computed and solved by Anaconda, and libraries are pre-built and stored on remote servers for download instead of being built on your local machine. -- Ubiquity: nearly all Python developers are aware of the usage of Anaconda, especially in scientific research, so there are many resources available for learning how to use it, and what to do if something goes wrong. - -Anaconda can also install Pip and record which Pip packages are installed, so Anaconda can do everything Pip can, and more. - - -!!!
important - - If using Anaconda on Cheaha, please see our [Anaconda on Cheaha page](../cheaha/software/software.md#anaconda-on-cheaha) for important details and restrictions. - - -## Using Anaconda - -Anaconda is a package manager, meaning it handles all of the difficult mathematics and logistics of figuring out exactly what versions of which packages should be downloaded to meet your needs, or inform you if there is a conflict. - -Anaconda is structured around environments. Environments are self-contained collections of researcher-selected packages. Environments can be changed out using a simple command without requiring tedious installing and uninstalling of packages or software, and avoiding dependency conflicts with each other. Environments allow researchers to work and collaborate on multiple projects, each with different requirements, all on the same computer. Environments can be installed from the command line, from pre-designed or shared YAML files, and can be modified or updated as needed. - -The following subsections detail some of the more common commands and use cases for Anaconda usage. More complete information on this process can be found at the [Anaconda documentation](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html). Need some hands-on experience? You can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). - - -!!! important - - If using Anaconda on Cheaha, please see our [Anaconda on Cheaha page](../cheaha/software/software.md#anaconda-on-cheaha) for important details and restrictions. - - -### Create an Environment - -In order to create a basic environment with the default packages, use the `conda create` command: - -```bash -# create a base environment.
Replace <env> with an environment name -conda create -n <env> -``` - -If you are trying to replicate a pipeline or analysis from another person, you can also recreate an environment using a YAML file, if they have provided one. To replicate an environment using a YAML file, use: - -```bash -# replicate an environment from a YAML file named env.yml -conda create -n <env> -f env.yml -``` - -By default, all of your conda environments are stored in `/home/<blazerid>/.conda/envs`. - -### Activate an Environment - -From here, you can activate the environment using either `source` or `conda`: - -```bash -# activate the virtual environment using source -source activate <env> - -# or using conda -conda activate <env> -``` - -To know your environment has loaded, the command line should look like: - -```text -(<env>) [BlazerID@c0XXX ~]$ -``` - -Once the environment is activated, you are able to install whichever Python libraries you need for your analysis. - -### Install Packages - -To install packages using Anaconda, use the `conda install` command. The `-c` or `--channel` option can be used to select a specific package channel to install from. The `anaconda` channel is a curated collection of high-quality packages, but the very latest versions may not be available on this channel. The `conda-forge` channel is more open, less carefully curated, and has more recent versions. - -```bash -# install most recent version of a package -conda install <package> - -# install a specific version -conda install <package>=version - -# install from a specific conda channel -conda install -c <channel> <package>=version -``` - -Generally, if a package needs to be downloaded from a specific conda channel, it will mention that in its installation instructions. - -#### Installing Packages With Pip - -Some packages are not available through Anaconda. Often these packages are available via [PyPI](https://pypi.org/) and can thus be installed using the Python built-in Pip package manager. Pip may also be used to install locally-available packages. - - -!!!
important - Make sure `pip` is installed within the `conda` environment, and only run pip within `conda` environments, to prevent hard-to-diagnose issues. - - See our [Pip Installs Packages Outside of Environment section](../cheaha/open_ondemand/ood_jupyter_notebook.md#pip-installs-packages-outside-of-environment) for more details and how to fix the issue. - - -```bash -# install most recent version of a package -pip install <package> - -# install a specific version, note the double equals sign -pip install <package>==version - -# install a list of packages from a text file -pip install -r packages.txt -``` - - -!!! important - - If you see an output message like "Requirement already satisfied: $package in /home/$USER/.local/lib/python3.xx/site-packages", then `pip` is installing packages outside of your environment. - - To fix the issue, please see our [Pip Installs Packages Outside of Environment section](../cheaha/open_ondemand/ood_jupyter_notebook.md#pip-installs-packages-outside-of-environment). - - -#### Finding Packages - -You may use the [Anaconda page](https://anaconda.org/) to search for packages on Anaconda, or use Google with something like `<package name> conda`. To find packages in PyPI, either use the [PyPI page](https://pypi.org/) to search, or use Google with something like `<package name> pip`. - -#### Jupyter Package Management - -For more information about using Anaconda with Jupyter Notebooks and JupyterLab, see the [Working with Anaconda Environments section](../cheaha/open_ondemand/ood_jupyter_notebook.md#working-with-anaconda-environments). - -#### CUDA and cuDNN Package for GPU Usage - -For more information about finding CUDA and cuDNN packages for use with GPUs, see the [CUDA and cuDNN Modules section](../cheaha/slurm/gpu.md#cuda-and-cudnn-modules). - -#### Performance Considerations for GPUs - -See our [Making the Most of GPUs section](../cheaha/slurm/gpu.md#making-the-most-of-gpus) for more information about maximizing the performance of GPUs on Cheaha.
- -### Update Package in an Environment - -To ensure packages and their dependencies are all up to date, it is a best practice to regularly update installed packages and libraries in your activated environment. - -```bash - -conda update --all - -``` - -### Deactivating an Environment - -An environment can be deactivated using the following command. - -```bash -# Using conda -conda deactivate -``` - -Anaconda may say that using `source deactivate` is deprecated, but the environment will still be deactivated. - -Closing the terminal will also close out the environment. - -### Deleting an Environment - -To delete an environment, use the following command. Remember to replace `<env>` with the existing environment name. - -```bash - -conda env remove --name <env> - -``` - -### Working With Environment YAML Files - -#### Exporting an Environment - -To easily share environments with other researchers or replicate them on a new machine, it is useful to create an environment YAML file. You can do this using: - -```bash -# activate the environment if it is not active already -conda activate <env> - -# export the environment to a YAML file -conda env export > env.yml -``` - -#### Creating an Environment From a YAML File - -To create an environment from a YAML file `env.yml`, use the following command. - -```bash -conda env create --file env.yml -``` - -#### Sharing Your Environment File - -To share your environment for collaboration, there are primarily 3 ways to export environments. The commands below show how to create environment files that can be shared for replication. Remember to replace `<env>` with the existing environment name. - -1. Cross-Platform Compatible - - ```bash - - conda env export --from-history > <env>.yml - - ``` - -1. Platform + Package Specific - - Create a .yml file to share, replacing `<env>` (the name of your environment) and `<filename>` (the name of the file you want to export) with your preferred names. - - ```bash - - conda env export > <filename>.yml - - ``` - -1.
Platform + Package + Channel Specific - - ```bash - - conda list --explicit > <filename>.txt - # OR - conda list --explicit > <filename>.yml - - ``` - -#### Replicability Versus Portability - -An environment with only `python 3.10.4`, `numpy 1.21.5` and `jinja2 2.11.2` installed will output something like the following file when `conda env export` is used. This file may be used to precisely replicate the environment as it exists on the machine where `conda env export` was run. Note that the versioning for each package contains two `=` signs. The code like `he774522_0` after the second `=` sign contains hyper-specific build information for the compiled libraries for that package. Sharing this exact file with collaborators may result in frustration if they do not have the exact same operating system and hardware as you, as they would not be able to build this environment. We would say that this environment file is not very portable. - -There are other portability issues: - -- The `prefix: C:\...` line is not used by `conda` in any way and is deprecated. It also shares system information about file locations, which is potentially sensitive information. -- The `channels:` group uses `- defaults`, which may vary depending on how you or your collaborator has customized their Anaconda installation. It may result in packages not being found, resulting in environment creation failure.
- -```yaml -name: test-env -channels: - - defaults -dependencies: - - blas=1.0=mkl - - bzip2=1.0.8=he774522_0 - - ca-certificates=2022.4.26=haa95532_0 - - certifi=2021.5.30=py310haa95532_0 - - intel-openmp=2021.4.0=haa95532_3556 - - jinja2=2.11.2=pyhd3eb1b0_0 - - libffi=3.4.2=h604cdb4_1 - - markupsafe=2.1.1=py310h2bbff1b_0 - - mkl=2021.4.0=haa95532_640 - - mkl-service=2.4.0=py310h2bbff1b_0 - - mkl_fft=1.3.1=py310ha0764ea_0 - - mkl_random=1.2.2=py310h4ed8f06_0 - - numpy=1.21.5=py310h6d2d95c_2 - - numpy-base=1.21.5=py310h206c741_2 - - openssl=1.1.1o=h2bbff1b_0 - - pip=21.2.4=py310haa95532_0 - - python=3.10.4=hbb2ffb3_0 - - setuptools=61.2.0=py310haa95532_0 - - six=1.16.0=pyhd3eb1b0_1 - - sqlite=3.38.3=h2bbff1b_0 - - tk=8.6.11=h2bbff1b_1 - - tzdata=2022a=hda174b7_0 - - vc=14.2=h21ff451_1 - - vs2015_runtime=14.27.29016=h5e58377_2 - - wheel=0.37.1=pyhd3eb1b0_0 - - wincertstore=0.2=py310haa95532_2 - - xz=5.2.5=h8cc25b3_1 - - zlib=1.2.12=h8cc25b3_2 -prefix: C:\Users\user\Anaconda3\envs\test-env -``` - -To make this a more portable file, suitable for collaboration, some planning is required. Instead of using `conda env export` we can build our own file. Create a new file called `env.yml` using your favorite text editor and add the following. Note we've only listed exactly the packages we installed, and their version numbers, only. This allows Anaconda the flexibility to choose dependencies which do not conflict and do not contain unusable hyper-specific library build information. - -```yaml -name: test-env -channels: - - anaconda -dependencies: - - jinja2=2.11.2 - - numpy=1.21.5 - - python=3.10.4 -``` - -This is a much more readable and portable file suitable for sharing with collaborators. We aren't quite finished though! Some scientific packages on the `conda-forge` channel, and on other channels, can contain dependency errors. Those packages may accidentally pull a version of a dependency that breaks their code. 
- -For example, the package `markupsafe` made a not-backward-compatible change (a breaking change) to their code between `2.0.1` and `2.1.1`. Dependent packages expected `2.1.1` to be backward compatible, so their packages allowed `2.1.1` as a substitute for `2.0.1`. Since Anaconda chooses the most recent version allowable, package installs broke. To work around this for our environment, we would need to modify the environment to "pin" that package at a specific version, even though we didn't explicitly install it. - -```yaml -name: test-env -channels: - - anaconda -dependencies: - - jinja2=2.11.2 - - markupsafe=2.0.1 - - numpy=1.21.5 - - python=3.10.4 -``` - -Now we can be sure that the correct versions of the software will be installed on our collaborators' machines. - - -!!! note - - The example above is provided only for illustration purposes. The error has since been fixed, but it really happened and helps explain version pinning. - - -#### Good Practice for Finding Software Packages on Anaconda - -Finding Anaconda software packages involves searching through the available “Channels” and repositories to locate the specific packages that contain functions that you need for your environment. Channels are Anaconda's way of organizing packages. Channels tell Anaconda where to look for packages during installation. A few Anaconda Channels house the majority of the packages used in scientific research: Anaconda, Conda-Forge, and BioConda. Other Channels also exist. If you want more information on Anaconda Channels, please see their [docs](https://www.anaconda.com/docs/main). - -In the sections below, you will see information on how to find key packages you intend to use, ensure the packages are up-to-date, figure out the best way to install them, and finally compose an environment file for portability and replicability.
- -##### Step-by-Step Guide to Finding Anaconda Software Packages - -If we find the package at one of the Channel sources mentioned above, we can check the Platform version to ensure it is either "noarch" (if available) or linux. After noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on Anaconda or PyPI. If so, the package is being maintained on Anaconda/PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI). We prefer searching using the following methods, and usually have the most success in the order listed below. - -- Using Google: You may already be familiar with the exact Anaconda package name you require. In the event this is not the case, a simple web engine search with key words usually finds the package. For example, a web search for an Anaconda package would be something along the lines of “Anaconda package for `Generic Topic Name`”. Your search results should return popular package names related to the topic you have searched for. The sections below provide a detailed step-by-step guide on how to find Anaconda packages using “numpy” as an example. - -- Anaconda Cloud: Anaconda Cloud is the primary source for finding Anaconda packages. You can visit [Anaconda Cloud](https://anaconda.org/) and use the search bar to find the package you need. For example, once you have the package name from your web search (here, numpy), enter the name of the package in the search bar as shown below. - -![!Landing page of anaconda.org showing search](images/anaconda_search.png) - -Review the results of your search; it is advised to use “Artifacts” that are compatible with the platform you are working with and that have the highest “Favorites” and “Downloads” numbers.
Click on the portion that contains the name of the package (highlight 3 in the image below). Highlight 1 shows the Artifact, Favorite, and Downloads numbers, and highlight 2 shows the Channel where this package is stored. - -![!Anaconda.org page showing download statistics](images/anaconda_channel_package.png) - -Follow the installation instructions you see in the image below. - -![!Anaconda.org page showing package installation instructions](images/install_anaconda_package.png) - -- Using the Conda Search Command: You can use the `conda search <package>` command directly in your terminal to find packages. Replace `<package>` with the package you would like to search for. To do this on Cheaha, make sure to `module load Anaconda3` first, and follow the instructions to [activate](#activate-an-environment) an environment. Then do `conda search numpy`. You should get a long list of numpy packages. Review this output, but take note of the highlighted portions in the image. The section with a red selection shows the numpy versions that are available; the section with a blue selection shows the channel where each numpy version is stored. - -![!Search output from using conda search in Terminal](images/channel_conda_search.png) - -You can then install numpy with a specific version and from a specific channel with: - -```bash - conda install -c conda-forge numpy=2.0.0rc2 -``` - -- Using Specific Channels: You can also get packages using the specific Anaconda Channels listed below. - - - Anaconda Main Channel: The default channel provided by Anaconda, Inc. Visit [Anaconda](https://anaconda.org). - - - Conda-Forge: A community-driven channel with a wide variety of packages. Visit [Conda-Forge](https://conda-forge.org/). - - - Bioconda: A channel specifically for bioinformatics packages.
Visit [Bioconda](https://bioconda.github.io/) - -You can specify a channel in your search, and it will show you a list of the packages available in that channel, using `conda search -c <channel> <package>`. Remember to replace `<channel>` and `<package>` with the channel and package names you are searching for, respectively. - -```bash - conda search -c conda-forge numpy -``` - -If we find the package at one of these sources, we check the Platform version to ensure it is either noarch (if available) or linux for it to work on Cheaha ("noarch" is usually preferred for the sake of portability). Noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on Anaconda or PyPI. If so, the package is being maintained on Anaconda/PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI). - -![!Github page for numpy, an Anaconda package](images/github_conda_releases.png) - -If we don't find a package using Google, or the Anaconda/PyPI pages are out of date, then it may become very hard to use the software in an Anaconda environment. It is possible to try installing a git repository using pip, but care must be taken to choose the right commit or tag. You can find more [info here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). To search for a git repository try: - -1. github "name". -1. gitlab "name". - -Remember to replace "name" with the name of the Anaconda package. - - -!!! note - -There are issues with out-of-date software. It may have bugs that have since been fixed and so makes for less reproducible science. Documentation may be harder to find if it isn't also matched to the software version. Examining the README.md file for instructions may provide some good information on installing the package.
You can also reach out to us for [support](../help/support.md) in installing a package. - - -When we have a complete list of Anaconda packages and Channels, we can create an environment from scratch with all the dependencies included. For Anaconda packages, add one line under `dependencies:` for each package. For PyPI packages, add `- pip:` under `dependencies:`. Then, under `- pip:`, add `<package>==<version>` to pin the version; see below. The advantage to using an environment file is that it can be stored with your project in GitHub or GitLab, giving it all the benefits of [version control](./git_collaboration.md). - -```yaml -name: test-env -dependencies: - - anaconda::matplotlib=3.8.4 # Pinned version from anaconda channel - - conda-forge::python=3.10.4 # Pinned version from conda-forge channel - - pip - - pip: - - numpy==1.26.4 # Pinned version for pip - - git+https://github.com/user/repo.git # Example of installing from a Git repo - - http://insert_package_link_here # For URL links -``` - - For git repos, add them under `- pip:`. See the section [Replicability versus Portability](#replicability-versus-portability) for more information. - -The above configuration is only for illustration purposes, to show how channels and dependencies can be used. It is best to install all of your packages from conda channels, to avoid your environment breaking. Only packages that are unavailable via conda should be installed via pip. If you run into challenges, please [contact us](../index.md#how-to-contact-us). - -##### Key Things to Remember - -1. Exploring Package Documentation: For each package, check the documentation to understand its features, version history, and compatibility. Documentation can often be found on the Anaconda Cloud package page under the "Documentation" or "Homepage" link shared above in this tutorial. - -1. Regularly consider updating your environment file to manage dependencies and maintain compatible software environments.
Newer software also tends to resolve older bugs, consequently improving the state of science. - -1. Verify Package Version and Maintenance: Ensure you are getting the latest version of the package that is compatible with your environment. Verify that the package is actively maintained by checking the source repository (e.g., GitHub, GitLab). Look for recent commits, releases, and issue resolutions. The concepts of version pinning and semantic versioning explain this in detail. - -##### Version Pinning - -Version pinning in Anaconda environments involves specifying exact versions of packages to ensure consistency and compatibility. This practice is crucial for reproducibility, as it allows environments to be reproduced exactly, a critical component in research and collaborative projects. Version pinning also aids stability by preventing unexpected changes that could break your environment, code, or analysis. This practice also maintains compatibility between different packages that rely on specific dependencies. To implement version pinning, you can create a YAML file that lists the exact versions of all installed packages or specify versions directly when [creating](#create-an-environment) or updating environments using Conda commands. - -##### Semantic Versioning - -[Semantic versioning](https://semver.org) is a versioning scheme using a three-part format (MAJOR.MINOR.PATCH) to convey the significance of changes in a software package. In Anaconda environments, it plays a role in managing compatibility, version pinning, dependency resolution, and updating packages. The MAJOR version indicates incompatible API changes, i.e., the same software package, but operation and interaction are mostly different from what you are accustomed to in the previous version. The MINOR version adds backward-compatible functionality, i.e., the same software package, but now containing new features and functionality. Operations and interactions are still mostly the same.
The PATCH version includes backward-compatible bug fixes, i.e., the same major and minor versions with only a slight change, perhaps a bug fix or some other small change; still the same features, operations, and interactions. Using semantic versioning helps maintain consistency and compatibility by ensuring that updates within the same major version are compatible, and by allowing precise control when specifying package versions. - -In practice, updating a Major version of a package may break your workflow, but may increase software reliability and stability and fix bugs affecting your science. Changing the major version may also introduce new bugs; these and other concerns are some of the tradeoffs that have to be taken into consideration. Semantic versioning helps with managing Anaconda environments by facilitating precise [version pinning](#version-pinning) and dependency resolution. For instance, you can pin specific versions using Conda commands or specify version ranges to ensure compatibility as shown in the examples above. Semantic versioning also informs upgrade strategies, letting us know when to upgrade packages based on the potential impact of changes. By leveraging semantic versioning, you can maintain stable and consistent environments, which is essential for smooth research workflows. - -#### Good Software Development Practice - -Building on the example above, we can bring in good software development practices to ensure we don't lose track of how our environment is changing as we develop our software or our workflows. If you've ever lost a lot of hard work by accidentally deleting an important file, or forgetting what changes you've made that need to be rolled back, this section is for you. - -Efficient software developers live the mantra "Don't repeat yourself". Part of not repeating yourself is keeping a detailed and meticulous record of changes made as your software grows over time.
[Git](git_collaboration.md) is a way to have the computer keep track of those changes digitally. Git can be used to save changes to environment files as they change over time. Remember to commit the output of [Exporting your Environment](#exporting-an-environment) to a repository for your project each time your environment changes. diff --git a/docs/workflow_solutions/using_conda.md b/docs/workflow_solutions/using_conda.md new file mode 100644 index 000000000..8df781e2f --- /dev/null +++ b/docs/workflow_solutions/using_conda.md @@ -0,0 +1,419 @@ +# Why Use Conda? + + +!!! important + + Recent changes to the Anaconda Terms of Service have required all UAB researchers to change how they use `conda`. See our [Conda Migration FAQ](conda_migration_faq.md) for more information. + + +Python is a high-level programming language that is widely used in many branches of science. As a result, many scientific software packages have been developed in Python, leading to the development of a package manager called `conda`. `conda` is the most popular and widely supported Python package management software for scientific research. + +Benefits of `conda`: + +- Shareability: environments can be shared via human-readable, text-based YAML files. +- Maintainability: the same YAML files can be version controlled using git. +- Repeatability: environments can be rebuilt using those same YAML files. Libraries are pre-built and stored on remote servers for download instead of being built on your local machine or on Cheaha, so two computers with the same operating system, requesting the same package version, will end up using the same executable. +- Simplicity: dependency matrices are computed and solved by `conda`. +- Ubiquity: nearly all Python developers are aware of the usage of `conda`, especially in scientific research, so there are many resources available for learning how to use it, and what to do if something goes wrong.
+ +`conda` can also install `pip` and record which `pip` packages are installed, so `conda` can do everything `pip` can, and more. + + +!!! important + + If using `conda` on Cheaha, please see our [`conda` on Cheaha page](../cheaha/software/software.md#conda-on-cheaha) for important details and restrictions. + + +## Important Terms + +- **`conda`**: Refers to the executable software program that researchers interact with to create and manage environments and install packages. +- **Conda**: Refers to a software distribution containing `conda` and related software and features. +- **package**: Research-related software installed and managed by `conda`, held in environments. Packages are selected from channels and downloaded from remote data servers. +- **environment**: A collection of packages that `conda` can manage. Users can switch between environments to allow for development of multiple projects that have different requirements. +- **YAML file**: A structured, human-friendly file defining a single environment. Sharing the file with others allows for replication of an environment. These files enhance collaboration when added to your project's [version control](../workflow_solutions/git.md), especially when shared on [GitHub or GitLab](../workflow_solutions/git_collaboration.md). YAML stands for [YAML Ain't Markup Language](https://yaml.org/). +- **channel**: A listing of packages available for download. + - The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. + - The `conda-forge` and `bioconda` channels are free to use. +- **version**: A string of numbers and dots `.` denoting the version of a package. Often these are structured like `1.2.3` and most of the time follow [Semantic Versioning](https://semver.org/) conventions, but not always. Larger numbers indicate more recent versions. Some are structured using dates instead like `2024.08`, with more recent dates indicating more recent versions.
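To tie these terms together, here is a minimal, illustrative environment YAML file. The environment name, channel, and package versions below are examples only, not recommendations:

```yaml
name: example-env          # environment: the name other conda commands refer to
channels:
  - conda-forge            # channel: free to use, unlike the anaconda channel
dependencies:
  - python=3.10            # package pinned to a version
  - numpy=1.26.4
```

A file like this can be version controlled alongside your project and used to replicate the environment on another machine.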
+ + +!!! note + + We use CAPITAL LETTERS to denote where you will need to replace text with your own values, such as `ENVIRONMENT`, `PACKAGE`, `CHANNEL`, and `VERSION`. + + CAPITAL LETTERS prefixed by a dollar sign `$` are shell variables and do not need to be substituted. + + +## Using Conda + +`conda` is a package manager, meaning it handles all of the difficult mathematics and logistics of figuring out exactly what versions of which packages should be downloaded to meet your needs, or inform you if there is a conflict. + +`conda` is structured around environments. Environments are self-contained collections of researcher-selected packages. Environments can be changed out using a simple command without requiring tedious installing and uninstalling of packages or software, and avoiding dependency conflicts with each other. Environments allow researchers to work and collaborate on multiple projects, each with different requirements, all on the same computer. Environments can be installed from the command line, from pre-designed or shared YAML files, and can be modified or updated as needed. + +The following subsections detail some of the more common commands and use cases for `conda` usage. More complete information on this process can be found at the [`conda` documentation](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html). Need some hands-on experience? You can find instructions on how to install PyTorch and TensorFlow using `conda` in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). + + +!!! important + + If using `conda` on Cheaha, please see our [`conda` on Cheaha page](../cheaha/software/software.md#conda-on-cheaha) for important details and restrictions. + + +### Create an Environment + +In order to create an empty environment you can install packages into, use the `conda env create` command. + +```bash +# Create an empty environment.
+conda env create --name ENVIRONMENT +``` + +If you are trying to replicate a pipeline or analysis from another person, you can also recreate an environment using a YAML file, if one was provided. + +```bash +# Replicate an environment from a YAML file named environment.yml. +conda env create --file environment.yml +``` + +On Cheaha all of your conda environments are stored in `/home/$USER/.conda/envs`, by default. + +### Activate an Environment + +From here, you can activate the environment using the `conda activate` command. + +```bash +conda activate ENVIRONMENT +``` + +When your environment has loaded, your terminal prompt should change to look similar to the following. + +```text +(ENVIRONMENT) [BlazerID@c0000 ~]$ +``` + +Once the environment is activated, you are able to install any python libraries needed for your analysis. + +### Install Packages + +To install packages using `conda`, use the `conda install` command. + + +!!! important + + The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use. + + +```bash +# Install from default channels. NOT recommended! +conda install PACKAGE # most recent version possible +conda install PACKAGE=VERSION # specified version + +# Install from a specified channel. Recommended! +conda install CHANNEL::PACKAGE # most recent version possible +conda install CHANNEL::PACKAGE=VERSION # specified version +``` + +#### Installing Packages With Pip + +When building a `conda` environment, prefer to get all of your packages through `conda` channels to maximize compatibility. Some packages are not available through `conda` channels. Often these packages are available via [PyPI](https://pypi.org/) and may be installed using the Pip package manager. Pip may also be used to install locally-available packages, and directly from GitHub and GitLab repositories. + + +!!! 
important
+
+    When using `conda` and `pip` on Cheaha, make sure you are using a custom `conda` environment and that `pip` is installed in it before installing `pip` packages, to prevent severe [`pip` related issues](../cheaha/software/software.md#installing-pip-packages-outside-of-your-environments).
+
+!!! warning
+
+    There are some hard-to-diagnose errors that occur when installing packages using `pip` on Windows. The errors occur for Python versions between `3.10.4` and `3.10.8`, and may impact others in the `3.10` series. To maximize shareability, it is recommended to avoid those versions of Python, if possible. The issue does not appear to occur with Python `3.10.14`.
+
+```bash
+# Install packages using pip.
+pip install PACKAGE # most recent version possible
+pip install PACKAGE==VERSION # specified version, note the `==`
+pip install -r packages.txt # multiple packages from a list in a text file
+```
+
+#### Finding Packages
+
+!!! important
+
+    The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use.
+
+To find packages available on `conda` channels, use a search engine like Google. Start by searching for `PACKAGE conda-forge`. Replace `PACKAGE` with the name of the package. You might also try `bioconda` instead of `conda-forge`. If the package has a name shared with non-software products or ideas, you may need to add `software` or `research`, or both, to the end of your search string. You can also search on [anaconda.org](https://anaconda.org/), but be sure the package you find is not from a channel subject to the Anaconda Terms of Service.
+
+For packages in PyPI, repeat the process above but use `pypi` in place of `conda-forge` in the search string, or search directly on [PyPI](https://pypi.org/).
+
+#### Packages for Jupyter
+
+For more information about using `conda` with Jupyter, see the section [Working with `conda` Environments](../cheaha/open_ondemand/ood_jupyter_notebook.md#working-with-conda-environments).
+
+### Update Packages in an Environment
+
+In research, there is a balance to be struck between keeping software up-to-date and ensuring replicability of outputs. Updating software regularly ensures you have the most recent bug fixes and the highest level of security. Not updating software means you can be sure the software will behave consistently across all of your data.
+
+When coming up with a software analysis strategy, carefully consider the following questions.
+
+- What parts of my workflow can be done all at once after experiments are done?
+- What parts of my workflow must be done as data is acquired?
+- What are the specific benefits of updating a software package?
+    - Fixing a bug that causes incorrect output?
+    - Major security holes patched?
+- What are the costs of updating?
+    - Will I have to re-run some or all of my analyses?
+    - Will I have to update other parts of my workflow code?
+- Will I have to update other packages, and what will those impacts be?
+- Does a particular update change outputs? Why did the output change?
+
+To perform an update on the currently [activated](#activate-an-environment) environment, use the `conda update` command.
+
+```bash
+conda update PACKAGE # updates to the most recent version possible
+conda install PACKAGE=VERSION # switches (upgrades or downgrades) to a specific version
+conda update --all # updates all packages to the most recent versions possible
+```
+
+### Deactivating an Environment
+
+An environment can be deactivated using the following command.
+
+```bash
+conda deactivate
+```
+
+Closing the terminal will also close out the environment.
+
+### Deleting an Environment
+
+To delete an environment, use the following command.
+
+```bash
+conda env remove --name ENVIRONMENT
+```
+
+### Working With Environment YAML Files
+
+#### Exporting an Environment
+
+To easily share environments with other researchers or replicate them on a new machine, it is useful to create an environment YAML file.
+
+```bash
+# activate the environment if it is not active already
+conda activate ENVIRONMENT
+
+# export the environment to a YAML file
+conda env export > environment.yml
+```
+
+#### Creating an Environment From a YAML File
+
+To create an environment from a YAML file `environment.yml`, use the following command.
+
+```bash
+conda env create --file environment.yml
+```
+
+#### Sharing Your Environment File
+
+To share your environment for collaboration, there are three ways to export it.
+
+```bash
+# Cross-platform compatible.
+conda env export --name ENVIRONMENT --from-history > environment.yml
+
+# Platform and package specific.
+conda env export --name ENVIRONMENT > environment.yml
+
+# Platform and package and channel specific.
+conda list --name ENVIRONMENT --explicit > environment.yml
+```
+
+#### Replicability Versus Portability
+
+An environment with only `python 3.10.4`, `numpy 1.21.5` and `jinja2 2.11.2` installed will output something like the following file when `conda env export` is used. This file may be used to precisely replicate the environment as it exists on the machine where `conda env export` was run. Note that the versioning for each package contains two `=` signs. The code like `he774522_0` after the second `=` sign contains hyper-specific build information for the compiled libraries for that package. Sharing this exact file with collaborators may result in frustration if they do not have the exact same operating system and hardware as you, as they would not be able to build this environment. We would say that this environment file is not very portable.
+
+There are other portability issues:
+
+- The `prefix: C:\...` line is not used by `conda` in any way and is deprecated. It also shares system information about file locations, which is potentially sensitive.
+- The `channels:` group uses `- defaults`, which may vary depending on how you or your collaborator have customized your `conda` installations. It may result in packages not being found, causing environment creation to fail.
+
+```yaml
+name: test-env
+channels:
+  - defaults
+dependencies:
+  - blas=1.0=mkl
+  - bzip2=1.0.8=he774522_0
+  - ca-certificates=2022.4.26=haa95532_0
+  - certifi=2021.5.30=py310haa95532_0
+  - intel-openmp=2021.4.0=haa95532_3556
+  - jinja2=2.11.2=pyhd3eb1b0_0
+  - libffi=3.4.2=h604cdb4_1
+  - markupsafe=2.1.1=py310h2bbff1b_0
+  - mkl=2021.4.0=haa95532_640
+  - mkl-service=2.4.0=py310h2bbff1b_0
+  - mkl_fft=1.3.1=py310ha0764ea_0
+  - mkl_random=1.2.2=py310h4ed8f06_0
+  - numpy=1.21.5=py310h6d2d95c_2
+  - numpy-base=1.21.5=py310h206c741_2
+  - openssl=1.1.1o=h2bbff1b_0
+  - pip=21.2.4=py310haa95532_0
+  - python=3.10.4=hbb2ffb3_0
+  - setuptools=61.2.0=py310haa95532_0
+  - six=1.16.0=pyhd3eb1b0_1
+  - sqlite=3.38.3=h2bbff1b_0
+  - tk=8.6.11=h2bbff1b_1
+  - tzdata=2022a=hda174b7_0
+  - vc=14.2=h21ff451_1
+  - vs2015_runtime=14.27.29016=h5e58377_2
+  - wheel=0.37.1=pyhd3eb1b0_0
+  - wincertstore=0.2=py310haa95532_2
+  - xz=5.2.5=h8cc25b3_1
+  - zlib=1.2.12=h8cc25b3_2
+prefix: C:\Users\user\miniforge3\envs\test-env
+```
+
+To make this a more portable file, suitable for collaboration, some planning is required. Instead of using `conda env export` we can build our own file. Create a new file called `environment.yml` using your favorite text editor and add the following. Note we've listed only the packages we explicitly installed, with their version numbers. This allows `conda` the flexibility to choose dependencies which do not conflict and do not contain unusable hyper-specific library build information.
+
+```yaml
+name: test-env
+channels:
+  - anaconda
+dependencies:
+  - jinja2=2.11.2
+  - numpy=1.21.5
+  - python=3.10.4
+```
+
+This is a much more readable and portable file suitable for sharing with collaborators. We aren't quite finished though! Some scientific packages on the `conda-forge` channel, and on other channels, can contain dependency errors. Those packages may accidentally pull a version of a dependency that breaks their code.
+
+For example, the package `markupsafe` made a not-backward-compatible change (a breaking change) to their code between `2.0.1` and `2.1.1`. Dependent packages expected `2.1.1` to be backward compatible, so their packages allowed `2.1.1` as a substitute for `2.0.1`. Since `conda` chooses the most recent version allowable, package installs broke. To work around this for our environment, we would need to modify the environment to "pin" that package at a specific version, even though we didn't explicitly install it.
+
+```yaml
+name: test-env
+channels:
+  - anaconda
+dependencies:
+  - jinja2=2.11.2
+  - markupsafe=2.0.1
+  - numpy=1.21.5
+  - python=3.10.4
+```
+
+Now we can be sure that the correct versions of the software will be installed on our collaborators' machines.
+
+It is important to be aware that by generalizing the YAML file in this way, the results you and your collaborator each generate may be different. This could be due to the previously-mentioned differences in hardware and operating system. If precise replication is required, more effort may be needed, such as using [Containers](getting_containers.md#create-your-own-docker-container) to ensure a consistent operating system environment.
+
+!!! note
+
+    The example above is provided only for illustration purposes. The error has since been fixed, but the example above really happened and is helpful for explaining version pinning.
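If you do want to start from a full `conda env export`, one small portability improvement is to drop the machine-specific `prefix:` line. This is a minimal sketch; it uses a hand-written sample file standing in for real `conda env export` output, so it runs without `conda` installed.

```bash
# Write a sample exported environment file (stands in for the
# output of `conda env export`).
cat > environment.yml <<'EOF'
name: test-env
channels:
  - conda-forge
dependencies:
  - python=3.10.4
prefix: C:\Users\user\miniforge3\envs\test-env
EOF

# Copy the file, dropping the machine-specific `prefix:` line.
grep -v '^prefix:' environment.yml > environment.portable.yml
```

The resulting `environment.portable.yml` still pins hyper-specific builds when produced from a real export, so hand-writing the dependency list as described above remains the better route to a truly portable file.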
+
+#### Good Practice for Finding Software Packages on Conda
+
+Finding `conda` software packages involves searching through the available channels and repositories to locate the specific packages that contain the functionality you need for your environment. Channels tell `conda` where to look for packages at installation time. In the sections below, you will see information on how to locate packages important for your work, ensure the packages are up-to-date, figure out the best way to install them, and finally compose an environment file for portability and replicability.
+
+##### Step-by-Step Guide to Finding Conda Software Packages
+
+We prefer searching using the following methods, and usually have the most success in the order listed below.
+
+- Using Google: You may already be familiar with the exact `conda` package name you require. If not, a simple web search with key words usually finds the package. For example, a web search for a `conda` package would be something along the lines of "conda package for `Generic Topic Name`". Your search results should return popular package names related to the topic you have searched for. The sections below give a step-by-step guide to finding `conda` packages, using "numpy" as an example.
+
+- Anaconda Cloud: Anaconda Cloud is the primary source for finding `conda` packages. You can visit [Anaconda Cloud](https://anaconda.org/) and use the search bar to find the package you need. For example, once you have the package name from your web search (here, numpy), enter it in the search bar as shown below.
+
+![!Landing page of anaconda.org showing search](images/anaconda_search.png)
+
+When reviewing your search results, it is advised to use "Artifacts" that are compatible with the platform you are working with, and that have the highest "Favorites" and "Downloads" numbers. Click on the portion that contains the name of the package (highlight 3 in the image below). Highlight 1 marks the Artifact, Favorite, and Download numbers, and highlight 2 marks the channel where this package is stored.
+
+![!Anaconda.org page showing download statistics](images/anaconda_channel_package.png)
+
+Follow the installation instructions you see in the image below.
+
+![!Anaconda.org page showing package installation instructions](images/install_anaconda_package.png)
+
+- Using the `conda` Search Command: You can use the `conda search PACKAGE` command directly in your terminal to find packages. Replace `PACKAGE` with the package you would like to search for. To do this on Cheaha, make sure to `module load Anaconda3` first, and follow the instructions to [activate](#activate-an-environment) an environment. Then run `conda search numpy`. You should get a long list of numpy packages. Review this output, taking note of the highlighted portions in the image below. The red selection shows the numpy versions that are available, and the blue selection shows the channel where each numpy version is stored.
+
+![!Search output from using conda search in Terminal](images/channel_conda_search.png)
+
+You can then install numpy with a specific version and from a specific channel with the following command.
+
+```bash
+conda install -c conda-forge numpy=2.0.0rc2
+```
+
+- Using Specific Channels: You can also get packages from the specific `conda` channels listed below.
+
+    - Anaconda Main channel: The default channel provided by Anaconda, Inc. Visit [Anaconda](https://anaconda.org)
+
+    - Conda-Forge: A community-driven channel with a wide variety of packages. Visit [Conda-Forge](https://conda-forge.org/)
+
+    - Bioconda: A channel specifically for bioinformatics packages. Visit [Bioconda](https://bioconda.github.io/)
+
+You can specify a channel in your search, and it will show you a list of the packages available in that channel, using `conda search -c CHANNEL PACKAGE`. Remember to replace `CHANNEL` and `PACKAGE` with the channel and package names you are searching for, respectively.
+
+```bash
+conda search -c conda-forge numpy
+```
+
+If we find the package at one of these sources, we check the Platform version to ensure it is either noarch (if available) or linux for it to work on Cheaha ("noarch" is usually preferred for the sake of portability). Noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on `conda-forge` or PyPI. If so, the package is being maintained on `conda-forge` or PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI).
+
+![!Github page for numpy, a `conda` package](images/github_conda_releases.png)
+
+If we don't find a package using Google, or the `conda-forge` and PyPI pages are out of date, then it may become very hard to use the software in a `conda` environment. It is possible to try installing from a git repository using pip, but care must be taken to choose the right commit or tag. You can find more [info here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). To search for a git repository try:
+
+1. 
github "name".
+1. gitlab "name".
+
+Remember to replace "name" with the name of the `conda` package.
+
+!!! note
+
+    There are issues with out-of-date software. It may have bugs that have since been fixed, and so makes for less reproducible science. Documentation may be harder to find if it isn't matched to the software version. Examining the README.md file for instructions may provide some good information on installing the package. You can also reach out to us for [support](../help/support.md) in installing a package.
+
+When we have a complete list of `conda` packages and channels, then we can create an environment from scratch with all the dependencies included. For `conda` packages, add one line under `dependencies:` for each piece of software. For PyPI packages, add `- pip` under `dependencies:`, then list each package under a `- pip:` entry, using `==` to pin the version, as shown below. The advantage of using an environment file is that it can be stored with your project in GitHub or GitLab, giving it all the benefits of [version control](./git_collaboration.md).
+
+```yaml
+name: test-env
+dependencies:
+  - anaconda::matplotlib=3.8.4 # Pinned version from anaconda channel
+  - conda-forge::python=3.10.4 # Pinned version from conda-forge channel
+  - pip
+  - pip:
+      - numpy==1.26.4 # Pinned version for pip
+      - git+https://github.com/user/repo.git # Example of installing from a Git repo
+      - http://insert_package_link_here # For URL links
+```
+
+Add git repos under `- pip:` as shown in the [official documentation examples](https://pip.pypa.io/en/stable/cli/pip_install/#examples). See the section [Replicability Versus Portability](#replicability-versus-portability) for more information.
+
+The above configuration is only for illustration purposes, to show how channels and dependencies can be used. It is best to install all of your packages from `conda` channels, to avoid your environment breaking. Only packages that are unavailable via `conda` should be installed via pip.
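Pip pins can also live in a plain requirements-style text file kept alongside your project, separate from the environment YAML file. A small sketch follows; the package names and versions are placeholders.

```bash
# Record exact pip pins in a requirements-style file, one
# PACKAGE==VERSION entry per line.
cat > packages.txt <<'EOF'
numpy==1.26.4
jinja2==2.11.2
EOF

# In real use, the file is then consumed with:
#   pip install -r packages.txt
```

Installing from the file with `pip install -r packages.txt` then installs exactly those versions every time.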
If you run into challenges, please [contact us](../index.md#how-to-contact-us).
+
+##### Key Things to Remember
+
+1. Explore Package Documentation: For each package, check the documentation to understand its features, version history, and compatibility. Documentation can often be found on the Anaconda Cloud package page under the "Documentation" or "Homepage" link shared above in this tutorial.
+
+1. Update Regularly: Regularly consider updating your environment file to manage dependencies and maintain compatible software environments. Newer software also tends to resolve older bugs, improving the state of science.
+
+1. Verify Package Version and Maintenance: Ensure you are getting the latest version of the package that is compatible with your environment. Verify that the package is actively maintained by checking the source repository (e.g., GitHub, GitLab). Look for recent commits, releases, and issue resolutions. The concepts of version pinning and semantic versioning, below, explain this in detail.
+
+##### Version Pinning
+
+Version pinning in `conda` environments involves specifying exact versions of packages to ensure consistency and compatibility. This practice is crucial for reproducibility, as it allows environments to be reproduced exactly, a critical component in research and collaborative projects. Version pinning also aids stability by preventing unexpected changes that could break your environment, code, or analysis. This practice also maintains compatibility between different packages that rely on specific dependencies. To implement version pinning, you can create a YAML file that lists the exact versions of all installed packages, or specify versions directly when [creating](#create-an-environment) or updating environments using `conda` commands.
+
+##### Semantic Versioning
+
+[Semantic versioning](https://semver.org) is a versioning scheme using a three-part format (MAJOR.MINOR.PATCH) to convey the significance of changes in a software package.
In `conda` environments, it plays a role in managing compatibility, version pinning, dependency resolution, and updating packages. The MAJOR version indicates incompatible API changes: the software is the same package, but its operation and interaction differ substantially from the previous version. The MINOR version adds backward-compatible functionality: the package gains new features, but existing operations and interactions stay mostly the same. The PATCH version includes backward-compatible bug fixes: the same features, operations, and interactions, with only small corrections. Using semantic versioning helps maintain consistency and compatibility by ensuring that updates within the same major version are compatible, and by allowing precise control when specifying package versions.
+
+In practice, updating the MAJOR version of a package may break your workflow, but may also increase software reliability and stability and fix bugs affecting your science. Changing the major version may also introduce new bugs; these concerns are among the tradeoffs that have to be taken into consideration. Semantic versioning helps with managing `conda` environments by facilitating precise [version pinning](#version-pinning) and dependency resolution. For instance, you can pin specific versions using `conda` commands, or specify version ranges to ensure compatibility, as shown in the examples above. Semantic versioning also informs upgrade strategies, letting us know when to upgrade packages based on the potential impact of changes. By leveraging semantic versioning, you can maintain stable and consistent environments, which is essential for smooth research workflows.
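The MAJOR.MINOR.PATCH convention can be illustrated with a few lines of portable shell; the version string here is made up.

```bash
# Split a semantic version string into MAJOR, MINOR, and PATCH
# using plain parameter expansion.
version="1.21.5"
major=${version%%.*}   # everything before the first dot
rest=${version#*.}     # everything after the first dot
minor=${rest%%.*}      # before the remaining dot
patch=${rest#*.}       # after the remaining dot

echo "MAJOR=$major MINOR=$minor PATCH=$patch"
# prints: MAJOR=1 MINOR=21 PATCH=5
```

Comparing two versions part by part, MAJOR first, tells you whether an upgrade is likely to be breaking (MAJOR changed), additive (MINOR changed), or a safe fix (PATCH changed).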
+
+#### Good Software Development Practice
+
+Building on the example above, we can bring in good software development practices to ensure we don't lose track of how our environment is changing as we develop our software or our workflows. If you've ever lost a lot of hard work by accidentally deleting an important file, or forgotten what changes you've made that need to be rolled back, this section is for you.
+
+Efficient software developers live the mantra "Don't repeat yourself". Part of not repeating yourself is keeping a detailed and meticulous record of changes made as your software grows over time. [Git](git_collaboration.md) is a way to have the computer keep track of those changes digitally. Git can be used to save changes to environment files as they change over time. Remember, each time your environment changes, to commit the output of [Exporting your Environment](#exporting-an-environment) to your project's repository.
diff --git a/mkdocs.yml b/mkdocs.yml
index 0f83ff2eb..eb20eb23b 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -60,8 +60,8 @@ plugins: # order matters!
       scripts:
         - scripts/pandoc_generator.py
   - git-revision-date-localized:
-      type: date
       strict: false
+      type: date
   - glightbox
   - table-reader:
       data_path: docs
@@ -90,7 +90,7 @@ plugins: # order matters!
data_management/lts/sharing.md: data_management/lts/iam_and_policies.md data_management/cheaha_storage_gpfs/project_directory_organization.md: data_management/cheaha_storage_gpfs/project_directories.md#project-directory-organization data_management/storage.md: data_management/index.md - environment_management/anaconda_environments.md: workflow_solutions/using_anaconda.md + environment_management/anaconda_environments.md: workflow_solutions/using_conda.md environment_management/containers.md: workflow_solutions/getting_containers.md environment_management/git.md: workflow_solutions/git_collaboration.md uab_cloud/cloud_remote_access.md: uab_cloud/remote_access.md @@ -102,6 +102,7 @@ plugins: # order matters! welcome/rc_days.md: education/research_computing_days/index.md welcome/welcome.md: index.md workflow_solutions/getting_software_with_git.md: workflow_solutions/git_collaboration.md + workflow_solutions/using_anaconda.md: workflow_solutions/using_conda.md exclude_docs: | /**/res/*.md @@ -157,7 +158,8 @@ nav: - Research Data Responsibilities: data_management/research_data_responsibilities.md - Workflow Solutions: - Using the Shell: workflow_solutions/shell.md - - Using Anaconda: workflow_solutions/using_anaconda.md + - Using Conda: workflow_solutions/using_conda.md + - Conda Migration FAQ: workflow_solutions/conda_migration_faq.md - Using Workflow Managers: workflow_solutions/using_workflow_managers.md - Using Git: workflow_solutions/git.md - R Projects and Environments: workflow_solutions/r_environments.md