diff --git a/docs/cheaha/getting_started.md b/docs/cheaha/getting_started.md index 084282dc5..04f6b97b8 100644 --- a/docs/cheaha/getting_started.md +++ b/docs/cheaha/getting_started.md @@ -164,13 +164,13 @@ Slurm is our job queueing software used for submitting any number of job scripts A large variety of software is available on Cheaha as modules. To view and use these modules see [the following documentation](./software/modules.md). -For new software installation, please try searching [Anaconda](../workflow_solutions/using_anaconda.md) for packages first. If you still need help, please [send a support ticket](../help/support.md) +For new software installation, please try searching for [`conda` packages](../cheaha/tutorial/conda_good_practice.md#step-by-step-guide-to-finding-conda-software-packages) first. If you still need help, please [send a support ticket](../help/support.md) ### Conda Packages -A significant amount of open-source software is distributed as Anaconda or Python libraries. These libraries can be installed by the user without permission from Research Computing using Anaconda environments. To read more about using Anaconda virtual environments see our [Anaconda page](./software/software.md#anaconda-on-cheaha). +A significant amount of open-source research software is distributed as `conda` packages or Python libraries. These libraries can be installed by the user using `conda` environments. To read more about using `conda` environments see our [`conda` page](./software/software.md#conda-on-cheaha). -If the software installation instructions tell you to use either `conda install` or `pip install` commands, the software and its dependencies can be installed using a virtual environment. +If the software installation instructions tell you to use either `conda install` or `pip install` commands, the software and its dependencies can be installed using a `conda` environment. 
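The paragraph above can be sketched as a typical `conda`-based install. This is a minimal sketch for Cheaha; the environment name `myenv`, the Python version, and the package names are placeholders, not recommendations:

```shell
# Run on a compute node, not the login node.
module load Miniforge3

# Create and activate a named environment (name and version are placeholders).
conda create --name myenv python=3.11
conda activate myenv

# Install packages following the software's own instructions.
conda install numpy
pip install some-package   # only after activating the environment
```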
## How to Get Help diff --git a/docs/cheaha/open_ondemand/ood_jupyter.md b/docs/cheaha/open_ondemand/ood_jupyter.md index 987dc3ee7..84fd4bfe3 100644 --- a/docs/cheaha/open_ondemand/ood_jupyter.md +++ b/docs/cheaha/open_ondemand/ood_jupyter.md @@ -1,6 +1,6 @@ # Jupyter Apps -Jupyter Notebooks and Jupyter Lab are both available as standalone apps in OOD. Jupyter is commonly used with Anaconda environments. If you are unfamiliar with Anaconda environments please see the [Working with Anaconda Environments section](#working-with-anaconda-environments) below before continuing here. +Jupyter Notebooks and Jupyter Lab are both available as standalone apps in OOD. Jupyter is commonly used with `conda` environments. If you are unfamiliar with `conda` environments please see the [Working with `conda` Environments section](#working-with-conda-environments) below before continuing here. To launch the Jupyter notebook, select the menus 'Interactive Apps -> Jupyter Notebook'. The job creation and submission form appears: @@ -10,7 +10,7 @@ As with all interactive apps, you'll need to select the resources required using ## Environment Setup -To modify the environment that Anaconda and Jupyter will run in, please use the Environment Setup field to load modules and modify the environment `$PATH`. Be aware that any changes to the environment made in this window will be inherited by terminals as well as notebooks opened within Jupyter. +To modify the environment that `conda` and Jupyter will run in, please use the Environment Setup field to load modules and modify the environment `$PATH`. Be aware that any changes to the environment made in this window will be inherited by terminals as well as notebooks opened within Jupyter. 
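As a concrete sketch, the Environment Setup field for a GPU notebook might contain something like the following. The module versions here are illustrative only; run `module avail` on Cheaha to see what is actually installed:

```shell
# Illustrative Environment Setup contents for a Jupyter job.
# Module versions are examples; check `module avail` for real ones.
module load CUDA/12.2.0
module load cuDNN/8.9.2.26-CUDA-12.2.0

# Prepend a personal tools directory to the search path.
export PATH="$HOME/bin:$PATH"
```

Remember that terminals opened inside the Jupyter session will inherit these changes as well.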
### CUDA @@ -34,17 +34,17 @@ The `Extra Jupyter Arguments` field allows you to pass additional arguments to t ## Working with other programming languages within Jupyter Notebook -To work with other programming languages within Jupyter Notebook, you need to install the corresponding kernel for each language, similar to the process used for Python with the `ipykernel`. This can be done using package managers such as `pip` or `conda`, or by following language-specific instructions. For example, to install `R kernel` for the R language, we can run the `conda install -c r r-essentials` command. Please ensure that the kernel is installed in your Anaconda environment. Then, select the desired language environment from the kernel dropdown menu. +To work with other programming languages within Jupyter Notebook, you need to install the corresponding kernel for each language, similar to the process used for Python with the `ipykernel`. This can be done using package managers such as `pip` or `conda`, or by following language-specific instructions. For example, to install `R kernel` for the R language, we can run the `conda install -c r r-essentials` command. Please ensure that the kernel is installed in your `conda` environment. Then, select the desired language environment from the kernel dropdown menu. Once the necessary kernels are installed, if you wish, you can write and run multiple code cells in different languages within a single notebook. Easily switch between kernels and select the preferred one for each language, and then proceed to run the code cells in their respective languages. -## Working with Anaconda Environments +## Working with `conda` Environments -By default, Jupyter notebooks will use the base environment that comes with the Anaconda3 module. This environment contains a large number of popular packages and may useful for something quick, dirty, and simple. 
However, for any analysis needing specific package versions or special packages, you will need to create your own environment and select it from the `Kernel` menu. For information on creating and managing Anaconda environments please see our [Using Anaconda page](../../workflow_solutions/using_anaconda.md). Then please review our [Cheaha-specific Anaconda page](../software/software.md#anaconda-on-cheaha) for important tips and how to avoid common pitfalls. +By default, Jupyter notebooks will use the base environment that comes with the `Miniforge3` module. This environment contains a large number of popular packages and may be useful for something quick, dirty, and simple. However, for any analysis needing specific package versions or special packages, you will need to create your own environment and select it from the `Kernel` menu. For information on creating and managing `conda` environments please see our [Using `conda` page](../../workflow_solutions/using_conda.md). Then please review our [Cheaha-specific `conda` page](../software/software.md#conda-on-cheaha) for important tips and how to avoid common pitfalls. -To change the kernel, use the `Kernel` dropdown and select `Change Kernel`. From the list, choose the kernel corresponding to your desired Anaconda environment (see below for an example). If your environment isn't appearing, you may be missing the ipykernel package. To do so, use `conda install ipykernel` to get ipykernel packgae installed into your environment, so Jupyter can recognize your environment. +To change the kernel, use the `Kernel` dropdown and select `Change Kernel`. From the list, choose the kernel corresponding to your desired `conda` environment (see below for an example). If your environment isn't appearing, you may be missing the `ipykernel` package. Use `conda install ipykernel` to install `ipykernel` into your environment so Jupyter can recognize it. -![!
Select your Anaconda environment from the Kernel dropdown menu in Jupyter](images/jupyter_kernel.png) +![! Select your `conda` environment from the Kernel dropdown menu in Jupyter](images/jupyter_kernel.png) ### Creating an Environment for use with Jupyter Notebook @@ -55,9 +55,9 @@ We can create a new environment, that houses all of the packages, modules, and l - [OOD Terminal](./ood_layout.md#opening-a-terminal). Be sure to run the following steps in a job! - [OOD HPC Desktop Job Terminal](./hpc_desktop.md). This method will ensure terminal commands are run in a job. -1. [Create](../../workflow_solutions/using_anaconda.md#create-an-environment) and [activate](../../workflow_solutions/using_anaconda.md#activate-an-environment) your new environment, following the linked steps. +1. [Create](../../workflow_solutions/using_conda.md#create-an-environment) and [activate](../../workflow_solutions/using_conda.md#activate-an-environment) your new environment, following the linked steps. -1. [Install your desired packages into your activated environment](../../workflow_solutions/using_anaconda.md#install-packages). +1. [Install your desired packages into your activated environment](../../workflow_solutions/using_conda.md#install-packages). 1. Remember to install 'ipykernel' in your activated environment, using `conda install ipykernel`. @@ -81,9 +81,15 @@ We can create a new environment, that houses all of the packages, modules, and l ### Python Executable Issues -Jupyter Notebook by default loads `Anaconda3`. Hence do not load any versions of `Anaconda3` module in the `Environment Setup` field in the OOD Jupyter Notebook, as it causes Python mismatch, and the errors are hard to diagnose. +Jupyter Notebook by default loads `Miniforge3`. Hence, do not load any version of the `Miniforge3` module in the `Environment Setup` field in the OOD Jupyter Notebook, as doing so causes a Python mismatch, and the errors are hard to diagnose.
-Having custom installs of Anaconda/Miniconda/ can cause the above similar issue. If you have installations of any of these software in your personal space, delete those directories and instead use the `Anaconda3` module. +Having self-installed `conda` software can cause the above issue. This includes self-installed Anaconda, Miniconda, Mambaforge, or Miniforge. If you have installations of any of these software in your personal space, delete those directories and instead use the `Miniforge3` module. + + +!!! important + + The Anaconda and Miniconda software are subject to the Anaconda Terms of Service and may not be used for UAB business. + To identify a Python mismatch, use the commands `which python` and `python --version` to confirm the desired Python executable and version. Within the `conda` environment, `which python` prints the path of the Python executable (e.g. `~/.conda/envs/remora/bin/python`). If it doesn't match the expected version, an unexpected Python version may be in use. @@ -105,22 +111,17 @@ While launching an OOD HPC Desktop Job or any OOD Applications, if the user gets Using `conda init` causes a block of code automatically inserted into the `.bashrc` file in your `$HOME` directory. This code block may interfere with the proper functioning of various OOD applications, resulting in a VNC error. To address this issue, it is recommended to follow the instructions outlined in the [FAQ entry](https://ask.cyberinfrastructure.org/t/why-do-i-get-an-error-when-launching-an-open-ondemand-hpc-interactive-session/2496). -### Pip Installs Packages Outside of Environment +### Installing Pip Packages Outside of Your Environments -When installing packages within a `conda` environment using `pip`, it's crucial to ensure that you install `pip` within the same conda environment and use `pip` from that environment. If `pip` is used outside of Anaconda or within an environment without `pip` installed, the packages are installed to `~/.local`. 
This can lead to unexpected package conflicts, as Python loads packages from `~/.local` before loading from Anaconda environments, and shows the following error, - -```bash -Requirement already satisfied: numpy in /home/$USER/.local/lib/python3.11/site-packages (1.26.3) -``` + +!!! danger -For the above case, resolving errors involve deleting the `~/.local` directory. + Using `pip install` without loading Miniforge3 will cause hard-to-diagnose errors and broken workflows. -Here's an example of the correct procedure for installing `pip` packages within a `conda`: + Using `pip install` in the `base` environment will cause the same hard-to-diagnose errors and broken workflows. -1. Load the `Anaconda3` module using `module load Anaconda3`. -1. Create or activate the desired Anaconda environment. Please refer to the [Anaconda documentation](../../workflow_solutions/using_anaconda.md#create-an-environment) -1. Install `pip` within the `conda` environment using `conda install pip` or `conda install python`. `pip` and `python` are packaged together, installing one will always install the other. -1. Use `pip` when this `conda` environment is active to install packages. Please refer to [Installing packages with `pip`](../../workflow_solutions/using_anaconda.md#installing-packages-with-pip) + Read more about this issue, and how to resolve it, [here](../software/software.md#installing-pip-packages-outside-of-your-environments). + ### Tensorflow and PyTorch GPU issues diff --git a/docs/cheaha/open_ondemand/ood_matlab.md b/docs/cheaha/open_ondemand/ood_matlab.md index 0a1ce2f8f..1ffa974e5 100644 --- a/docs/cheaha/open_ondemand/ood_matlab.md +++ b/docs/cheaha/open_ondemand/ood_matlab.md @@ -10,17 +10,17 @@ Matlab is available for use graphically in your browser via OOD. As with other s Matlab tends to consume substantial memory at startup. You may experience difficulty with job errors below 20 GB of total memory. 
-## Using Anaconda Python from within Matlab +## Using `conda` Python from within Matlab Matlab has the ability to interoperate with Python from within Matlab. The official documentation for this feature may be found at . -This section is dedicated to using this feature with Anaconda on Cheaha. To use Python contained in an Anaconda Environment within Matlab, please use the following steps. +This section is dedicated to using this feature with `conda` on Cheaha. To use Python contained in a `conda` Environment within Matlab, please use the following steps. 1. Create an [HPC Interactive Desktop Job](hpc_desktop.md). 1. Open a terminal in that job. The following steps should all be run in this terminal unless otherwise specified. -1. [Load the Anaconda Module](../software/software.md#loading-anaconda). -1. [Create an Environment](../../workflow_solutions/using_anaconda.md#create-an-environment) in Anaconda with the packages needed. -1. [Activate the Environment](../../workflow_solutions/using_anaconda.md#activate-an-environment), +1. [Load the `conda` Module](../software/software.md#loading-conda). +1. [Create an Environment](../../workflow_solutions/using_conda.md#create-an-environment) in `conda` with the packages needed. +1. [Activate the Environment](../../workflow_solutions/using_conda.md#activate-an-environment). 1. Load the Matlab [Module](../software/modules.md). 1. Start Matlab by entering the command `matlab`. 1. Verify success by entering `pyenv` at the Matlab prompt (not the terminal window). Multiple lines of text will be returned at the prompt. Among them you should see a line like the following, with your environment name in place of ``. diff --git a/docs/cheaha/open_ondemand/ood_rstudio.md b/docs/cheaha/open_ondemand/ood_rstudio.md index 142c12502..8a23a214c 100644 --- a/docs/cheaha/open_ondemand/ood_rstudio.md +++ b/docs/cheaha/open_ondemand/ood_rstudio.md @@ -12,16 +12,16 @@ RStudio is available for use graphically in your browser via OOD.
As with other ## RStudio and Python -If you have a workflow that uses both R and Python, it is strongly recommended to use the [reticulate](https://rstudio.github.io/reticulate/) package along with Anaconda environments. Reticulate allows researchers to load Python packages into a native R session as objects. For instance, if someone prefer some functionality of the `pandas` package but has other code already written in R, they can import `pandas` to R and use both simultaneously. +If you have a workflow that uses both R and Python, it is strongly recommended to use the [reticulate](https://rstudio.github.io/reticulate/) package along with `conda` environments. Reticulate allows researchers to load Python packages into a native R session as objects. For instance, if someone prefers some functionality of the `pandas` package but has other code already written in R, they can import `pandas` to R and use both simultaneously. -This also allows researchers to download precompiled command line binaries into an Anaconda environment and easliy use them in their R scripts. +This also allows researchers to download precompiled command line binaries into a `conda` environment and easily use them in their R scripts. For setup, use the following steps: 1. In a terminal on a compute node, either in an HPC Desktop job or by clicking the blue Host button on any job card: - 1. Load the `Anaconda3` module - 1. Create an Anaconda environment. More information about how to create Anaconda environments can be found [in our documentation](../../workflow_solutions/using_anaconda.md). + 1. Load the `Miniforge3` module + 1. Create a `conda` environment. More information about how to create `conda` environments can be found [in our documentation](../../workflow_solutions/using_conda.md). 1. Activate your environment and install your required python packages using either `pip install` or `conda install` depending on the package source.
@@ -32,16 +32,16 @@ For setup, use the following steps: 1. In RStudio: - 1. Add the command `module load Anaconda3` to the Environment Setup window when requesting the RStudio job. + 1. Add the command `module load Miniforge3` to the Environment Setup window when requesting the RStudio job. 1. If not already installed, install the `reticulate` package using either `install.packages` or the [renv](#rstudio-projects-and-renv) package. - 1. Use `reticulate::use_condaenv('env_name')` to load your conda environment. - 1. From here, you will be able to interact with all of the python packages and non-python precompiled binaries in your Anaconda environment using R and RStudio. Please read more about how to do that in [reticulate's documentation](https://rstudio.github.io/reticulate/#importing-python-modules). + 1. Use `reticulate::use_condaenv('ENVIRONMENT')` to load your conda environment which has the name `ENVIRONMENT`. + 1. From here, you will be able to interact with all of the python packages and non-python precompiled binaries in your `conda` environment using R and RStudio. Please read more about how to do that in [reticulate's documentation](https://rstudio.github.io/reticulate/#importing-python-modules). -For cases where your R code only needs access to precompiled binaries or libraries and does not need to import any Python libraries, you can instead create your Anaconda environment and add the following lines into the Environment Setup window: +For cases where your R code only needs access to precompiled binaries or libraries and does not need to import any Python libraries, you can instead create your `conda` environment and add the following lines into the Environment Setup window: ``` bash -module load Anaconda3 -conda activate +module load Miniforge3 +conda activate ENVIRONMENT ``` This will add those binaries and libraries to your environment `$PATH` which RStudio will inherit. 
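If you want to confirm from a terminal that reticulate can see your environment before requesting the RStudio job, a quick check along these lines may help. This is a sketch only: the environment name `ENVIRONMENT` is a placeholder, the module names are illustrative, and it assumes the R `reticulate` package is already installed:

```shell
# Run in a compute-node terminal; module names are illustrative.
module load Miniforge3
module load R

# Ask reticulate which Python it will bind to for the named environment.
Rscript -e "reticulate::use_condaenv('ENVIRONMENT'); print(reticulate::py_config())"
```

If the printed configuration points at a Python under `~/.conda/envs/ENVIRONMENT`, RStudio should bind to the same environment.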
diff --git a/docs/cheaha/slurm/gpu.md b/docs/cheaha/slurm/gpu.md index 316c96459..672c933d0 100644 --- a/docs/cheaha/slurm/gpu.md +++ b/docs/cheaha/slurm/gpu.md @@ -101,7 +101,7 @@ To check which CUDA Module version is required for your version of Tensorflow, s PyTorch does not maintain a simple compatibility table for CUDA versions. Instead, please manually check their ["get started" page](https://pytorch.org/get-started/locally/#start-locally) for the latest PyTorch version compatibility, and their ["previous versions" page](https://pytorch.org/get-started/previous-versions/) for older PyTorch version compatibility. Assume that a CUDA version is not compatible if it is not listed for a specific PyTorch version. -To use GPUs prior to PyTorch version 1.13 you _must_ select a `cudatoolkit` version from the PyTorch channel when you install PyTorch using Anaconda. It is how PyTorch knows to install a GPU compatible flavor, as opposed to the CPU only flavor. See below for templates of CPU and GPU installs for PyTorch versions prior to 1.13. Be sure to check the compatibility links above for your selected version. Note `torchaudio` is also available for signal processing. +To use GPUs prior to PyTorch version 1.13 you _must_ select a `cudatoolkit` version from the PyTorch channel when you install PyTorch using `conda`. It is how PyTorch knows to install a GPU compatible flavor, as opposed to the CPU only flavor. See below for templates of CPU and GPU installs for PyTorch versions prior to 1.13. Be sure to check the compatibility links above for your selected version. Note `torchaudio` is also available for signal processing. - CPU Version: `conda install pytorch==... torchvision==... -c pytorch` - GPU Version: `conda install pytorch==... torchvision==... cudatoolkit=... 
-c pytorch` diff --git a/docs/cheaha/slurm/slurm_tutorial.md b/docs/cheaha/slurm/slurm_tutorial.md index df27c93c8..68fb70615 100644 --- a/docs/cheaha/slurm/slurm_tutorial.md +++ b/docs/cheaha/slurm/slurm_tutorial.md @@ -118,15 +118,15 @@ This example illustrate a Slurm job that runs a Python script involving [NumPy]( #SBATCH --output=%x_%j.out ### Slurm Output file, %x is job name, %j is job id #SBATCH --error=%x_%j.err ### Slurm Error file, %x is job name, %j is job id -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Run the script `python_test.py` python python_test.py ``` - The batch job requires an input file `python_test.py` (line 17) for execution. Copy the input file from the [Containers page](../../workflow_solutions/getting_containers.md/#create-your-own-docker-container). Place this file in the same folder as the `numpy.job`. This python script performs numerical integration and data visualization tasks, and it relies on the following packages: numpy, matplotlib, scipy for successful execution. These dependencies can be installed using [Anaconda](../../workflow_solutions/using_anaconda.md) within a `conda` environment named `pytools-env`. Prior to running the script, load the `Anaconda3` module and activate the `pytools-env` environment (line 13 and 14). Once job is successfully completed, check the slurm output file for results. Additionally, a plot named `testing.png` will be generated. + The batch job requires an input file `python_test.py` (line 17) for execution. Copy the input file from the [Containers page](../../workflow_solutions/getting_containers.md/#create-your-own-docker-container). Place this file in the same folder as the `numpy.job`. 
This python script performs numerical integration and data visualization tasks, and it relies on the following packages: numpy, matplotlib, and scipy for successful execution. These dependencies can be installed using [`conda`](../../workflow_solutions/using_conda.md) within a `conda` environment named `pytools-env`. Prior to running the script, load the `Miniforge3` module and activate the `pytools-env` environment (lines 13 and 14). Once the job is successfully completed, check the slurm output file for results. Additionally, a plot named `testing.png` will be generated. ```bash $ ls @@ -174,8 +174,8 @@ Multiple jobs or tasks can be executed simultaneously using `srun` within a sing #SBATCH --output=%x_%j.out ### Slurm Output file, %x is job name, %j is job id #SBATCH --error=%x_%j.err ### Slurm Error file, %x is job name, %j is job id -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Runs the script `python_test.py` in parallel with distinct inputs and ensures synchronization @@ -254,8 +254,8 @@ The following Slurm script is an example of how you might convert the previous ` #SBATCH --error=%x_%A_%a.err ### Slurm Error file, %x is job name, %A is array job id, %a is array job index #SBATCH --array=1-3 ### Number of Slurm array tasks, 3 tasks -### Loading Anaconda3 module to activate `pytools-env` conda environment -module load Anaconda3 +### Loading Miniforge3 module to activate `pytools-env` conda environment +module load Miniforge3 conda activate pytools-env ### Calculate the input range for each task @@ -392,7 +392,7 @@ $ sacct -j 27105035 ### Example 6: GPU Job -This slurm script shows the execution of Tensorflow job using GPU resources. Let us save this script as `gpu.job`. The Slurm parameter `--gres=gpu:2` in line 6, requests for 2 GPUs.
In line 8, note that in order to run GPU-based jobs, either the `amperenodes` or `pascalnodes` partition must be used (please refer to our [GPU page](../slurm/gpu.md) for more information). Lines 14-15 loads the necessary CUDA modules, while lines 18-19 load the Anaconda module and activate a `conda` environment called `tensorflow`. Refer to [Tensorflow official page](https://www.tensorflow.org/) for installation. The last line executes a python script that utilizes Tensorflow library to perform matrix multiplication across multiple GPUs. +This slurm script shows the execution of a Tensorflow job using GPU resources. Let us save this script as `gpu.job`. The Slurm parameter `--gres=gpu:2` in line 6 requests 2 GPUs. In line 8, note that in order to run GPU-based jobs, either the `amperenodes` or `pascalnodes` partition must be used (please refer to our [GPU page](../slurm/gpu.md) for more information). Lines 14-15 load the necessary CUDA modules, while lines 18-19 load the `Miniforge3` module and activate a `conda` environment called `tensorflow`. Refer to the [Tensorflow official page](https://www.tensorflow.org/) for installation. The last line executes a python script that utilizes the Tensorflow library to perform matrix multiplication across multiple GPUs. ```bash linenums="1" #!/bin/bash @@ -411,8 +411,8 @@ This slurm script shows the execution of Tensorflow job using GPU resources.
Let module load CUDA/12.2.0 module load cuDNN/8.9.2.26-CUDA-12.2.0 -### Loading the Anaconda module and activating the `tensorflow` environment -module load Anaconda3 +### Loading the Miniforge3 module and activating the `tensorflow` environment +module load Miniforge3 conda activate tensorflow ### Executing the python script diff --git a/docs/cheaha/slurm/submitting_jobs.md b/docs/cheaha/slurm/submitting_jobs.md index fb9369f7d..f9b3fb399 100644 --- a/docs/cheaha/slurm/submitting_jobs.md +++ b/docs/cheaha/slurm/submitting_jobs.md @@ -149,7 +149,7 @@ For a practical example with dynamic indices, please visit our [Practical `sbatc Jobs should be submitted to the Slurm job scheduler either using a [batch job](#batch-jobs-with-sbatch) or an [Open OnDemand (OOD) interactive job](../open_ondemand/index.md). -You can use `srun` for working on short interactive tasks such as [creating an Anaconda environment](../../workflow_solutions/using_anaconda.md) and running [parallel tasks](#srun-for-running-parallel-jobs) within an sbatch script. +You can use `srun` for working on short interactive tasks such as [creating a `conda` environment](../../workflow_solutions/using_conda.md) and running [parallel tasks](#srun-for-running-parallel-jobs) within an sbatch script. !!! warning diff --git a/docs/cheaha/software/res/common_software.csv b/docs/cheaha/software/res/common_software.csv index 56556a7d4..5f66d20e5 100644 --- a/docs/cheaha/software/res/common_software.csv +++ b/docs/cheaha/software/res/common_software.csv @@ -1,9 +1,9 @@ -Name,Description -Anaconda3,"Software that can install the Python language, Python packages, and other research software. Learn more about using Anaconda at our [Anaconda on Cheaha page](software.md#anaconda-on-cheaha). You may be interested in our [OpenOnDemand Jupyter Notebook interactive app](../open_ondemand/ood_jupyter.md)." -"CUDA, cuDNN",Libraries for developing and using deep learning and AI models with NVidia GPUs. 
Commonly used with TensorFlow and PyTorch. See our [GPU page](../slurm/gpu.md) for more information. -"Mathematica",Mathematical CAS and numerical computing software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). -Matlab,Matlab language and development environment. We recommend using our [Open OnDemand Matlab interactive app](../open_ondemand/ood_matlab.md). -"R, Rstudio",R language and RStudio IDE. We recommend using our [Open OnDemand RStudio interactive app](../open_ondemand/ood_rstudio.md). -"SAS",Statistical analysis software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). -"Singularity",Software container engine. See our [Containers](../../workflow_solutions/getting_containers.md) page for more information. -"Stata",Statistical analysis software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). +Name ,Description +Miniforge3 ,"Software with `conda` that can install the Python language, Python packages, and other research software. Learn more about using `conda` at our [`conda` on Cheaha page](software.md#conda-on-cheaha). You may be interested in our [OpenOnDemand Jupyter Notebook interactive app](../open_ondemand/ood_jupyter.md)." +"CUDA, cuDNN" ,Libraries for developing and using deep learning and AI models with NVidia GPUs. Commonly used with TensorFlow and PyTorch. See our [GPU page](../slurm/gpu.md) for more information. +"Mathematica" ,Mathematical CAS and numerical computing software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). +Matlab ,Matlab language and development environment. We recommend using our [Open OnDemand Matlab interactive app](../open_ondemand/ood_matlab.md). +"R, Rstudio" ,R language and RStudio IDE. We recommend using our [Open OnDemand RStudio interactive app](../open_ondemand/ood_rstudio.md). +"SAS" ,Statistical analysis software. 
Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). +"Singularity" ,Software container engine. See our [Containers](../../workflow_solutions/getting_containers.md) page for more information. +"Stata" ,Statistical analysis software. Try our [Open OnDemand HPC Desktop interactive app](../open_ondemand/hpc_desktop.md). diff --git a/docs/cheaha/software/software.md b/docs/cheaha/software/software.md index 817473fc1..8c49e9fcf 100644 --- a/docs/cheaha/software/software.md +++ b/docs/cheaha/software/software.md @@ -1,52 +1,81 @@ # Software Installation -## Anaconda on Cheaha +## `conda` on Cheaha -For additional general information on using Anaconda please see [Anaconda Environments](../../workflow_solutions/using_anaconda.md). +For additional general information on using `conda` please see our [Using `conda` page](../../workflow_solutions/using_conda.md). -If you are using Jupyter Notebook, please see our section on [Packages for Jupyter](../../workflow_solutions/using_anaconda.md#packages-for-jupyter). +If you are using Jupyter Notebook, please see our section on [Packages for Jupyter](../../workflow_solutions/using_conda.md#packages-for-jupyter). -### Loading Anaconda +### Loading `conda` -Anaconda is installed on Cheaha as a family of modules, and does not need to be installed by Researchers. Instead, the most recent version of Anaconda installed on Cheaha may be loaded using the command `module load Anaconda3`. Other versions may be discovered using the command `module avail Anaconda`. We recommend always using the latest version. +`conda` is installed on Cheaha as a family of modules, and does not need to be installed by Researchers. Instead, the most recent version of `conda` installed on Cheaha may be loaded using the command `module load Miniforge3`. !!! note - If you are using [Open OnDemand Jupyter Notebook](../open_ondemand/ood_jupyter.md) you do not need to use the `module load` command as part of creating the job. 
+ If you are using [Open OnDemand Jupyter Notebook](../open_ondemand/ood_jupyter.md) you should not use the `module load` command as part of creating the job. -### Using Anaconda +### Using `conda` -Anaconda on Cheaha works like it does on any other system, once the module has been loaded, with a couple of important differences in the callouts below. +Once you have loaded the Miniforge module, `conda` on Cheaha works similarly to how it does on other computers. There are a couple of important differences in the callouts below. !!! note - The `base` environment is installed in a shared location and cannot be modified by researchers. Other environments are installed in your home directory by default. + The `base` environment is installed in a shared location and cannot be modified by researchers. Other environments are installed in your home directory by default at `/home/$USER/.conda/`. !!! important - Only create environments on compute nodes. Anaconda environment creation consumes substantial resources and should not be run on the login node. + Only create `conda` environments on compute nodes. Environment creation consumes substantial resources and should not be run on the login node. !!! warning - The Cheaha operating system has a version of Python installed. This version is used by `python` calls when Anaconda has not been loaded. This can cause unexpected errors. Be sure you've loaded the Anaconda environment you need before using Python. + The Cheaha operating system has a built-in Python version installed. This version is used by `python` calls when Miniforge has not been loaded. This can cause unexpected errors. Be sure you've loaded the Miniforge module before using Python. !!! danger - Do not use `conda init` on Cheaha! Anaconda is managed as a [module](./modules.md), including script setup. Using `conda init` at any point can cause hard-to-diagnose issues with [Open OnDemand Interactive Jobs](../open_ondemand/ood_layout.md#interactive-apps). 
Please see [this ask.ci FAQ](https://ask.cyberinfrastructure.org/t/why-do-i-get-an-error-when-launching-an-open-ondemand-hpc-interactive-session/2496/2) for how to undo what `conda init` does. + Do not use `conda init` on Cheaha, even if prompted to do so! + + `conda` is managed on Cheaha via the [module](./modules.md) `Miniforge3`, including script setup. Using `conda init` at any point can cause hard-to-diagnose issues with [Open OnDemand Interactive Jobs](../open_ondemand/ood_layout.md#interactive-apps). Please see [this ask.ci FAQ](https://ask.cyberinfrastructure.org/t/why-do-i-get-an-error-when-launching-an-open-ondemand-hpc-interactive-session/2496/2) for how to undo what `conda init` does. + + If the `conda` software instructs you to use `conda init` while on Cheaha, please ignore it to avoid future issues with [Open OnDemand](../open_ondemand/index.md). + - If the Anaconda software instructs you to use `conda init` while on Cheaha, please ignore it to avoid future issues with [Open OnDemand](../open_ondemand/index.md). +!!! danger + + Using `pip install` without loading Miniforge3 will cause hard-to-diagnose errors and broken workflows. + + Using `pip install` in the `base` environment will cause the same hard-to-diagnose errors and broken workflows. + + Read more about this issue, and how to resolve it, [here](#installing-pip-packages-outside-of-your-environments). + + +For more information on usage with examples, see [`conda` Environments](../../workflow_solutions/using_conda.md). Need some hands-on experience? You can find instructions on how to install PyTorch and TensorFlow using `conda` in this [tutorial](../tutorial/pytorch_tensorflow.md). + +### Installing Pip Packages Outside of Your Environments + +When installing packages within a `conda` environment using `pip`, it's crucial to ensure that you install `pip` within the same conda environment and use `pip` from that environment. 
If `pip` is used outside of `conda` or within an environment without `pip` installed, the packages are installed to `~/.local`. This can lead to unexpected package conflicts, as Python loads packages from `~/.local` before loading from `conda` environments, and shows messages like the following: + +```bash +Requirement already satisfied: numpy in /home/$USER/.local/lib/python3.11/site-packages (1.26.3) +``` + +For the above case, resolving these errors involves deleting the `~/.local` directory. + +Here's an example of the correct procedure for installing `pip` packages within a `conda` environment: -For more information on usage with examples, see [Anaconda Environments](../../workflow_solutions/using_anaconda.md). Need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../tutorial/pytorch_tensorflow.md). +1. Load the `Miniforge` module using `module load Miniforge3`. +1. Create or activate the desired `conda` environment. Please refer to the [`conda` documentation](../../workflow_solutions/using_conda.md#create-an-environment). +1. Install `pip` within the `conda` environment using `conda install pip` or `conda install python`. `pip` and `python` are packaged together; installing one will always install the other. +1. Use `pip` while this `conda` environment is active to install packages. Please refer to [Installing packages with `pip`](../../workflow_solutions/using_conda.md#installing-packages-with-pip). ## Singularity Containers diff --git a/docs/cheaha/tutorial/conda_good_practice.md new file mode 100644 index 000000000..1109f92f8 --- /dev/null +++ b/docs/cheaha/tutorial/conda_good_practice.md @@ -0,0 +1,109 @@ +# Good Practice for Finding `conda` Software Packages + +Finding `conda` software packages involves searching through the available channels and repositories to locate the specific packages that provide the functionality you need for your environment. 
Channels instruct `conda` where to look for packages when installing them. In the sections below, you will see information on how to locate packages important for your work, ensure the packages are up-to-date, figure out the best way to install them, and finally compose an environment file for portability and replicability. + +## Step-by-Step Guide to Finding `conda` Software Packages + +We prefer searching using the following methods, and usually have the most success in the order listed below. + +- Using Google: You may already be familiar with the exact `conda` package name you require. If this is not the case, a simple web search with key words usually finds the package. For example, a web search for a `conda` package would be something along the lines of "conda package for `Generic Topic Name`". Your search results should return popular package names related to the topic you have searched for. The sections below provide a detailed step-by-step guide on how to find `conda` packages using "numpy" as an example. + +- Conda-Forge: The conda-forge channel is the primary source for finding `conda` packages while using Miniforge. You can visit [Conda-forge](https://conda-forge.org/packages/) and use the search bar to find the package you need. 
For example, once you have the package name from your web search (here, "numpy"), enter it in the search bar as shown below. Take note to look for packages that are available via the conda-forge channel. + +![!Landing page of conda-forge.org showing search](images/conda-forge_search.png) + +You may also search on the [Anaconda](https://anaconda.org) page. However, ensure you always use the package with the `conda-forge` Artifact. Enter the package name, then review the results of your search. It is advised to use "Artifacts" that are compatible with the platform you are working with, as well as a package that has the highest "Favorites" and "Downloads" numbers. Click on the portion that contains the name of the package (highlighted 3 in the image below). Selection 1 highlights the Artifact, Favorites, and Downloads numbers, and selection 2 highlights the channel where this package is stored. Always take note of the channel, as only packages installed from the `conda-forge` or `bioconda` channels are open-source and free of any usage restrictions. + +![!Anaconda.org page showing download statistics](images/anaconda_channel_package.png) + + +!!! important + + The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use. + + +Follow the installation instructions you see in the image below. 
+ +![!Anaconda.org page showing package installation instructions](images/install_anaconda_package.png) + +- Using the `conda` Search Command: You can use the `conda search <package>` command directly in your terminal to find packages. Replace `<package>` with the package you would like to search for. To do this on Cheaha, make sure to `module load Miniforge3` first, and follow the instructions to [activate](../../workflow_solutions/using_conda.md#activate-an-environment) an environment. Then do `conda search numpy`. You should get a long list of numpy packages. Review this output, but take note of the highlighted portions in the image. The section with a red selection shows the numpy versions that are available. The section with a blue selection shows the channel where each numpy version is stored. Ensure you pick stable versions that are associated with either `conda-forge` or `bioconda`. + +![!Search output from using conda search in Terminal](images/channel_conda_search.png) + +You can then install numpy with a specific version and from a specific channel with: + +```bash +conda install -c conda-forge numpy=2.0.0rc2 +``` + +!!! important + + The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use. + + +- Using Specific Channels: You can also get packages using the specific `conda` channels listed below. + + - Conda-Forge: A community-driven channel with a wide variety of packages. Visit [Conda-Forge](https://conda-forge.org/) + + - Bioconda: A channel specifically for bioinformatics packages. Visit [Bioconda](https://bioconda.github.io/) + +You can specify a channel in your search, and it will show you a list of the packages available in that channel, using `conda search -c <channel> <package>`. Remember to replace `<channel>` and `<package>` with the channel and package names you are searching for, respectively. An example would be: 
+ +```bash +conda search -c conda-forge numpy +``` + +If we find the package at one of these sources, we check the Platform version to ensure it is either noarch (if available) or linux for it to work on Cheaha ("noarch" is usually preferred for the sake of portability). Noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on `conda-forge` or PyPI. If so, the package is being maintained on `conda-forge` or PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI). + +![!Github page for numpy, a `conda` package](images/github_conda_releases.png) + +If we don't find a package using Google, or the `conda-forge` and PyPI pages are out of date, then it may become very hard to use the software in a `conda` environment. It is possible to try installing a git repository using pip, but care must be taken to choose the right commit or tag. You can find more [info here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). To search for a git repository try: + +1. github "<package name>" +1. gitlab "<package name>" + +Remember to replace `<package name>` with the name of the `conda` package. + + +!!! note + +    There are issues with out-of-date software. It may have bugs that have since been fixed, making for less reproducible science. Documentation may be harder to find if it isn't also matched to the software version. Examining the README.md file for instructions may provide some good information on installing the package. You can also reach out to us for [support](../../help/support.md) in installing a package. + + +When we have a complete list of `conda` packages and channels, we can create an environment from scratch with all the dependencies included. For `conda` packages, add one line under `dependencies:` for each package. 
For PyPI packages, add `- pip` and a `- pip:` section under dependencies. Then under `- pip:`, add `<package>==<version>` to pin the version, see below. The advantage to using an environment file is that it can be stored with your project in GitHub or GitLab, giving it all the benefits of [version control](../../workflow_solutions/git_collaboration.md). + +```yaml +name: test-env +dependencies: + - bioconda::methbat=0.13.2 # Pinned version from bioconda channel + - conda-forge::python=3.10.4 # Pinned version from conda-forge channel + - pip + - pip: + - numpy==1.26.4 # Pinned version for pip + - git+https://github.com/user/repo.git # Example of installing from a Git repo + - http://insert_package_link_here # For URL links +``` + +For git repos, add them under `- pip:` based on examples [here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). See the section [Replicability versus Portability](../../workflow_solutions/using_conda.md#replicability-versus-portability) for more information. + +The above configuration is only for illustration purposes, to show how channels and dependencies can be used. It is best to install all of your packages from conda channels to avoid breaking your environment. Only packages that are unavailable via conda should be installed via pip. If you run into challenges, please [contact us](../../index.md#how-to-contact-us). + +## Key Things To Remember + +1. Exploring Package Documentation: For each package, check the documentation to understand its features, version history, and compatibility. + +1. Regularly consider updating your environment file to manage dependencies and maintain compatible software environments. Newer software also tends to resolve older bugs, consequently improving the state of science. + +1. Verify Package Version and Maintenance: Ensure you are getting the latest version of the package that is compatible with your environment. Verify that the package is actively maintained by checking the source repository (e.g., GitHub, GitLab). 
Look for recent commits, releases, and issue resolutions. The concepts of version pinning and semantic versioning, explained below, cover this in detail. + +## Version Pinning + +Version pinning in `conda` environments involves specifying exact versions of packages to ensure consistency and compatibility. This practice is crucial for reproducibility, as it allows environments to be reproduced exactly, a critical component in research and collaborative projects. Version pinning also aids stability by preventing unexpected changes that could break your environment, code, or analysis. This practice also maintains compatibility between different packages that rely on specific dependencies. To implement version pinning, you can create a YAML file that lists the exact versions of all installed packages, or specify versions directly when [creating](../../workflow_solutions/using_conda.md#create-an-environment) or updating environments using `conda` commands. + +## Semantic Versioning + +[Semantic versioning](https://semver.org) is a versioning scheme using a three-part format (MAJOR.MINOR.PATCH) to convey the significance of changes in a software package. In `conda` environments, it plays a role in managing compatibility, version pinning, dependency resolution, and updating packages. The MAJOR version indicates incompatible API changes, i.e. the same software package, but operation and interaction are mostly different from what you are accustomed to in the previous version. The MINOR version adds backward-compatible functionality, i.e. the same software package, but it now contains new features and functionality; operations and interactions are still mostly the same. The PATCH version includes backward-compatible bug fixes, i.e. the same features, operations, and interactions, with only a small bug fix applied. 
Using semantic versioning helps maintain consistency and compatibility by ensuring that updates within the same major version are compatible, and by allowing precise control when specifying package versions. + +In practice, updating the MAJOR version of a package may break your workflow, but may also increase software reliability and stability and fix bugs affecting your science. Changing the MAJOR version may also introduce new bugs; these concerns are some of the tradeoffs that have to be taken into consideration. Semantic versioning helps with managing `conda` environments by facilitating precise [version pinning](#version-pinning) and dependency resolution. For instance, you can pin specific versions using `conda` commands or specify version ranges to ensure compatibility as shown in the examples above. Semantic versioning also informs upgrade strategies, letting us know when to upgrade packages based on the potential impact of changes. By leveraging semantic versioning, you can maintain stable and consistent environments, which is essential for smooth research workflows. 
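To make the MAJOR.MINOR.PATCH ordering concrete, here is a small, hedged illustration. It assumes GNU coreutils' `sort -V` (standard on typical GNU/Linux systems), which compares version components numerically rather than alphabetically:

```shell
# sort -V orders version strings by numeric component, so 1.9 < 1.26,
# which a plain alphabetical sort would get wrong.
printf '%s\n' 2.0.0 1.26.4 1.26.3 1.9.0 | sort -V
# prints, one per line: 1.9.0  1.26.3  1.26.4  2.0.0
```

Note that `1.26.4` sorts just after `1.26.3` (a PATCH bump) and before `2.0.0` (a MAJOR bump), mirroring the compatibility expectations described above.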
diff --git a/docs/workflow_solutions/images/anaconda_channel_package.png b/docs/cheaha/tutorial/images/anaconda_channel_package.png similarity index 100% rename from docs/workflow_solutions/images/anaconda_channel_package.png rename to docs/cheaha/tutorial/images/anaconda_channel_package.png diff --git a/docs/workflow_solutions/images/channel_conda_search.png b/docs/cheaha/tutorial/images/channel_conda_search.png similarity index 100% rename from docs/workflow_solutions/images/channel_conda_search.png rename to docs/cheaha/tutorial/images/channel_conda_search.png diff --git a/docs/cheaha/tutorial/images/conda-forge_search.png b/docs/cheaha/tutorial/images/conda-forge_search.png new file mode 100644 index 000000000..e5c26698f Binary files /dev/null and b/docs/cheaha/tutorial/images/conda-forge_search.png differ diff --git a/docs/workflow_solutions/images/github_conda_releases.png b/docs/cheaha/tutorial/images/github_conda_releases.png similarity index 100% rename from docs/workflow_solutions/images/github_conda_releases.png rename to docs/cheaha/tutorial/images/github_conda_releases.png diff --git a/docs/workflow_solutions/images/install_anaconda_package.png b/docs/cheaha/tutorial/images/install_anaconda_package.png similarity index 100% rename from docs/workflow_solutions/images/install_anaconda_package.png rename to docs/cheaha/tutorial/images/install_anaconda_package.png diff --git a/docs/cheaha/tutorial/index.md b/docs/cheaha/tutorial/index.md index 4dfff74ba..90467026c 100644 --- a/docs/cheaha/tutorial/index.md +++ b/docs/cheaha/tutorial/index.md @@ -1,9 +1,16 @@ -# Getting Started with Using Anaconda on Cheaha +# Cheaha Tutorials -Python is a high level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called Anaconda. Anaconda is a widely used Python package manager for scientific research. 
Consequently Anaconda is used on Cheaha for managing environments and packages. +## `conda` on Cheaha -Have you encountered problems while using Anaconda on Cheaha? We have provided this page to curate a number of walkthroughs on how you can address majority of the needs you may have or challenges you may experience using Anaconda on Cheaha. +Python is a high-level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called [`conda`](../../workflow_solutions/using_conda.md). `conda` is a widely used Python package manager for scientific research. Consequently `conda` is used on Cheaha for managing environments and packages. -Below is a list of Tutorials we currently have Using Anaconda on Cheaha; +Have you encountered problems while using `conda` on Cheaha? We have provided this page to curate a number of walkthroughs on how you can address the majority of the needs you may have or challenges you may experience while using `conda` on [Cheaha](../getting_started.md). -1. Using PyTorch and TensorFlow with Anaconda on Cheaha, click [here.](../tutorial/pytorch_tensorflow.md) +1. Using `conda` to install and run PyTorch and TensorFlow: [link](../tutorial/pytorch_tensorflow.md). +1. Good Practice for Finding `conda` Software Packages: [link](../tutorial/conda_good_practice.md). + +## Using Slurm + +[Slurm](../slurm/introduction.md) is the job scheduler used on [Cheaha](../getting_started.md) that manages which work runs on which resources. Jobs are created when researchers interact with Slurm to request resources on which to run their research software. + +1. A tutorial on parallel [Slurm](../slurm/introduction.md) workflows: [link](../slurm/slurm_tutorial.md). 
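As a minimal orientation before the tutorial linked above: a batch job is an ordinary shell script whose `#SBATCH` comment lines request resources from Slurm. The sketch below is illustrative only; the job name and resource values are assumptions, not recommendations for any particular workload:

```shell
#!/bin/bash
#SBATCH --job-name=example    # name shown in squeue output (placeholder)
#SBATCH --ntasks=1            # one task
#SBATCH --cpus-per-task=1     # CPUs for that task
#SBATCH --mem=4G              # memory for the whole job
#SBATCH --time=00:10:00       # wall-clock limit, HH:MM:SS
# Outside Slurm, the #SBATCH lines are plain comments, so this
# script also runs as ordinary bash.
echo "Hello from Slurm job ${SLURM_JOB_ID:-<not in a job>}"
```

Submitted with `sbatch script.sh`, Slurm queues the job and runs the script body on an allocated node once the requested resources are available.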
diff --git a/docs/cheaha/tutorial/pytorch_tensorflow.md b/docs/cheaha/tutorial/pytorch_tensorflow.md index 7294e8385..fcb494c59 100644 --- a/docs/cheaha/tutorial/pytorch_tensorflow.md +++ b/docs/cheaha/tutorial/pytorch_tensorflow.md @@ -1,8 +1,8 @@ -# Anaconda Environment Tutorial for PyTorch and TensorFlow +# `conda` Environment Tutorial for PyTorch and TensorFlow -The below tutorial would show you steps on how to create an Anaconda environment, activate, and install libraries/packages for machine and deep learning (PyTorch and Tensorflow) using an Anaconda environment on Cheaha. There are also steps on how to access the terminal, as well as using Jupyter Notebook's Graphical User Interface (GUI) to work with these Anaconda environments. There are detailed steps here to guide your creation of a [Jupyter Notebook job.](../open_ondemand/ood_layout.md#interactive-apps) +The below tutorial will show you how to create a `conda` environment, activate it, and install libraries/packages for machine and deep learning (PyTorch and TensorFlow) using a `conda` environment on Cheaha. There are also steps on how to access the terminal, as well as using Jupyter Notebook's Graphical User Interface (GUI) to work with `conda` environments. There are detailed steps here to guide your creation of a [Jupyter Notebook job](../open_ondemand/ood_layout.md#interactive-apps). -## Installing Anaconda Environments Using the Terminal +## Installing `conda` Environments Using the Terminal To access the terminal (shell), please do the following. @@ -32,11 +32,11 @@ The instructions below, provide a recommended step by step guide to creating and ## Installing PyTorch Using the Terminal -There are two instances of PyTorch that can be installed, one requiring GPUs, and another utilising only CPUs. GPUs generally improve project compute speeds and are preferred. 
For both instances of pytorch, please follow these steps; +There are two instances of PyTorch that can be installed, one requiring GPUs, and another utilizing only CPUs. GPUs generally improve project compute speeds and are preferred. For both instances of PyTorch, please follow these steps: -1. [Create](../../workflow_solutions/using_anaconda.md#create-an-environment) and [activate](../../workflow_solutions/using_anaconda.md#activate-an-environment) an environment as stated in these links. +1. [Create](../../workflow_solutions/using_conda.md#create-an-environment) and [activate](../../workflow_solutions/using_conda.md#activate-an-environment) an environment as stated in these links. -1. Access the terminal following the steps [here](#installing-anaconda-environments-using-the-terminal). +1. Access the terminal following the steps [here](#installing-conda-environments-using-the-terminal). !!! note @@ -63,7 +63,7 @@ module load CUDA/11.8.0 ![!nvidia-smi output](images/CudaVersion.png) -When your job has been created and your environment created and activated from the terminal (see above [instructions](../../workflow_solutions/using_anaconda.md#create-an-environment)), run the below command. +When your job has been created and your environment created and activated from the terminal (see above [instructions](../../workflow_solutions/using_conda.md#create-an-environment)), run the below command. ```bash conda install pytorch torchvision torchaudio cudatoolkit=11.8 -c pytorch -c nvidia ``` @@ -134,4 +134,4 @@ The image below shows an output that the TensorFlow library will utilize the ava The information (I) and warning (W) outputs notifies you of the installed Tensorflow binary and how it would function. The I output informs you that the installed Tensorflow library will utilize your CPU for additional speed when GPUs are not the most efficient way to do processing for these operations. 
The W output tells you TensorRT is not available, please note TensorRT is not currently supported on our systems. -Now that you have completed the tutorial, you can find more Anaconda information here, [Using Anaconda page](../../workflow_solutions/using_anaconda.md#anaconda). +Now that you have completed the tutorial, you can find more `conda` information at our [Using `conda` page](../../workflow_solutions/using_conda.md#why-use-conda). diff --git a/docs/contributing/contributor_guide.md b/docs/contributing/contributor_guide.md index 101cb0456..5829f690c 100644 --- a/docs/contributing/contributor_guide.md +++ b/docs/contributing/contributor_guide.md @@ -36,11 +36,11 @@ If you need assistance, please feel free to [contact us](../help/support.md). We understand that everyone has differing preferences when it comes to development environments, so please feel free to use the development environment of your choice. Please be aware that our content has been developed using VSCode and a collection of extensions, so the greatest level of support can be provided by us to you if you choose to use our tooling. -We are using Visual Studio Code (VSCode) for development with several extensions installed, listed below. The extensions are also in `.vscode/extensions.json` and should pop up as recommendations when you open this repository. We use VSCode for the productivity benefits related to local Anaconda environment management, git integration, and dynamic formatters and linting. Linting is provided by pre-commit hooks and in our Continuous Integration definitions. +We are using Visual Studio Code (VSCode) for development with several extensions installed, listed below. The extensions are also in `.vscode/extensions.json` and should pop up as recommendations when you open this repository. We use VSCode for the productivity benefits related to local `conda` environment management, git integration, and dynamic formatters and linting. 
Linting is provided by pre-commit hooks and in our Continuous Integration definitions. VSCode may be obtained from [Visual Studio Code](https://code.visualstudio.com/) and documentation is available at [VSCode: Docs](https://code.visualstudio.com/docs). The extensions should automatically show up as recommendations when opening the repo, or they can be downloaded using the VSCode Extensions menu (++ctrl+shift+x++ on Windows or ++command+shift+x++ on Mac). -We assume you have a `conda` distribution on your local machine. If you are affiliated with UAB, please install [Miniforge](https://conda-forge.org/miniforge/). For detailed installation instructions, see here: . For more information on using `conda`, see our [Anaconda page](../workflow_solutions/using_anaconda.md). +We assume you have a `conda` distribution on your local machine. If you are affiliated with UAB, please install [Miniforge](https://conda-forge.org/miniforge/) and _do not_ install Anaconda or Miniconda. For more information on why, please see our [Conda Migration FAQ](../workflow_solutions/conda_migration_faq.md#why-do-i-need-to-stop-using-anaconda). For detailed instructions on installing Miniforge, see here: . For more information on using `conda`, see our [`conda` page](../workflow_solutions/using_conda.md). ### Style Guide @@ -163,7 +163,7 @@ You'll need to add, remove or otherwise modify files as appropriate to implement ##### Verify your changes -1. [Activate](../workflow_solutions/using_anaconda.md#activate-an-environment) your conda environment. +1. [Activate](../workflow_solutions/using_conda.md#activate-an-environment) your conda environment. 1. Open the file `test.py` in the repository to start the Python extension. 1. Select the interpreter using 1. Open a VSCode terminal using ++ctrl+shift+grave++. 
@@ -312,7 +312,7 @@ This workaround is needed because `markdownlint` has no plans to add support for - Main headings are based on [UAB Research Computing services](https://www.uab.edu/it/home/research-computing/research-digital-marketplace) - Favor placing new pages and information into an existing section over creating - Approach documentation from a problem solving angle rather than a technology. Examples: - - Section title "Installing Software Yourself with Anaconda" vs "Anaconda" + - Section title "Installing Software Yourself with `conda`" vs "`conda`" - Section title "Running Analysis Jobs" vs "Slurm" - Put redirects for any page moves in case someone has bookmarked a page (see Redirect section below) diff --git a/docs/data_management/lts/interfaces.md b/docs/data_management/lts/interfaces.md index 8f8f854c1..68e15cc61 100644 --- a/docs/data_management/lts/interfaces.md +++ b/docs/data_management/lts/interfaces.md @@ -35,16 +35,16 @@ While globus is the recommended tool for most data transfers, command line tools ### Installation of `s3cmd` and `s5cmd` on Cheaha -To install the tools on Cheaha, you can request a compute node through Cheaha's [Open OnDemand web portal](../../cheaha/open_ondemand/ood_layout.md/#creating-an-interactive-job).Once your job is launched, open a terminal to execute the commands listed below. You do not need to install both tools if they aren't necessary. Both are available to install into [Anaconda](../../workflow_solutions/using_anaconda.md) environments. It's suggested to create a single environment named `s3` and install both s3cmd and s5cmd into it for easy access to both tools. Specific install and usage commands for each are given in their respective sections. 
You can create the general environment using the following commands: +To install the tools on Cheaha, you can request a compute node through Cheaha's [Open OnDemand web portal](../../cheaha/open_ondemand/ood_layout.md/#creating-an-interactive-job). Once your job is launched, open a terminal to execute the commands listed below. You do not need to install both tools if they aren't necessary. Both are available to install into [`conda`](../../workflow_solutions/using_conda.md) environments. It's suggested to create a single environment named `s3` and install both s3cmd and s5cmd into it for easy access to both tools. Specific install and usage commands for each are given in their respective sections. You can create the general environment using the following commands: ``` bash -module load Anaconda3 +module load Miniforge3 conda create -n s3 -c conda-forge pip s5cmd conda activate s3 pip install s3cmd ``` -Please note that the instructions mentioned above are specific to the Cheaha system. To transfer data between your personal computer and LTS, you will need to install `s3cmd` or `s5cmd` on your machine. Please refer to this [section](#installation-of-s3cmd-and-s5cmd-on-personal-systems-without-anaconda) for installation instructions specific to your operating system. +Please note that the instructions mentioned above are specific to the Cheaha system. To transfer data between your personal computer and LTS, you will need to install `s3cmd` or `s5cmd` on your machine. Please refer to this [section](#installation-of-s3cmd-and-s5cmd-on-personal-systems-without-conda) for installation instructions specific to your operating system. !!! note @@ -54,7 +54,7 @@ Please note that the instructions mentioned above are specific to the Cheaha sys ### s3cmd -s3cmd is a tool used for managing buckets and objects in Amazon S3 (Simple Storage Service). 
s3cmd is our suggested tool for operations such as listing buckets, managing bucket permissions, synchronizing directories with s3 buckets, and for small periodic file transfers. If high-speed transfer of a large files is required, we recommend using [s5cmd](#s5cmd). See the [preceding section](#command-line) for instructions on how to install both it and s5cmd into an Anaconda environment.
+s3cmd is a tool used for managing buckets and objects in Amazon S3 (Simple Storage Service). s3cmd is our suggested tool for operations such as listing buckets, managing bucket permissions, synchronizing directories with s3 buckets, and for small periodic file transfers. If high-speed transfer of large files is required, we recommend using [s5cmd](#s5cmd). See the [preceding section](#command-line) for instructions on how to install both it and s5cmd into a `conda` environment.

#### Configuring s3cmd

@@ -178,7 +178,7 @@ s3cmd info s3://

### s5cmd

-s5cmd is a parallel transfer tool suggested for period transfers of large and/or many files at a time. It has options for customizing how many processors are available for transferring data as well as how many chunks files can be broken into during transfer to minimize transfer time. See the [preceding section](#command-line) for instructions on how to install both it and s3cmd into an Anaconda environment
#### Configuring s5cmd @@ -249,9 +249,9 @@ It's important to note that the main functionality of s5cmd over s3cmd is the pa When setting the value for `--numworkers`, do not select a value beyond the number of CPUs you have requested for your job! This can cause high context switching (meaning individual CPUs are switching between multiple running processes) which can affect job performance for all jobs on a node. -### Installation of `s3cmd` and `s5cmd` on Personal Systems without Anaconda +### Installation of `s3cmd` and `s5cmd` on Personal Systems without `conda` -The installation instructions and software dependencies may differ depending on the operating system being used. Following are the installation instructions tested for different operating systems. You may also use [Anaconda](../../workflow_solutions/using_anaconda.md) to install either or both packages. +The installation instructions and software dependencies may differ depending on the operating system being used. Following are the installation instructions tested for different operating systems. You may also use [`conda`](../../workflow_solutions/using_conda.md) to install either or both packages. #### Ubuntu diff --git a/docs/data_management/lts/tutorial/individual_lts_tutorial.md b/docs/data_management/lts/tutorial/individual_lts_tutorial.md index 4e53d806c..94697b5b7 100644 --- a/docs/data_management/lts/tutorial/individual_lts_tutorial.md +++ b/docs/data_management/lts/tutorial/individual_lts_tutorial.md @@ -7,7 +7,7 @@ In this tutorial, we will guide you through using `s3cmd` on the Cheaha system t ## Prerequisites -To get up to speed, you should have a basic understanding of how to use the shell/terminal. If you’re not familiar with these concepts, we recommend checking out our [learning resources on basic shell usage](../../../workflow_solutions/shell.md/#shell-reference). +To get up to speed, you should have a basic understanding of how to use the shell/terminal. 
If you’re not familiar with these concepts, we recommend checking out our [learning resources on basic shell usage](../../../workflow_solutions/shell.md#shell-reference).

You will also need an individual LTS account created by our team. If you believe you need an account but do not have one, please [contact us](../../../index.md/#how-to-contact-us).

@@ -15,13 +15,13 @@ You will also need an individual LTS account created by our team. If you believe

### Install s3cmd within Conda Environment on Cheaha

-To interact with LTS (Long-Term Storage) using [S3 (Simple Storage Service)](https://aws.amazon.com/s3/), you need the `s3cmd` tool installed.[`s3cmd`](https://s3tools.org/s3cmd) is a command-line tool for managing files in cloud storage systems like S3. It's recommended to install it using `pip`, the standard package installer for Python, which allows you to install packages from the [Python Package Index (PyPI)](https://pypi.org/), within a [Conda environment](../../../workflow_solutions/using_anaconda.md/#create-an-environment) on Cheaha.
+To interact with LTS (Long-Term Storage) using [S3 (Simple Storage Service)](https://aws.amazon.com/s3/), you need the `s3cmd` tool installed. [`s3cmd`](https://s3tools.org/s3cmd) is a command-line tool for managing files in cloud storage systems like S3. It's recommended to install it using `pip`, the standard package installer for Python, which allows you to install packages from the [Python Package Index (PyPI)](https://pypi.org/), within a [Conda environment](../../../workflow_solutions/using_conda.md#create-an-environment) on Cheaha.

Please avoid using `conda install s3cmd`, as that version will not work as expected. Instead, follow the steps below to install `s3cmd` using `pip` within your Conda environment.

-First, access our interactive Open OnDemand (OOD) portal at [https://rc.uab.edu](https://rc.uab.edu) and create a job on Cheaha using one of our interactive applications.
For guidance, refer to our tutorial on [installing and setting Conda environment](../../../cheaha/tutorial/pytorch_tensorflow.md/#installing-anaconda-environments-using-the-terminal).
+First, access our interactive Open OnDemand (OOD) portal at [https://rc.uab.edu](https://rc.uab.edu) and create a job on Cheaha using one of our interactive applications. For guidance, refer to our tutorial on [installing and setting up a Conda environment](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal).

-Once your interactive apps session is launched, open the terminal as described in [step 5 of the Anaconda tutorial page](../../../cheaha/tutorial/pytorch_tensorflow.md/#installing-anaconda-environments-using-the-terminal) and run the below commands.
+Once your interactive apps session is launched, open the terminal as described in [step 5 of the Anaconda tutorial page](../../../cheaha/tutorial/pytorch_tensorflow.md#installing-conda-environments-using-the-terminal) and run the commands below.

```bash
-module load Anaconda3
+module load Miniforge3
@@ -36,29 +36,29 @@ Once these steps are completed, verify the installation by running `pip list | g

### Install s3cmd on Your Local Systems

-To install s3cmd on your local machine, please follow the instructions provided in [our s3cmd documentation for local installation](../../../data_management/lts/interfaces.md/#installation-of-s3cmd-and-s5cmd-on-personal-systems-without-anaconda).
+To install s3cmd on your local machine, please follow the instructions provided in [our s3cmd documentation for local installation](../../../data_management/lts/interfaces.md#installation-of-s3cmd-and-s5cmd-on-personal-systems-without-conda).

### Configuring s3cmd for LTS Buckets

Properly configuring `s3cmd` is important for working with LTS buckets and objects. The configuration process varies depending on whether you have a single LTS account or multiple accounts to manage.
In this section, we will provide a step-by-step guide tailored specifically for the **Cheaha** system and a researcher with an **individual LTS account**.

-Open a terminal using one of the interactive apps on Cheaha. Activate your conda environment created in the [Install s3cmd using within Conda Environment](./individual_lts_tutorial.md/#install-s3cmd-within-conda-environment-on-cheaha) section, and then run the below command:
+Open a terminal using one of the interactive apps on Cheaha. Activate your conda environment created in the [Install s3cmd within Conda Environment](./individual_lts_tutorial.md#install-s3cmd-within-conda-environment-on-cheaha) section, and then run the command below:

```bash
s3cmd --configure
```

-This will prompt you to enter the access key and secret key associated with your individual LTS account. You will be asked for additional information, which will be displayed on the screen, as shown below. You can copy the necessary details from the example provided [here](../interfaces.md/#configuring-s3cmd).
+This will prompt you to enter the access key and secret key associated with your individual LTS account. You will be asked for additional information, which will be displayed on the screen, as shown below. You can copy the necessary details from the example provided [here](../interfaces.md#configuring-s3cmd).

![image-s3cmd](../images/config-s3cmd.png)

-Once the configuration is complete, `s3cmd` will generate a `.s3cfg` file in your home directory (`$HOME`), as shown below. To find your home directory in Cheaha and view the `.s3cfg` file, follow the instructions on our [Navigating Open OnDemand](../../../cheaha/open_ondemand/ood_layout.md/#navigating-open-ondemand) page. Be sure to check the **Show Dotfiles** option in the top right corner to make hidden files visible.
+Once the configuration is complete, `s3cmd` will generate a `.s3cfg` file in your home directory (`$HOME`), as shown below.
To find your home directory in Cheaha and view the `.s3cfg` file, follow the instructions on our [Navigating Open OnDemand](../../../cheaha/open_ondemand/ood_layout.md#navigating-open-ondemand) page. Be sure to check the **Show Dotfiles** option in the top right corner to make hidden files visible.

![config-file](../images/s3cfg.png)

### Creating Buckets

-Long Term Storage (LTS) services like Amazon S3 use a flat data organization model based on **buckets** and **objects**. Think of buckets as folders that contain individual pieces of data called objects. We have documentation about basic terminology on s3 storage system [here](../index.md/#terminology).
+Long Term Storage (LTS) services like Amazon S3 use a flat data organization model based on **buckets** and **objects**. Think of buckets as folders that contain individual pieces of data called objects. We have documentation about basic terminology on the S3 storage system [here](../index.md#terminology).

-Once you have complete `s3cmd` configuration, you can create new buckets in your individual LTS storage. To create a bucket use a `mb` (make bucket) command:
+Once you have completed the `s3cmd` configuration, you can create new buckets in your individual LTS storage. To create a bucket, use the `mb` (make bucket) command:

@@ -70,9 +70,9 @@ Please replace `your-bucket-name` with your desired name. This command creates a

![image-bucket](../images/create-bucket.png)

-When creating a bucket, it is important to be aware of name uniqueness and naming conventions. For detailed information on bucket naming, please refer to our documentation on [valid bucket names in LTS](../lts_faq.md) and [avoiding duplicate names for buckets](../index.md/#avoiding-duplicate-names-for-buckets).
+When creating a bucket, it is important to be aware of name uniqueness and naming conventions. For detailed information on bucket naming, please refer to our documentation on [valid bucket names in LTS](../lts_faq.md) and [avoiding duplicate names for buckets](../index.md#avoiding-duplicate-names-for-buckets).
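As a concrete sketch of the naming workflow above (the bucket name below is hypothetical, and the commands assume a configured `s3cmd`), list your existing buckets first, then create one with an unused name:

```shell
# List the buckets already in your namespace so you can pick an unused name
s3cmd ls

# Create the new bucket ("my-unique-bucket-name" is a hypothetical example)
s3cmd mb s3://my-unique-bucket-name

# Confirm the new bucket now appears in the listing
s3cmd ls
```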
-If you try to create a bucket with `s3cmd mb` with the name that already exists within your namespace, the system will report success without making any changes. For example, if you run `s3cmd mb s3://existing-bucket-name` and that bucket name is already taken in your namespace, the command will complete successfully without creating a new bucket and showing an error. However, if you try to create a bucket with a name that is already used in someone else’s namespace, you will receive a `409 (BucketAlreadyExists)` error. To create a unique bucket within your namespace, first use `s3cmd ls` to list existing buckets and choose a name that is not already in use. To avoid the `409 (BucketAlreadyExists)` error and ensure your bucket name is unique, follow the [avoiding duplicate names for buckets](../index.md/#avoiding-duplicate-names-for-buckets) guide. This will help you create your bucket successfully and maintain its uniqueness.
+If you try to create a bucket with `s3cmd mb` using a name that already exists within your namespace, the system will report success without making any changes. For example, if you run `s3cmd mb s3://existing-bucket-name` and that bucket name is already taken in your namespace, the command will complete successfully without creating a new bucket or showing an error. However, if you try to create a bucket with a name that is already used in someone else’s namespace, you will receive a `409 (BucketAlreadyExists)` error. To create a unique bucket within your namespace, first use `s3cmd ls` to list your existing buckets and choose a name that is not already in use. To avoid the `409 (BucketAlreadyExists)` error, follow the [avoiding duplicate names for buckets](../index.md#avoiding-duplicate-names-for-buckets) guide.

### Managing Buckets

@@ -97,7 +97,7 @@ To manage a bucket, various commands can be used.
Below are some common `s3cmd`
Deleting objects and buckets cannot be undone. Once the delete command is entered, any data is lost permanently and cannot be restored.

-You can find a variety of `s3cmd` commands in our documentation at [here](../../lts/interfaces.md/#s3cmd-commands) and on the [S3tools website](https://s3tools.org/usage). For quick reference, you can also use the `s3cmd --help` command to view available options directly in your terminal.
+You can find a variety of `s3cmd` commands in our documentation [here](../../lts/interfaces.md#s3cmd-commands) and on the [S3tools website](https://s3tools.org/usage). For quick reference, you can also use the `s3cmd --help` command to view available options directly in your terminal.

-If you are continuing in the same session with your **conda environment already activated**, you can directly use the `s3cmd` commands. If you are starting a new session or returning at a later date, make sure to load the Anaconda module and activate your conda environment before using `s3cmd`.
+If you are continuing in the same session with your **conda environment already activated**, you can directly use the `s3cmd` commands. If you are starting a new session or returning at a later date, make sure to load the Miniforge module and activate your conda environment before using `s3cmd`.

@@ -105,7 +105,7 @@ If you are continuing in the same session with your **conda environment already

Managing access to your buckets is essential for both collaboration and security. By setting up specific [bucket policies](https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html), you can control who can view or modify your bucket’s contents. Follow these steps to grant access:

-- Create a policy file: define a policy and save it as a `JSON` file. For guidance and details on creating and formatting policy files, refer to our [create a policy structure guide](../policies.md/#policy-structure). For example, you might create a policy file named `my_policy.json` with read permissions.
+- Create a policy file: define a policy and save it as a `JSON` file. For guidance and details on creating and formatting policy files, refer to our [create a policy structure guide](../policies.md#policy-structure).
For example, you might create a policy file named `my_policy.json` with read permissions.
-- Apply the policy: Use the command like `s3cmd setpolicy policy_file.json s3://your-bucket-name` to apply your defined read policy to your bucket. Replace `policy_file.json` with the name of your policy file and `your-bucket-name` with the name of your bucket.
+- Apply the policy: Use a command like `s3cmd setpolicy policy_file.json s3://your-bucket-name` to apply your defined read policy to your bucket. Replace `policy_file.json` with the name of your policy file and `your-bucket-name` with the name of your bucket.
- Verify the policy update: After applying the policy, you should see a `Policy updated` message if the operation was successful. You can also verify the applied policy by running: `s3cmd info s3://your-bucket-name`.

@@ -117,10 +117,10 @@ Please note that the permissions granted are determined by the settings defined

- **Read-only Access**

-    To allow another account to view and copy files from your bucket without making any changes, use the [read only permission policy](../policies.md/#read-only-for-all-files).
+    To allow another account to view and copy files from your bucket without making any changes, use the [read only permission policy](../policies.md#read-only-for-all-files).

- **Read/Write Access**

-    To grant another account the ability to both view and modify the contents of your bucket, use the [read/write permissions policy](../policies.md/#read-write-permissions).
+    To grant another account the ability to both view and modify the contents of your bucket, use the [read/write permissions policy](../policies.md#read-write-permissions).

-For detailed information on LTS bucket policies and instructions on how to apply and remove bucket policies, please refer to our [policy structure](../policies.md/#policy-structure) and [apply bucket policy](../policies.md/#applying-a-policy) guides.
+For detailed information on LTS bucket policies and instructions on how to apply and remove bucket policies, please refer to our [policy structure](../policies.md#policy-structure) and [apply bucket policy](../policies.md#applying-a-policy) guides.
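To make the policy steps above concrete, here is a sketch of creating and applying a read-only policy file. The account ARN, bucket name, and exact policy fields are hypothetical illustrations; follow the policy structure guide for the authoritative format:

```shell
# Write a minimal AWS-style read-only policy to my_policy.json
# (the principal and bucket names below are hypothetical examples)
cat > my_policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": ["arn:aws:iam:::user/collaborator-account"]},
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
EOF

# Apply the policy to the bucket, then verify it took effect
s3cmd setpolicy my_policy.json s3://your-bucket-name
s3cmd info s3://your-bucket-name
```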
diff --git a/docs/data_management/storage.md b/docs/data_management/storage.md
index 18ad81562..ccc81b521 100644
--- a/docs/data_management/storage.md
+++ b/docs/data_management/storage.md
@@ -122,7 +122,7 @@ If you wish to discuss other alternatives tailored to your workflow, please [Con

## User Data and Home Directories

-Every user of Cheaha are given a storage space to store general data and data that can be used during active analysis. While there are no data retention policies in place, these spaces are not intended for long-term storage of data that changes infrequently. Traditionally, `$HOME` is intended to store scripts, supporting files, software configuration files, and toolboxes such as Anaconda virtual environments or R packages. In contrast, `$USER_DATA` is intended to store datasets and results for individual research projects, with access granted only to the user of that directory. Since the quotas for these directories are limited to 5TB, you may consider using [scratch](#scratch) space and/or [project directories](#project-directory) for storing, moving, and analyzing data.
+Every user of Cheaha is given a storage space for general data and for data used during active analysis. While there are no data retention policies in place, these spaces are not intended for long-term storage of data that changes infrequently. Traditionally, `$HOME` is intended to store scripts, supporting files, software configuration files, and toolboxes such as `conda` environments or R packages. In contrast, `$USER_DATA` is intended to store datasets and results for individual research projects, with access granted only to the user of that directory. Since the quotas for these directories are limited to 5TB, you may consider using [scratch](#scratch) space and/or [project directories](#project-directory) for storing, moving, and analyzing data.
## Project Directory

diff --git a/docs/education/case_studies.md b/docs/education/case_studies.md
index 558a14878..a03cb8a01 100644
--- a/docs/education/case_studies.md
+++ b/docs/education/case_studies.md
@@ -48,7 +48,7 @@ To install Parabricks using Singularity, load the `Singularity 3.x` module from

module load Singularity/3.5.2-GCC-5.4.0-2.26
```

-Go to the NGC catalog page and copy the image path to pull the desired containers of Parabricks using Singularity. Here, the generic container is pulled using Singularity. The image path is in “nvcr.io/nvidia/clara/clara-parabricks" and the tag is 4.2.0-1. The container image name `parabricks-4.2.0-1.sif` is an user-derived name.
+Go to the NGC catalog page and copy the image path to pull the desired containers of Parabricks using Singularity. Here, the generic container is pulled using Singularity. The image path is "nvcr.io/nvidia/clara/clara-parabricks" and the tag is 4.2.0-1. The container image name `parabricks-4.2.0-1.sif` is a user-derived name.

![!Parabricks container.](./images/parabricks_container.png)

diff --git a/docs/grants/res/uab-rc-facilities.txt b/docs/grants/res/uab-rc-facilities.txt
index 682b00e10..b304f8129 100644
--- a/docs/grants/res/uab-rc-facilities.txt
+++ b/docs/grants/res/uab-rc-facilities.txt
@@ -20,11 +20,11 @@ Cheaha provides researchers with both a web-based interface, via open OnDemand,

HIGH-PERFORMANCE COMPUTING (Cheaha) SOFTWARE TOOLS

-General research computing and scientific programming software are available on Cheaha, including Anaconda, R and RStudio, and MATLAB through the Lmod environment module system. RStudio, MATLAB, Jupyter Notebook server, and Jupyter Lab are all available on our Open OnDemand web portal as interactive applications, along with a general-use desktop environment via no-VNC, directly in the browser. Researchers are enabled to develop and share their own custom interactive applications through a sandbox application feature within Open OnDemand.
+General research computing and scientific programming software are available on Cheaha, including conda, R and RStudio, and MATLAB through the Lmod environment module system. RStudio, MATLAB, Jupyter Notebook server, and Jupyter Lab are all available on our Open OnDemand web portal as interactive applications, along with a general-use desktop environment via no-VNC, directly in the browser. Researchers can develop and share their own custom interactive applications through a sandbox application feature within Open OnDemand.

The UAB Center for Clinical and Translational Science (CCTS) Informatics group has installed and supports a variety of bioinformatics tools that are available to be run from Cheaha. Standalone packages are available for quality control (fastQC, Picard Tools), alignment (Abyss, Velvet, BWA, Bowtie) visualization (IGV), variant calling (GATK, SnpEff, annoVar), RNAseq (Cufflinks, Cuffdiff, TopHat) and microbiome and metagenomic analysis (QIIME, HUMAnN, MEGAN).

-Additional scientific domain-specific software is also available, including Relion for cryo-electron microscopy analysis, AFNI for fMRI analysis, and ANSYS for simulations for research efforts of the UAB School of Engineering. Many other software packages are installed and maintained, and we encourage and facilitate researchers installing their own additional software using Anaconda, R and MATLAB package management where possible.
+Additional scientific domain-specific software is also available, including Relion for cryo-electron microscopy analysis, AFNI for fMRI analysis, and ANSYS for simulations for research efforts of the UAB School of Engineering. Many other software packages are installed and maintained, and we encourage and facilitate researchers installing their own additional software using conda, R, and MATLAB package management where possible.
diff --git a/docs/help/support.md b/docs/help/support.md
index 7b5fef746..3ea7a318f 100644
--- a/docs/help/support.md
+++ b/docs/help/support.md
@@ -74,7 +74,7 @@ Please see our [Storage page](../data_management/storage.md) for more informatio

## How do I request new software installed?

-Before making a request for new software on Cheaha, please try searching our [modules](../cheaha/software/modules.md) or searching for packages on [Anaconda](../workflow_solutions/using_anaconda.md).
+Before making a request for new software on Cheaha, please try searching our [modules](../cheaha/software/modules.md) or searching for [`conda` packages](../workflow_solutions/using_conda.md).

If you are not able to find a suitable module or package and would like software installed on Cheaha, please [create a ticket](#how-do-i-create-a-support-ticket) with the name of the software, the version number, and a link to the installation instructions.

diff --git a/docs/index.md b/docs/index.md
index 6f7fb031e..bb6dfdc12 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -7,13 +7,11 @@ The Research Computing System (RCS) provides a framework for sharing research da

- _Application Development_: providing virtual machines and web-hosted development tools empowering researcher via our [cloud.rc](uab_cloud/index.md) fabric.

-!!! announcement
+!!! important

-    We have released new A100 gpus on Cheaha! For more information please see [GPUs](cheaha/slurm/gpu.md).
+    Anaconda has changed its Terms of Service. As a consequence, UAB employees are no longer allowed to use the Anaconda Distribution and Anaconda channels within the `conda` software. We are in the process of replacing Anaconda on Cheaha with Miniforge.

-    We have released new CUDA and cuDNN modules! For more information please see [CUDA Modules](cheaha/slurm/gpu.md#cuda-modules).
- - Also see our [A100 GPU Frequently Asked Questions (FAQ)](cheaha/slurm/gpu.md#frequently-asked-questions-faq-about-a100-gpus) + Read more about how this may affect you at our [Conda Migration FAQ](workflow_solutions/conda_migration_faq.md). ## How to Contact Us diff --git a/docs/uab_cloud/installing_software.md b/docs/uab_cloud/installing_software.md index d2ff4f64a..ff6916853 100644 --- a/docs/uab_cloud/installing_software.md +++ b/docs/uab_cloud/installing_software.md @@ -42,7 +42,7 @@ Most common software packages and NVIDIA drivers are available as `apt` packages If the software is available via `apt` then use `sudo apt install `. An example would be `sudo apt install git` to install git software. -If the software uses a custom installer, then follow the instructions provided by the software's documentation. An example would be [Miniconda](#installing-miniconda), where a shell script is downloaded and then executed using `bash installer.sh`. +If the software uses a custom installer, then follow the instructions provided by the software's documentation. An example would be [Miniforge](#installing-conda-via-miniforge), where a shell script is downloaded and then executed using `bash installer.sh`. ### Installing Server Software @@ -137,13 +137,17 @@ Below are a few examples of installing certain common softwares that may be usef 1. Find the line with "recommended" and install the package on that line with `sudo apt install nvidia-driver-###` 1. Reboot the instance -#### Installing Miniconda +#### Installing `conda` via Miniforge -Miniconda is a lightweight version of Anaconda. While Anaconda's base environment comes with Python, the Scipy stack, and other common packages pre-installed, Miniconda comes with no packages installed. This is an excellent alternative to the full Anaconda installation for environments where minimal space is available or where setup time is important. 
We recommend installing [Miniconda](https://docs.conda.io/en/latest/miniconda.html) on cloud.rc instances, as opposed to Anaconda, to conserve storage space. For more information on how to use Anaconda see the [Using Anaconda](../workflow_solutions/using_anaconda.md#using-anaconda). Need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md).
+Miniforge is a free and open-source (FOSS) alternative to Anaconda and Miniconda. If you are a UAB employee, do not use Anaconda or Miniconda, due to recent changes in Anaconda's licensing. See our [Conda Migration FAQ](../workflow_solutions/conda_migration_faq.md) to understand why.
+
+For more information on how to use `conda` see the [Using `conda` page](../workflow_solutions/using_conda.md#using-conda). If you need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using `conda` in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md).
+
+To install Miniforge in your [instance](tutorial/instances.md):

1. Run the commands in [Before Installing Software](#before-installing-software).
-1. `wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh`
-1. `bash Miniconda3-latest-Linux-x86_64.sh`
+1. `wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh`
+1. `bash Miniforge3-Linux-x86_64.sh`

#### Installing Singularity

@@ -207,9 +211,9 @@ To install, you will need the following pre-requisites. If you are unfamiliar wi

1. Run the commands in [Before Installing Software](#before-installing-software).
-1. A [Cloud Instance](tutorial/instances.md) with attached [Floating IP]network_setup_basic.md#floating-ips).
+1. A [Cloud Instance](tutorial/instances.md) with attached [Floating IP](network_setup_basic.md#floating-ips).
1. A [Security Group](tutorial/security.md#creating-a-security-group) for the intended Jupyter Server port. For the purposes of this tutorial, the port will be set to `9999`.
-1. [Miniconda installed](#installing-miniconda) on the instance.
Miniconda is a lightweight version of Anaconda.
+1. [`conda` installed](#installing-conda-via-miniforge) on the instance.

-Once the prerequisites are complete, the following steps must be performed to install and setup Jupyter Notebook Server. It is highly recommended to build an [Anaconda Environment](../workflow_solutions/using_anaconda.md#create-an-environment) using a reproducible [Environment File](../workflow_solutions/using_anaconda.md#creating-an-environment-from-a-yaml-file). The steps below belong to the official Jupyter documentation available at .
+Once the prerequisites are complete, the following steps must be performed to install and set up Jupyter Notebook Server. It is highly recommended to build a [`conda` Environment](../workflow_solutions/using_conda.md#create-an-environment) using a reproducible [Environment File](../workflow_solutions/using_conda.md#creating-an-environment-from-a-yaml-file). The steps below follow the official Jupyter documentation available at .

!!! warning

@@ -217,16 +221,13 @@ Once the prerequisites are complete, the following steps must be performed to in

Leaving your Jupyter Notebook Server unsecured may mean that other people on the UAB Campus Network are able to access your notebooks and other files stored on that cloud instance.

-1. [Install](../workflow_solutions/using_anaconda.md#install-packages) Jupyter Notebook Server using [Miniconda](../workflow_solutions/using_anaconda.md). You will need the following packages.
+1. [Install](../workflow_solutions/using_conda.md#install-packages) Jupyter Notebook Server using [`conda`](../workflow_solutions/using_conda.md). You will need the following packages.
- - `conda-forge` channel - - `notebook` - - `nb_conda_kernels` - - [Optional] `jupyter_contrib_nbextensions` - - `anaconda` channel - - `ipykernel` for python users - - `r-irkernel` for R users - - [Optional] `pip` + - `notebook` + - `nb_conda_kernels` + - `ipykernel` for python users + - `r-irkernel` for R users + - [Optional] `jupyter_contrib_nbextensions` 1. Because floating IPs are, by default, reachable by anyone on the UAB Campus Network, you'll need to secure the server using the steps below. 1. Generate a notebook config file using `jupyter notebook --generate-config`. [[official docs](https://jupyter-server.readthedocs.io/en/stable/operators/public-server.html#prerequisite-a-jupyter-server-configuration-file)] diff --git a/docs/workflow_solutions/conda_migration_faq.md b/docs/workflow_solutions/conda_migration_faq.md new file mode 100644 index 000000000..ed81a4931 --- /dev/null +++ b/docs/workflow_solutions/conda_migration_faq.md @@ -0,0 +1,86 @@ +# `conda` Migration FAQ + +## Why do I need to stop using Anaconda? + +In April, 2020, Anaconda changed from a free-for-everyone licensing model to a free-for-some licensing model. At that time, Anaconda was free to use by individuals for personal use, non-profit organizations of any size (including UAB), and for-profit organizations up to 200 employees. + +In March, 2024, Anaconda further restricted its licensing model. Anaconda is now free to use only for individuals for personal use, organizations up to 200 employees, and non-profit educational organizations when used by instructors and students in a curriculum-based course. + +Use of Anaconda by UAB employees for research purposes violates the Anaconda Terms of Service. + +## What counts as "Use of Anaconda"? + +- Downloading and installing Anaconda Software Distributions, including `anaconda` and `miniconda`. +- Using the `defaults`, `anaconda`, and `r` channels for packages. +- Using Anaconda Navigator. 
+ +Using the `conda` executable does not violate the terms of service, provided it is not used to access the channels listed above. + +## What is changing on Cheaha? + +We have installed Miniforge as a module. To use it, **run `module load Miniforge3`** wherever you would have used `module load Anaconda3`. At a future date, we plan to archive old `Anaconda3` modules and alias the most recent one to `Miniforge3`. When that has been completed, `module load Anaconda3` will emit a warning and then load the `Miniforge3` module instead. There will be ample notice as we roll out this change. + +## Do I need to learn any new technologies? + +No. However, to avoid violating the Anaconda Terms of Service, there are some actions you will need to take. + +## Does this impact my UAB owned laptop, desktop, workstation, or server? + +Yes. If you are currently using Anaconda channels or any part of the Anaconda Distribution for work purposes as an employee of UAB, then that use is in violation of the Anaconda Terms of Service, regardless of the device or computer. + +To remedy this situation, you will need to transition from Anaconda to Miniforge on the affected machines. For UAB-managed machines, please contact your IT representatives for assistance with this process. + +## What do I need to do to avoid violating the Terms of Service on Cheaha? + +- Replace `module load Anaconda3` with `module load Miniforge3` in your current projects. +- Remove `defaults`, `anaconda`, and `r` from your channel lists in environment YAML definition files. +- Stop using the `defaults`, `anaconda`, and `r` channels in `conda` commands. + - Avoid `-c defaults`, `-c anaconda`, and `-c r` as part of `conda install` commands. + - Avoid `conda install defaults::$package`, `... anaconda::$package`, and `... r::$package`. + - Instead use the `conda-forge` or `bioconda` channels in `conda install` commands to install packages. +- If you encounter any errors building environments, please contact support.
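The channel cleanup described above can also be scripted. The sketch below (the file name and its contents are hypothetical, for illustration only) uses `sed` to delete the restricted channel entries from an environment YAML file while leaving other channels such as `conda-forge` intact:

```shell
# Hypothetical environment file containing restricted channels
cat > env-example.yml <<'EOF'
name: test-env
channels:
  - anaconda
  - defaults
  - r
  - conda-forge
dependencies:
  - numpy=1.21.5
EOF

# Delete only the restricted channel lines; other channels are untouched
sed -i -e '/^ *- anaconda$/d' -e '/^ *- defaults$/d' -e '/^ *- r$/d' env-example.yml

cat env-example.yml
```

The cleaned file can then be used with `conda env create --file env-example.yml`.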
+ +## How can I migrate my existing environments? + +- Export existing environments using `conda env export --name $env_name > $env_name.yml` to produce a written record of the environment packages. +- Open the `$env_name.yml` file in a text editor. +- Using the text editor, remove the lines under `channels:` that read `- anaconda`, `- defaults`, and `- r`. +- Save the file. +- Reinstall the environment with Miniforge using the command `conda env create --file $env_name.yml`. + +If you encounter any errors, please contact support. + +## How can I install a new environment from a file? + + +!!! danger + + Only install environments from files coming from sources you trust. + + +- Obtain a copy of the file from its original source. +- Open the `$env_name.yml` file in a text editor. +- Using the text editor, remove the lines under `channels:` that read `- anaconda`, `- defaults`, and `- r`. You may include the `conda-forge` or `bioconda` channel names. +- Save the file. +- Install the environment with Miniforge using the command `conda env create --file $env_name.yml`. + +If you encounter any errors, please contact support. + +## What are good practices to minimize impacts in the future? + +- Record your packages and versions in environment YAML files to make your environments reproducible. +- Store your environment YAML files in a git repository on GitHub or GitLab to make your environments shareable and collaborative. + +## What do I do if I use Anaconda Navigator to build environments? + +At this time there does not appear to be a free-to-use alternative to Anaconda Navigator. You will need to use the terminal to create and manage environments. We have a tutorial and ample documentation covering this [here](./using_conda.md#create-an-environment). If you would like further assistance, please contact support. + +## What do all of the terms relating to `conda` mean?
+ +- **Anaconda** - An ambiguous term that may refer to the company, its package distribution channels, or its software distribution. Sometimes used to reference the package management software `conda`, though this is not correct. +- **Anaconda Inc.** - The for-profit company that created the well-known ecosystem for scientific python packages. Website: +- **Anaconda Distribution** - The system owned and maintained by Anaconda Inc. that distributes software packages through the `conda` software. +- **Anaconda Cloud** - Platform provided by Anaconda, Inc. that serves as a central repository and collaborative environment for data science projects. +- **`anaconda` channel** - A channel for delivering packages owned and maintained by Anaconda Inc. that is subject to the Anaconda Terms of Service. +- **`conda`** - The software used to manage environments and install packages from the Anaconda Distribution. +- **Miniconda** - A minimal installer for `conda`, maintained by Anaconda, Inc. diff --git a/docs/workflow_solutions/getting_containers.md b/docs/workflow_solutions/getting_containers.md index fe74b90ab..500f2fc50 100644 --- a/docs/workflow_solutions/getting_containers.md +++ b/docs/workflow_solutions/getting_containers.md @@ -159,33 +159,37 @@ plt.show() plt.savefig('testing.png') ``` -### Create a Dockerfile that has Miniconda Installed +### Create a Dockerfile that has Miniforge Installed -We require numpy, scipy, and matplotlib libraries to execute the above Python script. Following are the steps to create a specification file and build a container image. +We require numpy, scipy, and matplotlib libraries to execute the above Python script. The following are steps to create a specification file and build a container image. -1. Create an empty directory `miniconda`. +1. Create an empty directory `miniforge`. ```bash - mkdir miniconda + mkdir miniforge ``` -1. Create a `Dockerfile` within the `miniconda` directory with the following contents.
The file name `Dockerfile` is case-sensitive. +1. Create a `Dockerfile` within the `miniforge` directory with the following contents. The file name `Dockerfile` is case-sensitive. - ![!Containers create dockerfile.](./images/containers_create_dockerfile.png) + ```bash + nano Dockerfile + ``` + + ![!Containers create dockerfile.](./images/containers_create_dockerfileMF.png) ```bash - # You may start with a base image - # Always use a specific tag like "4.10.3", never "latest"! - # The version referenced by "latest" can change, so the build will be - # more stable when building from a specific version tag. - FROM continuumio/miniconda3:4.12.0 + # You may start with a base image + # Always use a specific tag like "24.3.0-0", never "latest"! + # The version referenced by "latest" can change, so the build will be + # more stable when building from a specific version tag. + FROM condaforge/miniforge-pypy3:24.3.0-0 - # Use RUN to execute commands inside the miniconda image + # Use RUN to execute commands inside the miniforge image RUN conda install -y numpy">=1.16.5, <1.23.0" - # RUN multiple commands together - # Last two lines are cleaning out the local repository and removing the state - # information for installed package + # RUN multiple commands together + # Last two lines are cleaning out the local repository and removing the state + # information for installed package RUN apt-get update \ && conda install -y scipy=1.7.3 \ && conda install -y matplotlib=3.5.1 \ @@ -193,42 +197,42 @@ We require numpy, scipy, and matplotlib libraries to execute the above Python sc && rm -rf /var/lib/apt/lists/* ``` - This is the specification file. It provides Docker with the software information, and versions, it needs to build our new container. See the Docker Container documentation for more information . + This is the specification file. It provides Docker with the software information, and versions, it needs to build our new container. 
In this case, we found the installation instructions via the GitHub page for the software container we want. See the Docker Container documentation for more information . - In the Dockerfile, we start with an existing container `continuumio/miniconda3:4.12.0`. This container is obtained from Dockerhub; here, `continuumio` is the producer, and the repo name is `continuumio/miniconda3`. + In this Dockerfile, we start with an existing container `condaforge/miniforge-pypy3`. This container is obtained from Dockerhub; here, `condaforge` is the producer, and the repo name is `condaforge/miniforge-pypy3`. - ![!Containers dockerhub miniconda.](./images/containers_dockerhub_miniconda.png) + ![!Container Tag and Producer.](./images/containers_produce_tag.png) - You may specify the required version from the `Tag` list. Here the tag/version is `4.12.0`. Also its a very good practice to specify the version of packages for numpy, scipy, and matplotlib for better reproducibility. + You may specify the required version from the `Tag` list for a software container. Here the tag/version is `24.3.0-0`. Also, it's a very good practice to specify the versions of the numpy, scipy, and matplotlib packages for better reproducibility. - !!! note "Containers and Reproducibiliy" - Always include version numbers for Anaconda, package managers, software you are installing, and the dependencies for those software. Containers are not by nature scientifically reproducible, but if you include versions for as much software in the container as possible, they can be reproducible years later. + !!! note "Containers and Reproducibility" + Always include version numbers for `conda`, package managers, software you are installing, and the dependencies for those software. Containers are not inherently scientifically reproducible, but they can be made reproducible for years if you include versions for as much software in the container as possible. -1.
To build your container, change the directory to `miniconda` and use the below syntax to build the `Dockerfile`. Here we use `.` to say "current directory." This will only work if you are in the directory with the `Dockerfile`. +1. To build your container, make sure you are in the same folder as your `Dockerfile`; otherwise, change the directory to `miniforge`. Then use the below syntax to build the `Dockerfile`. Here we use `.` to say "current directory." This will only work if you are in the directory with the `Dockerfile`. ```bash sudo docker build -t repository_name:tag . ``` - Here the repository_name is `py3-miniconda` and the tag is `2022-08`. + Here the repository_name can be `miniforge` and the tag is `24.8`. These are user-defined, so whatever you decide to use is fine; just make sure it helps you remember what image you created. ```bash - cd miniconda - sudo docker build -t py3-miniconda:2022-08 . + cd miniforge + sudo docker build -t miniforge:24.8 . ``` - ![!Containers build docker.](./images/containers_build_docker.png) + ![!Containers build docker.](./images/containers_build_dockerMF.png) !!! note - The `.` at the end of the command! This indicates that we're using the current directory as our build environment, including the Dockerfile inside. Also, you may rename the `repository_name` and `tag` as you prefer. + The `.` at the end of the command indicates that we're using the current directory as our build environment, as well as the Dockerfile inside. You may rename the `repository_name` and `tag` as you prefer. ```bash sudo docker images ``` -![!Containers miniconda docker image.](./images/containers_miniconda_docker_image.png) +![!Containers miniforge docker image.](./images/containers_miniforge_docker_imageMF.png) -### Running the Built Miniconda Docker Container Interactively +### Running the Built Miniforge Docker Container Interactively To run docker interactively and execute commands inside the container, use the below syntax.
Here `run` executes the command in a new container, and `-it` starts an interactive shell inside the container. After executing this command, the command prompt will change and move into the bash shell. @@ -236,24 +240,24 @@ To run docker interactively and execute commands inside the container, use the b sudo docker run -it repository_name:tag /bin/bash ``` -To execute your container `py3-miniconda` interactively, run this command with the tag `2022-08'. +To execute your container `miniforge` interactively, run this command with the tag `24.8`. ```bash -sudo docker run -it py3-miniconda:2022-08 /bin/bash +sudo docker run -it miniforge:24.8 /bin/bash cd /opt/conda/bin/ ``` The `python` executables to execute our synthetic python script are within the directory structure `/opt/conda/bin`. -![!Docker interactive.](./images/containers_docker_interactive.png) +![!Docker interactive.](./images/containers_docker_interactiveMF.png) -![!Python executable.](./images/containers_python_executable.png) +![!Python executable.](./images/containers_python_executableMF.png) ### Mounting Data Onto a Container -Before we mount data onto a container, remember you initially created the python script `python_test.py` when creating your own container. Move `python_test.py` within `miniconda` directory. Now you have your `miniconda/python_test.py` outside the container. To access the files outside the container you should mount the file path along with the `docker run` command. +Before we mount data onto a container, remember you initially created the python script `python_test.py` when creating your own container. Move `python_test.py` into the `miniforge` directory. Now you have your `miniforge/python_test.py` outside the container. To access the files outside the container you should mount the file path along with the `docker run` command. 
-![!Containers python script.](./images/containers_python_script.png) +![!Containers python script.](./images/containers_python_scriptMF.png) To mount a host directory into your docker container, use the `-v` flag. @@ -264,18 +268,18 @@ sudo docker run -v /host/directory/:/container/directory -other-options So the command for our example will be, ```bash -sudo docker run -v /home/ubuntu/:/home -it py3-miniconda:2022-08 /bin/sh +sudo docker run -v /home/ubuntu/:/home -it miniforge:24.8 /bin/sh ``` Here we are mounting the $HOME directory `/home/ubuntu` from a host into containers' $HOME directory. Note that you may mount a particular directory according to your preference. The following shows the list of files in containers' $HOME directory with and without mounting. Before mounting, there are no files found within the $HOME directory. -![!Containers before mounting.](./images/containers_before_mounting.png) +![!Containers before mounting.](./images/containers_before_mountingMF.png) -After mounting using `-v` flag, files show up within the $HOME directory. The highlighted `miniconda` is our working directory with python script. +After mounting using the `-v` flag, files show up within the $HOME directory. The highlighted `miniforge` is our working directory with the python script. -![!Containers after mounting.](./images/containers_after_mounting.png) +![!Containers after mounting.](./images/containers_after_mountingMF.png) We can now execute the script, python_test.py using this command.
python python_test.py ``` -![!Containers python script execution.](./images/containers_python_script_execution.png) +![!Containers python script execution.](./images/containers_python_script_executionMF.png) More lessons on Docker can be found in this link: [Introduction to Docker](https://christinalk.github.io/docker-introduction/) and [Docker Documentation](https://docs.docker.com/reference/dockerfile/). diff --git a/docs/workflow_solutions/images/containers_after_mounting.png b/docs/workflow_solutions/images/containers_after_mounting.png deleted file mode 100644 index 5c04a4e24..000000000 Binary files a/docs/workflow_solutions/images/containers_after_mounting.png and /dev/null differ diff --git a/docs/workflow_solutions/images/containers_after_mountingMF.png b/docs/workflow_solutions/images/containers_after_mountingMF.png new file mode 100644 index 000000000..59208f40d Binary files /dev/null and b/docs/workflow_solutions/images/containers_after_mountingMF.png differ diff --git a/docs/workflow_solutions/images/containers_before_mounting.png b/docs/workflow_solutions/images/containers_before_mounting.png deleted file mode 100644 index de98ec893..000000000 Binary files a/docs/workflow_solutions/images/containers_before_mounting.png and /dev/null differ diff --git a/docs/workflow_solutions/images/containers_before_mountingMF.png b/docs/workflow_solutions/images/containers_before_mountingMF.png new file mode 100644 index 000000000..6a8a8448a Binary files /dev/null and b/docs/workflow_solutions/images/containers_before_mountingMF.png differ diff --git a/docs/workflow_solutions/images/containers_build_dockerMF.png b/docs/workflow_solutions/images/containers_build_dockerMF.png new file mode 100644 index 000000000..4efe3570b Binary files /dev/null and b/docs/workflow_solutions/images/containers_build_dockerMF.png differ diff --git a/docs/workflow_solutions/images/containers_create_dockerfile.png b/docs/workflow_solutions/images/containers_create_dockerfile.png deleted 
file mode 100644 index 64038a9eb..000000000 Binary files a/docs/workflow_solutions/images/containers_create_dockerfile.png and /dev/null differ diff --git a/docs/workflow_solutions/images/containers_create_dockerfileMF.png b/docs/workflow_solutions/images/containers_create_dockerfileMF.png new file mode 100644 index 000000000..83d530867 Binary files /dev/null and b/docs/workflow_solutions/images/containers_create_dockerfileMF.png differ diff --git a/docs/workflow_solutions/images/containers_docker_interactiveMF.png b/docs/workflow_solutions/images/containers_docker_interactiveMF.png new file mode 100644 index 000000000..016a5f8b4 Binary files /dev/null and b/docs/workflow_solutions/images/containers_docker_interactiveMF.png differ diff --git a/docs/workflow_solutions/images/containers_miniforge_docker_imageMF.png b/docs/workflow_solutions/images/containers_miniforge_docker_imageMF.png new file mode 100644 index 000000000..c0a998da9 Binary files /dev/null and b/docs/workflow_solutions/images/containers_miniforge_docker_imageMF.png differ diff --git a/docs/workflow_solutions/images/containers_produce_tag.png b/docs/workflow_solutions/images/containers_produce_tag.png new file mode 100644 index 000000000..101701e43 Binary files /dev/null and b/docs/workflow_solutions/images/containers_produce_tag.png differ diff --git a/docs/workflow_solutions/images/containers_python_executableMF.png b/docs/workflow_solutions/images/containers_python_executableMF.png new file mode 100644 index 000000000..90c4ff5da Binary files /dev/null and b/docs/workflow_solutions/images/containers_python_executableMF.png differ diff --git a/docs/workflow_solutions/images/containers_python_scriptMF.png b/docs/workflow_solutions/images/containers_python_scriptMF.png new file mode 100644 index 000000000..fa0d24f33 Binary files /dev/null and b/docs/workflow_solutions/images/containers_python_scriptMF.png differ diff --git a/docs/workflow_solutions/images/containers_python_script_executionMF.png 
b/docs/workflow_solutions/images/containers_python_script_executionMF.png new file mode 100644 index 000000000..8ef93854f Binary files /dev/null and b/docs/workflow_solutions/images/containers_python_script_executionMF.png differ diff --git a/docs/workflow_solutions/r_environments.md b/docs/workflow_solutions/r_environments.md index 3de4a4aab..302ce7703 100644 --- a/docs/workflow_solutions/r_environments.md +++ b/docs/workflow_solutions/r_environments.md @@ -1,6 +1,6 @@ # R Projects and Environments -When working on multiple projects, it's likely that different sets of external analysis packages and their dependencies will be needed for each project. Managing these different projects is simple in something like [Anaconda](using_anaconda.md) by creating a different virtual environment for each project, but this functionality is not fully built into RStudio by default. +When working on multiple projects, it's likely that different sets of external analysis packages and their dependencies will be needed for each project. Managing these different projects is simple in something like [`conda`](using_conda.md) by creating a different virtual environment for each project, but this functionality is not fully built into RStudio by default. Instead, we suggest to take advantage of [R Projects](https://support.posit.co/hc/en-us/articles/200526207-Using-RStudio-Projects) and the [renv](https://rstudio.github.io/renv/articles/renv.html) package to keep environments separate for each project you start. diff --git a/docs/workflow_solutions/shell.md b/docs/workflow_solutions/shell.md index ede71400f..e7f46aacd 100644 --- a/docs/workflow_solutions/shell.md +++ b/docs/workflow_solutions/shell.md @@ -17,7 +17,7 @@ The internet has thousands of guides for using the shell. 
Rather than devise our There are also additional resources that aid in learning and verifying shell commands and scripts: - [Explain Shell](https://explainshell.com/): An educational tool providing detailed explanations of individual commands in relatively reasonably-plain English. This tool doesn't explain what a command does at a high level nor its purpose or intent, only the details of the parts making up the command. -- [ShellCheck](https://www.shellcheck.net/): An online tool for conducting static analysis checks on shell scripts. The Git repository for this tool can be found [here](https://github.com/koalaman/shellcheck ) and it can also be installed via [Anaconda]( https://anaconda.org/conda-forge/shellcheck). +- [ShellCheck](https://www.shellcheck.net/): An online tool for conducting static analysis checks on shell scripts. The Git repository for this tool can be found [here](https://github.com/koalaman/shellcheck) and it can also be installed via [`conda`](https://anaconda.org/conda-forge/shellcheck) using the conda-forge channel. At the shell prompt, you can also use the command `curl cheat.sh/` to get a simple-to-understand explanation of what the command does and how to use it (see [curl](#download-files-from-internet-sources-curl)). Below is an example for the [pwd command](#show-working-directory-pwd). diff --git a/docs/workflow_solutions/using_anaconda.md b/docs/workflow_solutions/using_anaconda.md deleted file mode 100644 index c603b123a..000000000 --- a/docs/workflow_solutions/using_anaconda.md +++ /dev/null @@ -1,390 +0,0 @@ -# Anaconda - -Python is a high level programming language that is widely used in many branches of science. As a result, many scientific packages have been developed in Python, leading to the development of a package manager called Anaconda. Anaconda is the standard in Python package management for scientific research. 
- -Benefits of Anaconda: - -- Shareability: environments can be shared via human-readable text-based YAML files. -- Maintainability: the same YAML files can be version controlled using git. -- Repeatability: environments can be rebuilt using those same YAML files. -- Simplicity: dependency matrices are computed and solved by Anaconda, and libraries are pre-built and stored on remote servers for download instead of being built on your local machine. -- Ubiquity: nearly all Python developers are aware of the usage of Anaconda, especially in scientific research, so there are many resources available for learning how to use it, and what to do if something goes wrong. - -Anaconda can also install Pip and record which Pip packages are installed, so Anaconda can do everything Pip can, and more. - - -!!! important - - If using Anaconda on Cheaha, please see our [Anaconda on Cheaha page](../cheaha/software/software.md#anaconda-on-cheaha) for important details and restrictions. - - -## Using Anaconda - -Anaconda is a package manager, meaning it handles all of the difficult mathematics and logistics of figuring out exactly what versions of which packages should be downloaded to meet your needs, or inform you if there is a conflict. - -Anaconda is structured around environments. Environments are self-contained collections of researcher-selected packages. Environments can be changed out using a simple package without requiring tedious installing and uninstalling of packages or software, and avoiding dependency conflicts with each other. Environments allow researchers to work and collaborate on multiple projects, each with different requirements, all on the same computer. Environments can be installed from the command line, from pre-designed or shared YAML files, and can be modified or updated as needed. - -The following subsections detail some of the more common commands and use cases for Anaconda usage. 
More complete information on this process can be found at the [Anaconda documentation](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#). Need some hands-on experience, you can find instructions on how to install PyTorch and TensorFlow using Anaconda in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). - - -!!! important - - If using Anaconda on Cheaha, please see our [Anaconda on Cheaha page](../cheaha/software/software.md#anaconda-on-cheaha) for important details and restrictions. - - -### Create an Environment - -In order to create a basic environment with the default packages, use the `conda create` command: - -```bash -# create a base environment. Replace with an environment name -conda create -n -``` - -If you are trying to replicate a pipeline or analysis from another person, you can also recreate an environment using a YAML file, if they have provided one. To replicate an environment using a YAML file, use: - -```bash -# replicate an environment from a YAML file named env.yml -conda create -n -f -``` - -By default, all of your conda environments are stored in `/home//.conda/envs`. - -### Activate an Environment - -From here, you can activate the environment using either `source` or `conda`: - -```bash -# activate the virtual environment using source -source activate - -# or using conda -conda activate -``` - -To know your environment has loaded, the command line should look like: - -```text -() [BlazerID@c0XXX ~]$ -``` - -Once the environment is activated, you are allowed to install whichever python libraries you need for your analysis. - -### Install Packages - -To install packages using Anaconda, use the `conda install` command. The `-c` or `--channel` command can be used to select a specific package channel to install from. The `anaconda` channel is a curated collection of high-quality packages, but the very latest versions may not be available on this channel. 
The `conda-forge` channel is more open, less carefully curated, and has more recent versions. - -```bash -# install most recent version of a package -conda install - -# install a specific version -conda install =version - -# install from a specific conda channel -conda install -c <=version> -``` - -Generally, if a package needs to be downloaded from a specific conda channel, it will mention that in its installation instructions. - -#### Installing Packages with Pip - -Some packages are not available through Anaconda. Often these packages are available via [PyPI](https://pypi.org/) and thus using the Python built-in Pip package manager. Pip may also be used to install locally-available packages as well. - - -!!! important - Make sure `pip` is installed within the `conda` environment and use it for installing packages within the `conda` environment to prevent [Pip related issues](../cheaha/open_ondemand/ood_jupyter.md#pip-installs-packages-outside-of-environment). - - -```bash -# install most recent version of a package -pip install \ - -# install a specific version, note the double equals sign -pip install \==version - -# install a list of packages from a text file -pip install -r packages.txt -``` - -#### Finding Packages - -You may use the [Anaconda page](https://anaconda.org/) to search for packages on Anaconda, or use Google with something like ` conda`. To find packages in PyPI, either use the [PyPI page](https://pypi.org/) to search, or use Google with something like ` pip`. - -#### Packages for Jupyter - -For more information about using Anaconda with Jupyter, see the section [Working with Anaconda Environments](../cheaha/open_ondemand/ood_jupyter.md#working-with-anaconda-environments). - -### Update packages in an environment - -To ensure packages and their dependencies are all up to date, it is a best practice to regularly update installed packages, and libraries in your activated environment. 
- -```bash - -conda update -—all - -``` - -### Deactivating an Environment - -An environment can be deactivated using the following command. - -```bash -# Using conda -conda deactivate -``` - -Anaconda may say that using `source deactivate` is deprecated, but environment will still be deactivated. - -Closing the terminal will also close out the environment. - -### Deleting an Environment - -To delete an environment, use the following command. Remember to replace `` with the existing environment name. - -```bash - -conda env remove —-name - -``` - -### Working with Environment YAML Files - -#### Exporting an Environment - -To easily share environments with other researchers or replicate it on a new machine, it is useful to create an environment YAML file. You can do this using: - -```bash -# activate the environment if it is not active already -conda activate - -# export the environment to a YAML file -conda env export > env.yml -``` - -#### Creating an Environment from a YAML File - -To create an environment from a YAML file `env.yml`, use the following command. - -```bash -conda env create --file env.yml -``` - -#### Sharing your environment file - -To share your environment for collaboration, there are primarily 3 ways to export environments, the below commands show how to create environment files that can be shared for replication. Remember to replace `` with the existing environment name. - -1. Cross-Platform Compatible - - ```bash - - conda env export --from-history > .yml - - ``` - -1. Platform + Package Specific - - Create .yml file to share, replace `` (represents the name of your environment) and `` (represents the name of the file you want to export) with preferred names for file. - - ```bash - - conda env export > .yml - - ``` - -1. 
Platform + Package + Channel Specific - - ```bash - - conda list —-explicit > .txt - # OR - conda list —-explicit > .yml - - ``` - -#### Replicability versus Portability - -An environment with only `python 3.10.4`, `numpy 1.21.5` and `jinja2 2.11.2` installed will output something like the following file when `conda env export` is used. This file may be used to precisely replicate the environment as it exists on the machine where `conda env export` was run. Note that the versioning for each package contains two `=` signs. The code like `he774522_0` after the second `=` sign contains hyper-specific build information for the compiled libraries for that package. Sharing this exact file with collaborators may result in frustration if they do not have the exact same operating system and hardware as you, and they would not be able to build this environment. We would say that this environment file is not very portable. - -There are other portability issues: - -- The `prefix: C:\...` line is not used by `conda` in any way and is deprecated. It also shares system information about file locations which is potentially sensitive information. -- The `channels:` group uses `- defaults`, which may vary depending on how you or your collaborator has customized their Anaconda installation. It may result in packages not being found, resulting in environment creation failure. 
- -```yaml -name: test-env -channels: - - defaults -dependencies: - - blas=1.0=mkl - - bzip2=1.0.8=he774522_0 - - ca-certificates=2022.4.26=haa95532_0 - - certifi=2021.5.30=py310haa95532_0 - - intel-openmp=2021.4.0=haa95532_3556 - - jinja2=2.11.2=pyhd3eb1b0_0 - - libffi=3.4.2=h604cdb4_1 - - markupsafe=2.1.1=py310h2bbff1b_0 - - mkl=2021.4.0=haa95532_640 - - mkl-service=2.4.0=py310h2bbff1b_0 - - mkl_fft=1.3.1=py310ha0764ea_0 - - mkl_random=1.2.2=py310h4ed8f06_0 - - numpy=1.21.5=py310h6d2d95c_2 - - numpy-base=1.21.5=py310h206c741_2 - - openssl=1.1.1o=h2bbff1b_0 - - pip=21.2.4=py310haa95532_0 - - python=3.10.4=hbb2ffb3_0 - - setuptools=61.2.0=py310haa95532_0 - - six=1.16.0=pyhd3eb1b0_1 - - sqlite=3.38.3=h2bbff1b_0 - - tk=8.6.11=h2bbff1b_1 - - tzdata=2022a=hda174b7_0 - - vc=14.2=h21ff451_1 - - vs2015_runtime=14.27.29016=h5e58377_2 - - wheel=0.37.1=pyhd3eb1b0_0 - - wincertstore=0.2=py310haa95532_2 - - xz=5.2.5=h8cc25b3_1 - - zlib=1.2.12=h8cc25b3_2 -prefix: C:\Users\user\Anaconda3\envs\test-env -``` - -To make this a more portable file, suitable for collaboration, some planning is required. Instead of using `conda env export` we can build our own file. Create a new file called `env.yml` using your favorite text editor and add the following. Note we've only listed exactly the packages we installed, and their version numbers, only. This allows Anaconda the flexibility to choose dependencies which do not conflict and do not contain unusable hyper-specific library build information. - -```yaml -name: test-env -channels: - - anaconda -dependencies: - - jinja2=2.11.2 - - numpy=1.21.5 - - python=3.10.4 -``` - -This is a much more readable and portable file suitable for sharing with collaborators. We aren't quite finished though! Some scientific packages on the `conda-forge` channel, and on other channels, can contain dependency errors. Those packages may accidentally pull a version of a dependency that breaks their code. 
- -For example, the package `markupsafe` made a not-backward-compatible change (a breaking change) to their code between `2.0.1` and `2.1.1`. Dependent packages expected `2.1.1` to be backward compatible, so their packages allowed `2.1.1` as a substitute for `2.0.1`. Since Anaconda chooses the most recent version allowable, package installs broke. To work around this for our environment, we would need to modify the environment to "pin" that package at a specific version, even though we didn't explicitly install it. - -```yaml -name: test-env -channels: - - anaconda -dependencies: - - jinja2=2.11.2 - - markupsafe=2.0.1 - - numpy=1.21.5 - - python=3.10.4 -``` - -Now we can be sure that the correct versions of the software will be installed on our collaborator's machines. - - -!!! note - - The example above is provided only for illustration purposes. The error has since been fixed, but the example above really happened and is helpful to explain version pinning. - - -#### Good Practice for Finding Software Packages on Anaconda - -Finding Anaconda software packages involves searching through the available “Channels” and repositories to locate the specific packages that contain functions that you need for your environment. Channels are Anaconda's way of organizing packages. Channels instruct Anaconda where to look for packages when installation is to be done. The following are Anaconda Channels that are readily used to house majority of the packages used in scientific research. Anaconda, Conda-Forge, BioConda, other Channels also exist. If you want more information on Anaconda Channels please see their [docs](https://docs.anaconda.com/). - -In the sections below, you will see information on how to find key packages you intend to use, ensure the packages are up-to-date, figure out the best way to install them, and finally compose an environment file for portability and replicability. 
- -##### Step-by-Step Guide to Finding Anaconda Software Packages - -If we find the package at one of the Channel sources mentioned above, we can check the Platform version to ensure it is either "noarch" (if available) or linux. After noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on Anaconda or PyPI. If so, the package is being maintained on Anaconda/PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI). We prefer searching using the following methods, and usually have the most success in the order listed below. - -- Using Google: You may already be familiar with the exact Anaconda package name you require. In the event this is not the case, a simple web engine search with key words usually finds the package. For example, a web search for an Anaconda package would be something along the lines of “Anaconda package for `Generic Topic Name`”. Your search results, should return popular package names related to the topic you have searched for. In the sections below, there is an attempt to provide a detailed step-by-step guide on how to find Anaconda packages using “numpy” as an example. - -- Anaconda Cloud: Anaconda Cloud is the primary source for finding Anaconda packages. You can visit [Anaconda Cloud](https://anaconda.org/) and use the search bar to find the package you need. For example, when you get the package name from your web search (using numpy). You will enter name of the package in the search bar as shown below. - -![!Landing page of anaconda.org showing search](images/anaconda_search.png) - -Review results of your search, it is advised to use “Artifacts” that are compatible with the platform you are working with, as well as have the most “Favorites” and “Downloads” numbers. 
Click on the portion that contains the name of the package (highlighted 3 in the image below). 1 highlights the Artifact, Favorite and Downloads numbers, the selection 2 highlights the Channel where this package is stored. - -![!Anaconda.org page showing download statistics](images/anaconda_channel_package.png) - -Follow the installation instructions you see in the image below. - -![!Anaconda.org page showing package installation instructions](images/install_anaconda_package.png) - -- Using the Conda Search Command: You can use the `conda search ` command directly in your terminal to find packages. Replace `` with the package you would like to search for. To do this on Cheaha, make sure to `module load Anaconda3` first, and follow the instructions to [activate](#activate-an-environment) an environment. Then do `conda search numpy`. You should get a long list of numpy packages. Review this output, but take note of the highlighted portions in the image. The section with a red selection shows the numpy versions that are available, The section with a blue selection shows the channel where each numpy version is stored. - -![!Search output from using conda search in Terminal](images/channel_conda_search.png) - -You can then install numpy with a specific version and from a specific channel with. - -```bash - conda install -c conda-forge numpy=2.0.0rc2 -``` - -- Using Specific Channels: You can also get packages using specific Anaconda Channels listed below. - - - Anaconda Main Channel: The default channel provided by Anaconda, Inc. Visit [Anaconda](https://anaconda.org) - - - Conda-Forge: A community-driven channel with a wide variety of packages.Visit [Conda-Forge](https://conda-forge.org/) - - - Bioconda: A channel specifically for bioinformatics packages. 
Visit [Bioconda](https://bioconda.github.io/) - -You can specify a channel in your search, and it will show you a list of the packages available in that channel using `conda search -c `, remember to replace and with the channel and package names you are searching for respectively. - -```bash - conda search -c conda-forge numpy -``` - -If we find the package at one of these sources, we check the Platform version to ensure it is either noarch (if available) or linux for it to work on Cheaha ("noarch" is usually preferred for the sake of portability). Noting the version, we can click the "source" or "repo" link (if available) or "homepage". Then we try to find the latest version. For a package found on GitHub, click "Releases" on the right-hand side. Verify that the latest Release is the same as, or very close to, the version on Anaconda or PyPI. If so, the package is being maintained on Anaconda/PyPI and suitable for use. Note the exact software name, version, and channel (if not on PyPI). - -![!Github page for numpy, an Anaconda package](images/github_conda_releases.png) - -If we don't find a package using Google, or the Anaconda/PyPI pages are out of date, then it may become very hard to use the software in an Anaconda environment. It is possible to try installing a git repository using pip, but care must be taken to choose the right commit or tag. You can find more [info here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). To search for a git repository try: - -1. github "name". -1. gitlab "name". - -Remember to replace name with name of Anaconda package. - - -!!! note - -There are issues with out-of-date software. It may have bugs that have since been fixed and so makes for less reproducible science. Documentation may be harder to find if it isn't also matched to the software version. Examining the README.md file for instructions may provide some good information on installing the package. 
You can also reach out to us for [support](../help/support.md) in installing a package. - - -When we have a complete list of Anaconda packages and Channels, then we can create an environment from scratch with all the dependencies included. For Anaconda packages, add one line to dependencies for each software. For PyPI packages add - pip: under dependencies. Then under - pip:add `==` to pin the version, see below. The advantage to using an environment file is that it can be stored with your project in GitHub or GitLab, giving it all the benefits of [version control](./git_collaboration.md). - -```yaml -name: test-env -dependencies: - - anaconda::matplotlib=3.8.4 # Pinned version from anaconda channel - - conda-forge::python=3.10.4 # Pinned version from conda-forge channel - - pip - - pip: - - numpy==1.26.4 # Pinned version for pip - - git+https://github.com/user/repo.git # Example of installing from a Git repo - - http://insert_package_link_here # For URL links -``` - - For git repos, add them under `- pip:` based on examples [here](https://pip.pypa.io/en/stable/cli/pip_install/#examples). See the section [Replicability versus Portability](#replicability-versus-portability) for more information. - -The above configuration is only for illustration purposes, to show how channels and dependencies can be used. It is best to install all of your packages from conda channels, to avoid your environment breaking. Only packages that are unavailable via conda, should be installed via pip. If you run into challenges please [contact us](../index.md#how-to-contact-us). - -##### Key Things To Remember - -1. Exploring Package Documentation: For each package, check the documentation to understand its features, version history, and compatibility. Documentation can often be found on the Anaconda Cloud package page under the "Documentation" or "Homepage" link shared above in this tutorial. - -1. 
Regularly consider updating your environment file to manage dependencies and maintain compatible software environments. Also newer software tends to resolve older bugs, consequently improving the state of science. - -1. Verify Package Version and Maintenance: Ensure you are getting the latest version of the package that is compatible with your environment. Verify that the package is actively maintained by checking the source repository (e.g., GitHub, GitLab). Look for recent commits, releases, and issue resolutions. The concepts of version pinning and semantic versioning, explain this in detail. - -##### Version Pinning - -Version pinning in Anaconda environments involves specifying exact versions of packages to ensure consistency and compatibility. This practice is crucial for reproducibility, as it allows environments to be reproduced exactly, a critical component in research and collaborative projects. Version pinning also aids stability, by preventing unexpected changes that could break your environment, code or analysis. This practice also maintains compatibility between different packages that rely on specific dependencies. To implement version pinning, you can create a YAML file that lists the exact versions of all installed packages or specify versions directly when [creating](#create-an-environment) or updating environments using Conda commands. - -##### Semantic Versioning - -[Semantic versioning](https://semver.org) is a versioning scheme using a three-part format (MAJOR.MINOR.PATCH) to convey the significance of changes in a software package. In Anaconda environments, it plays a role in managing compatibility, version pinning, dependency resolution, and updating packages. The MAJOR version indicates incompatible API changes, i.e. same software package but operation and interaction are mostly different from what you are accustomed to in the previous version. The MINOR version adds backward-compatible functionality, i.e. 
same version of software package but now contains new features and functionality. Operations and interactions are still mostly the same. While PATCH version includes backward-compatible bug fixes, i.e. same major and minor versions now have a slight change, perhaps a bug or some small change, still same features, operations and interactions, just the minor bug fix. Using semantic versioning helps maintain consistency and compatibility by ensuring that updates within the same major version are compatible, and by allowing precise control when specifying package versions. - -In practice, updating a Major version of a package may break your workflow, but may increase software reliability, stability and fix bugs affecting your science. Changing the major version may also introduce new bugs, these concerns and some others are some of the tradeoffs that have to be taken into consideration. Semantic versioning helps with managing Anaconda environments by facilitating precise [version pinning](#version-pinning) and dependency resolution. For instance, you can pin specific versions using Conda commands or specify version ranges to ensure compatibility as shown in the examples above. Semantic versioning also informs upgrade strategies, letting us know when to upgrade packages based on the potential impact of changes. By leveraging semantic versioning, you can maintain stable and consistent environments, which is essential for smooth research workflows. - -#### Good Software Development Practice - -Building on the example above, we can bring in good software development practices to ensure we don't lose track of how our environment is changing as we develop our software or our workflows. If you've ever lost a lot of hard work by accidentally deleting an important file, or forgetting what changes you've made that need to be rolled back, this section is for you. - -Efficient software developers live the mantra "Don't repeat yourself". 
Part of not repeating yourself is keeping a detailed and meticulous record of changes made as your software grows over time. [Git](git_collaboration.md) is a way to have the computer keep track of those changes digitally. Git can be used to save changes to environment files as they change over time. Remember that each time your environment changes to commit the output of [Exporting your Environment](#exporting-an-environment) to a repository for your project. diff --git a/docs/workflow_solutions/using_conda.md b/docs/workflow_solutions/using_conda.md new file mode 100644 index 000000000..b820ad9d9 --- /dev/null +++ b/docs/workflow_solutions/using_conda.md @@ -0,0 +1,321 @@ +# Why use `conda`? + + +!!! important + + Recent changes to the Anaconda Terms of Service have required all UAB researchers to change how they use `conda`. See our [Conda Migration FAQ](conda_migration_faq.md) for more information. + + +Python is a high level programming language that is widely used in many branches of science. As a result, many scientific software packages have been developed in Python, leading to the development of a package manager called `conda`. `conda` is the most popular and widely-supported Python package management software for scientific research. + +Benefits of `conda`: + +- Shareability: environments can be shared via human-readable, text-based YAML files. +- Maintainability: the same YAML files can be version controlled using git. +- Repeatability: environments can be rebuilt using those same YAML files. Libraries are pre-built and stored on remote servers for download instead of being built on your local machine or on Cheaha, so two computers with the same operating system, requesting the same package version, will end up using the same executable. 
+- Simplicity: dependency matrices are computed and solved by `conda`. +- Ubiquity: nearly all Python developers are familiar with `conda`, especially in scientific research, so there are many resources available for learning how to use it, and what to do if something goes wrong. +- Open-source: Does not include any proprietary packages, adhering strictly to open-source principles. + +`conda` can also install `pip` and record which `pip` packages are installed, so `conda` can do everything Pip can, and more. + + +!!! important + + If using `conda` on Cheaha, please see our [`conda` on Cheaha page](../cheaha/software/software.md#conda-on-cheaha) for important details and restrictions. + + +## Important Terms + +- **`conda`**: Refers to the executable software program that researchers interact with to create and manage environments and install packages. +- **Conda**: Refers to a software distribution containing `conda` and related software and features. +- **package**: Research-related software installed and managed by `conda`, held in environments. Packages are selected from channels and downloaded from remote data servers. +- **environment**: A collection of packages that `conda` can manage. Users can switch between environments to allow for development of multiple projects that have different requirements. +- **YAML file**: A structured, human-friendly file defining a single environment. Sharing the file with others allows for replication of an environment. These files enhance collaboration when added to your project's [version control](../workflow_solutions/git.md), especially when shared on [GitHub or GitLab](../workflow_solutions/git_collaboration.md). YAML stands for [YAML Ain't Markup Language](https://yaml.org/). +- **channel**: A listing of packages available for download. + - The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. 
+ - The `conda-forge` and `bioconda` channels are free to use. +- **version**: A string of numbers and dots `.` denoting the version of a package. Often these are structured like `1.2.3` and most of the time follow [Semantic Versioning](https://semver.org/) conventions, but not always. Larger numbers indicate more recent versions. Some are structured using dates instead like `2024.08`, with more recent dates indicating more recent versions. + + +!!! note + + We use CAPITAL LETTERS to denote where you will need to replace text with your own values, such as `ENVIRONMENT`, `PACKAGE`, `CHANNEL`, and `VERSION`. + + CAPITAL LETTERS prefixed by a dollar sign `$` are shell variables and do not need to be substituted. + + +## Using `conda` + +`conda` is a package manager, meaning it handles all of the difficult mathematics and logistics of figuring out exactly what versions of which packages should be downloaded to meet your needs, or informs you if there is a conflict. + +`conda` is structured around environments. Environments are self-contained collections of researcher-selected packages. Environments can be switched with a single command, without requiring tedious installing and uninstalling of packages or software, and without dependency conflicts between environments. Environments allow researchers to work and collaborate on multiple projects, each with different requirements, all on the same computer. Environments can be installed from the command line, from pre-designed or shared YAML files, and can be modified or updated as needed. + +The following subsections detail some of the more common commands and use cases for `conda` usage. More complete information on this process can be found at the [`conda` documentation](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#). Need some hands-on experience? 
You can find instructions on how to install PyTorch and TensorFlow using `conda` in this [tutorial](../cheaha/tutorial/pytorch_tensorflow.md). + + +!!! important + + If using `conda` on Cheaha, please see our [`conda` on Cheaha page](../cheaha/software/software.md#conda-on-cheaha) for important details and restrictions. + + +### Create an Environment + +To create an empty environment that you can install packages into, use the `conda create` command. + +```bash +# Create an empty environment. +conda create --name ENVIRONMENT +``` + +If you are trying to replicate a pipeline or analysis from another person, you can also recreate an environment using a YAML file, if one was provided. + +```bash +# Replicate an environment from a YAML file named environment.yml. +conda env create --file environment.yml +``` + +On Cheaha all of your conda environments are stored in `/home/$USER/.conda/envs`, by default. + +### Activate an Environment + +From here, you can activate the environment using the `conda activate` command. + +```bash +conda activate ENVIRONMENT +``` + +When your environment has loaded, your terminal prompt should change to look similar to the following. + +```text +(ENVIRONMENT) [BlazerID@c0000 ~]$ +``` + +Once the environment is activated, you are able to install any Python libraries needed for your analysis. + +### Install Packages + +To install packages using `conda`, use the `conda install` command. + + +!!! important + + The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use. + + +```bash +# Install from default channels. NOT recommended! +conda install PACKAGE # most recent version possible +conda install PACKAGE=VERSION # specified version + +# Install from a specified channel. Recommended! 
+conda install CHANNEL::PACKAGE # most recent version possible +conda install CHANNEL::PACKAGE=VERSION # specified version +``` + +#### Installing Packages with Pip + +When building a `conda` environment, prefer to get all of your packages through `conda` channels to maximize compatibility. Some packages are not available through `conda` channels. Often these packages are available via [PyPI](https://pypi.org/) and may be installed using the Pip package manager. Pip may also be used to install locally-available packages, and directly from GitHub and GitLab repositories. + + +!!! important + + When using `conda` and `pip` on Cheaha, make sure you are using a custom `conda` environment and that `pip` is installed before installing `pip` packages to prevent severe [`pip` related issues](../cheaha/software/software.md#installing-pip-packages-outside-of-your-environments). + + + +!!! warning + + There are some hard-to-diagnose errors that occur when installing packages using `pip` on Windows. The errors occur for Python versions between `3.10.4` and `3.10.8`, and may impact others in the `3.10` series. To maximize shareability, it is recommended to avoid those versions of Python, if possible. The issue does not appear to occur with Python `3.10.14`. + + +```bash +# Install packages using pip. +pip install PACKAGE # most recent version possible +pip install PACKAGE==VERSION # specified version, note the `==` +pip install -r packages.txt # multiple packages from a list in a text file +``` + +#### Finding Packages + + +!!! important + + The `anaconda` and `r` channels are subject to the Anaconda Terms of Service and may not be used for UAB business. The `conda-forge` and `bioconda` channels are free to use. + + +To find packages available on `conda` channels, use a search engine like Google. Start by searching for `PACKAGE conda-forge`. Replace `PACKAGE` with the name of the package. You might also try `bioconda` instead of `conda-forge`. 
If the package has a name shared with non-software products or ideas, you may need to add `software` or `research`, or both, to the end of your search string. You can also search on [anaconda.org](https://anaconda.org/), but be sure the package you find is not from a channel (the anaconda, default, or r channels) subject to the Anaconda Terms of Service. + +For packages in PyPI, repeat the process above but use `pypi` in place of `conda-forge` in the search string, or search directly on [pypi.org](https://pypi.org/). + +#### Packages for Jupyter + +For more information about using `conda` with Jupyter, see the section [Working with `conda` Environments](../cheaha/open_ondemand/ood_jupyter.md#working-with-conda-environments). + +### Update Packages in an Environment + +In research, there is a balance to be struck between keeping software up-to-date and ensuring replicability of outputs. Updating software regularly ensures you have the most recent bug fixes and the highest level of security. Not updating software means you can be sure the software will behave consistently across all of your data. + +When coming up with a software analysis strategy, carefully consider the following questions. + +- What parts of my workflow can be done all at once after experiments are done? +- What parts of my workflow must be done as data is acquired? +- What are the specific benefits of updating a software package? + - Fixing a bug that causes incorrect output? + - Major security holes patched? +- What are the costs of updating? + - Will I have to re-run some or all of my analyses? + - Will I have to update other parts of my workflow code? +- Will I have to update other packages, and what will those impacts be? +- Does a particular update change outputs? Why did the output change? + +To perform an update on the currently [activated](#activate-an-environment) environment, use the `conda update` command. 
+ +```bash +conda update PACKAGE # updates to the most recent version possible +conda update PACKAGE=VERSION # updates (or downgrades) to a specific version +conda update --all # updates all packages to the most recent version possible +``` + +### Deactivating an Environment + +An environment can be deactivated using the following command. + +```bash +conda deactivate +``` + +Closing the terminal will also close out the environment. + +### Deleting an Environment + +To delete an environment, use the following command. + +```bash +conda env remove --name ENVIRONMENT +``` + +### Working with Environment YAML Files + +#### Exporting an Environment + +To easily share environments with other researchers or replicate them on a new machine, it is useful to create an environment YAML file. + +```bash +# activate the environment if it is not active already +conda activate ENVIRONMENT + +# export the environment to a YAML file +conda env export > environment.yml +``` + +#### Creating an Environment from a YAML File + +To create an environment from a YAML file `environment.yml`, use the following command. + +```bash +conda env create --file environment.yml +``` + +#### Sharing Your Environment File + +To share your environment for collaboration, there are three ways to export environments. + +```bash +# Cross-platform compatible. +conda env export --name ENVIRONMENT --from-history > environment.yml + +# Platform and package specific. +conda env export --name ENVIRONMENT > environment.yml + +# Platform, package, and channel specific. Note the output is an explicit +# spec list, which is a plain text file rather than a YAML file. +conda list --name ENVIRONMENT --explicit > spec-file.txt +``` + +#### Replicability versus Portability + +An environment with only `python 3.10.4`, `numpy 1.21.5` and `jinja2 2.11.2` installed will output something like the following file when `conda env export` is used. This file may be used to precisely replicate the environment as it exists on the machine where `conda env export` was run. 
Note that the versioning for each package contains two `=` signs. The code like `he774522_0` after the second `=` sign contains hyper-specific build information for the compiled libraries for that package. Sharing this exact file with collaborators may result in frustration if they do not have the exact same operating system and hardware as you, and they would not be able to build this environment. We would say that this environment file is not very portable. + +There are other portability issues: + +- The `prefix: C:\...` line is not used by `conda` in any way and is deprecated. It also shares system information about file locations which is potentially sensitive information. +- The `channels:` list must include the correct locations where your packages can be found. This may vary depending on how you or your collaborator has customized their `conda` installation, and may result in packages not being found, causing environment creation failure. + +```yaml +name: test-env +channels: + - conda-forge +dependencies: + - blas=1.0=mkl + - bzip2=1.0.8=he774522_0 + - ca-certificates=2022.4.26=haa95532_0 + - certifi=2021.5.30=py310haa95532_0 + - intel-openmp=2021.4.0=haa95532_3556 + - jinja2=2.11.2=pyhd3eb1b0_0 + - libffi=3.4.2=h604cdb4_1 + - markupsafe=2.1.1=py310h2bbff1b_0 + - mkl=2021.4.0=haa95532_640 + - mkl-service=2.4.0=py310h2bbff1b_0 + - mkl_fft=1.3.1=py310ha0764ea_0 + - mkl_random=1.2.2=py310h4ed8f06_0 + - numpy=1.21.5=py310h6d2d95c_2 + - numpy-base=1.21.5=py310h206c741_2 + - openssl=1.1.1o=h2bbff1b_0 + - pip=21.2.4=py310haa95532_0 + - python=3.10.4=hbb2ffb3_0 + - setuptools=61.2.0=py310haa95532_0 + - six=1.16.0=pyhd3eb1b0_1 + - sqlite=3.38.3=h2bbff1b_0 + - tk=8.6.11=h2bbff1b_1 + - tzdata=2022a=hda174b7_0 + - vc=14.2=h21ff451_1 + - vs2015_runtime=14.27.29016=h5e58377_2 + - wheel=0.37.1=pyhd3eb1b0_0 + - wincertstore=0.2=py310haa95532_2 + - xz=5.2.5=h8cc25b3_1 + - zlib=1.2.12=h8cc25b3_2 +prefix: C:\Users\user\miniforge3\envs\test-env +``` + +To 
make this a more portable file, suitable for collaboration, some planning is required. Instead of using `conda env export` we can build our own file. Create a new file called `environment.yml` using your favorite text editor and add the following. Note we've listed only the packages we explicitly installed, and their version numbers. This allows `conda` the flexibility to choose dependencies which do not conflict and do not contain unusable hyper-specific library build information. + +```yaml +name: test-env +channels: + - conda-forge +dependencies: + - jinja2=2.11.2 + - numpy=1.21.5 + - python=3.10.4 +``` + +This is a much more readable and portable file suitable for sharing with collaborators. We aren't quite finished though! Some scientific packages on the `conda-forge` channel, and on other channels, can contain dependency errors. Those packages may accidentally pull a version of a dependency that breaks their code. + +For example, the package `markupsafe` made a not-backward-compatible change (a breaking change) to their code between `2.0.1` and `2.1.1`. Dependent packages expected `2.1.1` to be backward compatible, so their packages allowed `2.1.1` as a substitute for `2.0.1`. Since `conda` chooses the most recent version allowable, package installs broke. To work around this for our environment, we would need to modify the environment to "pin" that package at a specific version, even though we didn't explicitly install it. + +```yaml +name: test-env +channels: + - conda-forge +dependencies: + - jinja2=2.11.2 + - markupsafe=2.0.1 + - numpy=1.21.5 + - python=3.10.4 +``` + +Now we can be sure that the correct versions of the software will be installed on our collaborators' machines. + +It is important to be aware that by generalizing the YAML file in this way, the results you and your collaborator each generate may be different. This could be due to the previously-mentioned difference in hardware and operating system. 
If precise replication is required, more effort may be needed, such as using [Containers](getting_containers.md#create-your-own-docker-container) to ensure a consistent operating system environment. + + +!!! note + + The example above is provided only for illustration purposes. The error has since been fixed, but the example above really happened and is helpful to explain version pinning. + + +#### Good Software Development Practice + +Building on the example above, we can bring in good software development practices to ensure we don't lose track of how our environment is changing as we develop our software or our workflows. If you've ever lost a lot of hard work by accidentally deleting an important file, or forgetting what changes you've made that need to be rolled back, this section is for you. + +Efficient software developers live the mantra "Don't repeat yourself". Part of not repeating yourself is keeping a detailed and meticulous record of changes made as your software grows over time. [Git](git_collaboration.md) is a way to have the computer keep track of those changes digitally. Git can be used to save changes to environment files as they change over time. Remember to commit the output of [Exporting your Environment](#exporting-an-environment) to your project's repository each time your environment changes.
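The habit described above can be sketched as a short shell session. This is only an illustration, assuming `git` is installed; the throwaway repository location, the environment file contents, and the commit message are all hypothetical stand-ins for your own project.

```shell
# Create a throwaway repository purely for illustration.
repo="$(mktemp -d)"
cd "$repo"
git init --quiet

# Stand-in for the output of `conda env export > environment.yml`.
cat > environment.yml <<'EOF'
name: test-env
channels:
  - conda-forge
dependencies:
  - python=3.10.4
EOF

# Each time the environment changes, re-export and commit the file.
# The -c flags supply a throwaway identity for this demo commit.
git add environment.yml
git -c user.name="demo" -c user.email="demo@example.com" \
    commit --quiet -m "Pin python=3.10.4 in conda environment"
git log --oneline
```

Over time, the commit history records exactly when each package was added, pinned, or updated, giving you a trail to roll back to if an environment change breaks your workflow.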
diff --git a/mkdocs.yml b/mkdocs.yml index afd0a4bac..59d969ccf 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -63,7 +63,7 @@ plugins: data_management/LTS/sharing.md: data_management/lts/policies.md data_management/lts/lts.md: data_management/lts/index.md data_management/lts/sharing.md: data_management/lts/policies.md - environment_management/anaconda_environments.md: workflow_solutions/using_anaconda.md + environment_management/anaconda_environments.md: workflow_solutions/using_conda.md environment_management/containers.md: workflow_solutions/getting_containers.md environment_management/git.md: workflow_solutions/git_collaboration.md uab_cloud/cloud_remote_access.md: uab_cloud/remote_access.md @@ -96,9 +96,9 @@ nav: - Storage Alternatives: data_management/alternate_storage.md - Long Term Storage: - data_management/lts/index.md - - Tutorial: - - data_management/lts/tutorial/index.md - - LTS and s3cmd Workflow: data_management/lts/tutorial/individual_lts_tutorial.md + - Tutorial: + - data_management/lts/tutorial/index.md + - LTS and s3cmd Workflow: data_management/lts/tutorial/individual_lts_tutorial.md - Interfacing with LTS: data_management/lts/interfaces.md - Bucket Permissions: data_management/lts/policies.md - UAB Core Accounts: data_management/lts/lts_cores.md @@ -110,7 +110,8 @@ nav: - Code Storage: data_management/code_storage.md - Workflow Solutions: - Using the Shell: workflow_solutions/shell.md - - Using Anaconda: workflow_solutions/using_anaconda.md + - Using Conda: workflow_solutions/using_conda.md + - Conda Migration FAQ: workflow_solutions/conda_migration_faq.md - Using Workflow Managers: workflow_solutions/using_workflow_managers.md - Using Git: workflow_solutions/git.md - R Projects and Environments: workflow_solutions/r_environments.md @@ -120,7 +121,8 @@ nav: - Getting Started: cheaha/getting_started.md - Tutorials: - cheaha/tutorial/index.md - - Anaconda Environment Tutorial: cheaha/tutorial/pytorch_tensorflow.md + - Conda Environment Tutorial: 
cheaha/tutorial/pytorch_tensorflow.md + - Conda Good Practice: cheaha/tutorial/conda_good_practice.md - Cheaha Web Portal: - cheaha/open_ondemand/index.md - Using the Web Portal: cheaha/open_ondemand/ood_layout.md