
Run with Docker

  1. Install Docker (with GPU Support)

    Ensure that Docker is installed and configured with GPU support. Follow these steps:

    • Install Docker if not already installed.
    • Install the NVIDIA Container Toolkit to enable GPU support.
    • Verify the setup with:
      docker run --rm --gpus all nvidia/cuda:12.1.0-base-ubuntu22.04 nvidia-smi
  2. Pull the base Docker image used by this repository's Dockerfile

    docker pull nvcr.io/nvidia/pytorch:25.09-py3
  3. Clone this repository and cd into it

    git clone https://github.com/NVIDIA-Digital-Bio/RNAPro
    cd ./RNAPro
  4. Run Docker with an interactive shell

    docker run --gpus all -it -v "$(pwd)":/workspace nvcr.io/nvidia/pytorch:25.09-py3 /bin/bash
  5. Install RNAPro

    pip install -e .

After completing the steps above, you can train and run inference with RNAPro inside the container.
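Before launching a training job, it can help to confirm that the container actually sees a GPU. The snippet below is a minimal sketch (the `cuda_ready` helper is hypothetical, not part of RNAPro); it assumes PyTorch is present, as it is in the `nvcr.io/nvidia/pytorch` base image:

```python
# Hypothetical sanity check: verify PyTorch is importable and a GPU is visible.
import importlib.util


def cuda_ready() -> bool:
    """Return True if PyTorch can be imported and reports at least one CUDA device."""
    # Guard the import so the check degrades gracefully outside the container.
    if importlib.util.find_spec("torch") is None:
        return False
    import torch

    return torch.cuda.is_available() and torch.cuda.device_count() > 0


if __name__ == "__main__":
    print("GPU ready:", cuda_ready())
```

If this prints `GPU ready: False` inside the container, re-check the NVIDIA Container Toolkit installation and make sure `--gpus all` was passed to `docker run`.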