
GPU Execution Fails in Official Colab Notebook ([CUDA] XlaCallModule Error) #6

@Fiza-bracketsltd

Description

The official Colab notebook for the Derm Foundation model, quick_start_with_hugging_face.ipynb, fails on a GPU runtime with the following error:

NotFoundError: Graph execution error:

Detected at node XlaCallModule defined at (most recent call last):

The current platform CUDA is not among the platforms required by the module: [CPU]
[[{{node XlaCallModule}}]]
tf2xla conversion failed while converting _inference_2174[]. Run with TF_DUMP_GRAPH_PREFIX=/path/to/dump/dir and --vmodule=xla_compiler=2 to obtain a dump of the compiled functions.
[[StatefulPartitionedCall/StatefulPartitionedCall]] [Op:__inference_signature_wrapper_inference_fn_5294]


This occurs when loading the model from Hugging Face and attempting inference on GPU. The same notebook works on CPU, but GPU execution is broken.

Steps to Reproduce:

  1. Open the official Colab notebook: [Link to Notebook]
  2. Change runtime to GPU (Runtime → Change runtime type → T4 GPU).
  3. Run the notebook until model inference.
  4. Error occurs at 'output = infer(inputs=tf.constant([input_tensor]))'.
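For reference, a minimal sketch of the call site in step 4. The input-building details are paraphrased from the notebook, so the feature key and image size below are assumptions, and `infer` stands for the loaded serving signature:

```python
import tensorflow as tf

# Build a serialized tf.train.Example as the notebook does before inference.
# The "image/encoded" key and 448x448 size are assumptions here; the official
# notebook may differ slightly.
image_bytes = tf.io.encode_png(tf.zeros([448, 448, 3], dtype=tf.uint8)).numpy()
example = tf.train.Example(features=tf.train.Features(feature={
    "image/encoded": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[image_bytes])),
}))
input_tensor = example.SerializeToString()

# On a GPU runtime, the next line is where NotFoundError is raised, because
# the SavedModel's XlaCallModule payload lists only [CPU] as a platform:
# output = infer(inputs=tf.constant([input_tensor]))
```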

Troubleshooting Attempted:

  • Forced CPU execution (works, but too slow).
  • Verified CUDA/cuDNN compatibility (GPU is detected).
  • Tested on multiple Colab instances.
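Until a GPU-compatible export is available, the "forced CPU execution" workaround above can be sketched as follows (a sketch, not an official fix; it must run before any other TensorFlow op in the notebook):

```python
import tensorflow as tf

# Hide all GPUs from TensorFlow so every op, including the CPU-only
# XlaCallModule, is placed on CPU. This must be the first TensorFlow call
# in the notebook: set_visible_devices raises RuntimeError once the runtime
# has already initialized its devices.
tf.config.set_visible_devices([], "GPU")

print(tf.config.get_visible_devices("GPU"))  # []
```

Alternatively, wrapping only the inference call in `with tf.device("/CPU:0"):` keeps the rest of the pipeline on GPU; hiding the GPU entirely is simply the more conservative option.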

Urgency:
This blocks GPU-accelerated workflows for researchers and clinicians using the official notebook. Please:

  1. Clarify if GPU support is intended for this model.
  2. Provide a GPU-compatible version or fix the Colab notebook.
  3. Update the Hugging Face model card with GPU compatibility details.
