
[Bug] Model configuration for "gpt-4.1-mini" is missing the "Streaming" feature toggle #1969

@bluecoralphuocvuu

Plugin Name

OpenAI

Description

I am using the OpenAI plugin (or a compatible provider) in Dify. For the model listed as `gpt-4.1-mini` (which I believe corresponds to OpenAI's `gpt-4o-mini`), the "Streaming" toggle is missing from the "Parameters" settings panel.

However, for other models in the same list (e.g., gpt-5-mini), the Streaming toggle appears and functions correctly.

According to OpenAI documentation, the gpt-4o-mini model fully supports streaming. It seems the model configuration/schema in Dify is missing the streaming feature flag for this specific model ID.
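As a sanity check outside Dify, a minimal streamed request against the public Chat Completions API can confirm the model itself streams. This is a sketch, not part of the plugin code; it only sends the request if an API key is configured in the environment:

```python
import json
import os
import urllib.request

# Minimal streamed-completion payload for the OpenAI Chat Completions API.
# The model name is the one from this issue; "stream": true is the capability
# the missing Dify toggle would control.
payload = {
    "model": "gpt-4.1-mini",
    "stream": True,
    "messages": [{"role": "user", "content": "Say hello."}],
}

def build_request(api_key: str) -> urllib.request.Request:
    """Build the POST request; kept separate so the payload can be inspected."""
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Only send if a key is available; streamed responses arrive as SSE
# "data: {...}" lines, one delta chunk per line.
if os.environ.get("OPENAI_API_KEY"):
    with urllib.request.urlopen(build_request(os.environ["OPENAI_API_KEY"])) as resp:
        for raw in resp:
            print(raw.decode().rstrip())
```

If the API streams chunks back for this model (it does, per OpenAI's docs), the limitation is purely in the Dify model definition, not the upstream API.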

Expected Behavior

The "Streaming" toggle (True/False) should be visible and selectable for the gpt-4.1-mini model, similar to other chat models.

Actual Behavior

The "Streaming" toggle is completely hidden in the UI for gpt-4.1-mini.

Screenshots

  1. gpt-4.1-mini (Streaming toggle missing): (screenshot)
  2. gpt-5-mini (Streaming toggle available, for comparison): (screenshot)

Additional Context

  • Looking at the UI, it seems the backend configuration for gpt-4.1-mini may be missing a `features: - streaming` entry (or a `supports_streaming: true` flag) in the model definition file.
  • Please update the model definition to enable streaming for this model.
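For reference, a sketch of what the fix might look like, assuming the plugin's model schema declares capabilities under a `features` list as in other model YAMLs in the repository. The flag name here is illustrative and should be verified against the schema used by models where the toggle already appears (e.g. gpt-5-mini):

```yaml
# models/openai/models/llm/gpt-4.1-mini.yaml (excerpt, illustrative)
model: gpt-4.1-mini
model_type: llm
features:
  - agent-thought
  - streaming        # hypothetical flag name; confirm against the plugin schema
model_properties:
  mode: chat
```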

Official Website URL

https://github.com/langgenius/dify-official-plugins/blob/main/models/openai/models/llm/gpt-4.1-mini.yaml
