Demonstrates Entropy-based Dynamic Temperature (EDT) sampling compared to fixed temperature.
Run with `node entropy.mjs`. Three prompts comparing fixed T=0.7 vs EDT:
| Prompt Type | Fixed T=0.7 | EDT | Why EDT Helps |
|---|---|---|---|
| Factual "2+2" | Uses T=0.7 (wasteful randomness) | Uses T≈0.04 | Model is confident, don't add noise |
| Creative story | T=0.7 (ok) | T varies 0.3-0.9 | Adapts: confident words low T, uncertain words high T |
| Technical explanation | T=0.7 (higher-entropy output) | T≈0.5 (lower-entropy output) | Stays focused on known facts |
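Temperature itself works by dividing the logits by T before the softmax: low T sharpens the distribution toward the top token, high T flattens it. A minimal sketch (with hypothetical logits, not output from the demo):

```javascript
// Softmax over logits scaled by temperature T.
// Lower T sharpens the distribution; higher T flattens it.
function softmaxWithTemperature(logits, T) {
  const scaled = logits.map((x) => x / T);
  const max = Math.max(...scaled); // subtract max for numerical stability
  const exps = scaled.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Sample a token index from the resulting distribution.
function sampleIndex(probs) {
  let r = Math.random();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1;
}

// Hypothetical logits with one clearly dominant token.
const logits = [4.0, 1.0, 0.5, 0.1];
console.log(softmaxWithTemperature(logits, 0.04)); // nearly one-hot
console.log(softmaxWithTemperature(logits, 0.7));  // noticeably spread out
```

At T≈0.04 the distribution is effectively greedy, which is why EDT wastes no randomness on confident predictions.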
T = T₀ · N^(θ/Entropy)
- T₀ = 1.0 (maximum temperature)
- N = 0.8 (base)
- θ = 1.5 (scale factor)
- Entropy measured in nats
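The formula can be sketched directly in JavaScript. The distributions below are hypothetical, and the defaults follow the parameter values above (T₀=1.0, N=0.8, θ=1.5); since N < 1, low entropy drives the exponent up and T toward 0, while high entropy pushes T toward T₀:

```javascript
// Shannon entropy in nats of a probability distribution.
function entropyNats(probs) {
  return -probs.reduce((h, p) => (p > 0 ? h + p * Math.log(p) : h), 0);
}

// EDT: T = T0 * N^(theta / H).
// Low entropy  -> large exponent -> T near 0 (trust the model).
// High entropy -> small exponent -> T near T0 (explore alternatives).
function edtTemperature(probs, { T0 = 1.0, N = 0.8, theta = 1.5 } = {}) {
  const H = entropyNats(probs);
  if (H === 0) return 0; // fully confident: effectively greedy decoding
  return T0 * Math.pow(N, theta / H);
}

// Hypothetical next-token distributions over a 4-token vocabulary.
const confident = [0.97, 0.01, 0.01, 0.01]; // low entropy
const uncertain = [0.25, 0.25, 0.25, 0.25]; // maximum entropy (H = ln 4)
console.log(edtTemperature(confident)); // small T
console.log(edtTemperature(uncertain)); // T approaching T0
```

This reproduces the mapping in the table below: the confident distribution gets a small T, the uniform one a T near the ceiling T₀.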
| Entropy | Temperature | Rationale |
|---|---|---|
| Low (confident) | Low | Trust the model |
| High (uncertain) | Higher | Explore alternatives |
Counter-intuitive takeaway: when the model already knows the answer, don't add randomness.
- EDT Paper - Zhang et al. 2024
- EAGER - Entropy-aware inference scaling
- Locally Typical Sampling - Information-theoretic approach