Generation Strategies
Visualize how different text generation strategies select the next token
How This Works
This tool shows real probability distributions from GPT-2. After tokenizing your prompt, the model outputs logits (raw scores) for all 50,257 tokens in its vocabulary. We apply softmax normalization to convert these logits to probabilities, then illustrate how different sampling strategies would select the next token. Hover over the icons below for detailed mathematical explanations!
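The pipeline above (logits → softmax → strategy-dependent token selection) can be sketched in a few lines. This is a minimal NumPy illustration, not the tool's actual code: the toy 4-element logit vector stands in for GPT-2's 50,257-dimensional output, and the `greedy` and `top_k_sample` helpers are hypothetical names for two common strategies.

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability, then normalize
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def greedy(probs):
    # Greedy decoding: always select the highest-probability token
    return int(np.argmax(probs))

def top_k_sample(probs, k, rng):
    # Top-k sampling: keep the k most likely tokens, renormalize, sample
    top = np.argsort(probs)[-k:]
    p = probs[top] / probs[top].sum()
    return int(rng.choice(top, p=p))

# Toy logits standing in for the model's full vocabulary scores
logits = np.array([2.0, 1.0, 0.5, -1.0])
probs = softmax(logits)

rng = np.random.default_rng(0)
print(greedy(probs))             # index of the most likely token
print(top_k_sample(probs, 2, rng))  # a sample from the two most likely tokens
```

Greedy decoding is deterministic, while top-k sampling introduces controlled randomness by restricting the draw to the k highest-probability tokens, which is why the two strategies can pick different tokens from the same distribution.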