Feature/top p sampling #1360

Merged: 19 commits merged into Lightning-AI:main on May 3, 2024

Conversation

belerico (Contributor):

This PR adds nucleus sampling (a.k.a. top-p sampling) as described in https://arxiv.org/abs/1904.09751.
In top-p sampling the next token is drawn from the smallest set of highest-probability tokens whose cumulative probability exceeds the top-p threshold.
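
For context, here is a minimal sketch of that sampling rule in plain PyTorch; the helper name sample_top_p and the example logits are illustrative only and are not the PR's actual implementation:

import torch

def sample_top_p(logits: torch.Tensor, top_p: float) -> torch.Tensor:
    # Turn logits into probabilities and sort them in descending order.
    probs = torch.softmax(logits, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, descending=True)
    cum_probs = torch.cumsum(sorted_probs, dim=-1)
    # Drop every token that lies after the point where the cumulative
    # probability first exceeds top_p, keeping the smallest such set.
    sorted_probs[cum_probs - sorted_probs > top_p] = 0.0
    sorted_probs /= sorted_probs.sum()
    # Sample from the renormalized nucleus and map back to vocabulary ids.
    choice = torch.multinomial(sorted_probs, num_samples=1)
    return sorted_idx[choice]

# Example: with a small top_p only the most likely token(s) remain eligible.
logits = torch.tensor([4.0, 2.0, 1.0, -1.0])
next_token = sample_top_p(logits, top_p=0.5)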

@rasbt (Collaborator) left a comment:

This looks great, thanks a lot! Only a few minor points from my side below:

Review threads (all resolved) on: litgpt/chat/base.py, litgpt/generate/adapter.py, litgpt/generate/adapter_v2.py, litgpt/generate/base.py, litgpt/generate/sequentially.py, litgpt/generate/tp.py, tests/test_generate.py

@belerico requested review from rasbt and carmocca on April 25, 2024 23:20
@belerico requested a review from rasbt on April 26, 2024 16:52
@belerico requested a review from rasbt on April 27, 2024 16:12
@belerico requested a review from carmocca on April 29, 2024 14:45
@rasbt (Collaborator) commented on May 2, 2024:

Thanks for all the updates and fixes. It looks all great to me now. The only thing is perhaps adding one more unit test, but I can take care of that to make it easier. [done]

@rasbt merged commit d39b26a into Lightning-AI:main on May 3, 2024
9 checks passed
@belerico (Contributor, Author) commented on May 3, 2024:

> Thanks for all the updates and fixes. It looks all great to me now. The only thing is perhaps adding one more unit test, but I can take care of that to make it easier. [done]

Thank you @rasbt: I had missed your comment.

@rasbt (Collaborator) commented on May 3, 2024:

No worries at all, I also thought it was probably quicker to just add instead of explain 😅

Comment on lines +115 to +127
def test_generate_different_results_with_different_top_p():
config = Config(block_size=128, vocab_size=16, n_layer=1, n_head=4, n_embd=8)
model = GPT(config)
model.max_seq_length = 50
model.set_kv_cache(batch_size=1)

torch.manual_seed(123)
input_idx = torch.randint(10, size=(1,))

output1 = generate.generate(model, input_idx, 20, top_p=1.0)
output2 = generate.generate(model, input_idx, 20, top_p=0.1)

assert not torch.equal(output1, output2)
A Contributor commented:

This test is not useful because it will also pass if you set the same top_p. That's because multinomial advances the RNG state.

For it to achieve the intended result, you need to seed before each call:

    torch.manual_seed(123)
    input_idx = torch.randint(10, size=(1,))

    torch.manual_seed(123)
    output1 = generate.generate(model, input_idx, 20, top_p=1.0)
    torch.manual_seed(123)
    output2 = generate.generate(model, input_idx, 20, top_p=0.1)

cc @rasbt

@rasbt (Collaborator) replied:

arg, thanks!
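
To illustrate the RNG point from the review comment above, here is a small standalone sketch (not part of the PR) showing that each torch.multinomial call advances PyTorch's global RNG state, so consecutive generate calls can differ even with identical sampling settings unless the seed is reset before each one:

import torch

torch.manual_seed(123)
probs = torch.tensor([0.25, 0.25, 0.25, 0.25])
first = torch.multinomial(probs, num_samples=1)   # consumes random numbers
second = torch.multinomial(probs, num_samples=1)  # same distribution, advanced RNG state
# `first` and `second` will typically differ even though the sampling
# parameters are identical, which is why the original test could pass
# with the same top_p. Reseeding before each call removes that effect:
torch.manual_seed(123)
repeated = torch.multinomial(probs, num_samples=1)
assert torch.equal(first, repeated)  # same seed, same call order -> same draw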
