
Long generations are cut off in the webui #36

Open
Lesani opened this issue Apr 5, 2023 · 3 comments

Comments


Lesani commented Apr 5, 2023

With standard settings and both of the models I have (Alpaca 7B and 13B-gpt-x), longer generations are cut off in the webui.

[screenshot]

After a while the text stops appearing: the debug console shows only status messages and no more "polling" messages, but CPU usage stays high and the UI still shows the "stop generating" button.

Upon pressing that button a minute later, the console prints the much longer message (it was still generating), but that message never appears in the web interface.

[screenshot]
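
For what it's worth, the symptom (backend keeps generating while the UI stops updating) is what you would see if the client side simply stops polling. The sketch below is hypothetical and not this project's code; the names poll_generation, fetch_new_tokens, generation_finished, and MAX_POLL_SECONDS are made up purely to illustrate where such a cut-off could happen:

```python
import time

# Hypothetical sketch of a client-side polling loop; none of these names
# come from this project. It only shows how a client-side limit can drop
# text while the backend keeps generating.
MAX_POLL_SECONDS = 60  # assumed client-side cap, value chosen for illustration

def poll_generation(fetch_new_tokens, generation_finished):
    """Collect streamed text by polling until the backend reports it is done."""
    collected = []
    start = time.time()
    while not generation_finished():
        collected.append(fetch_new_tokens())
        if time.time() - start > MAX_POLL_SECONDS:
            # Polling stops here, but the backend keeps generating;
            # anything produced after this point never reaches the UI.
            break
        time.sleep(0.5)
    return "".join(collected)
```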

ViperX7 (Owner) commented Apr 5, 2023

Can you provide the entire transcript?

I suspect that this issue is somehow related to Windows.

Actually, I have pushed some changes to my fork of llama.cpp; I think they might fix the issue.


fblgit commented Apr 9, 2023

There is something weird... when I post something like a medium-sized text.

I immediately see this:

inp( #2) :  inp( #2) :  inp( #2) :  inp( #2) :  inp( #2) :  inp( #2) :  inp( #2)

Then the answer...

answer for the question

followed by a loop:

### Human: continue
### Assistant: Additionally, ...

I find this behaviour a bit concerning on many new models: the stream runs on without limit because of these Human/Assistant loops.
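
A common mitigation for these runaway Human/Assistant turns is a stop-string (reverse-prompt) check on the streamed output. Here is a minimal sketch under that assumption, using the "### Human:" marker seen above; generate_until_stop and the dummy token stream are hypothetical, not this project's API:

```python
def generate_until_stop(stream_tokens, stop_strings=("### Human:",)):
    """Accumulate streamed tokens and cut the output at the first stop string."""
    text = ""
    for token in stream_tokens:
        text += token
        for stop in stop_strings:
            idx = text.find(stop)
            if idx != -1:
                # Drop the fabricated "### Human:" turn and everything after it.
                return text[:idx]
    return text

# Dummy stream reproducing the looping behaviour described above.
tokens = ["answer for the question ", "### Hu", "man: continue ",
          "### Assistant: Additionally, ..."]
print(generate_until_stop(iter(tokens)))  # -> "answer for the question "
```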

ViperX7 (Owner) commented Apr 14, 2023

This should be fixed in the latest release. Please check and confirm.
