Guided choice not respected #612

Closed · Andrea-de-Varda opened this issue Sep 13, 2024 · 2 comments
Labels: bug (Something isn't working)

@Andrea-de-Varda commented Sep 13, 2024

Describe the bug
The guided_choice option in llmengine.Completion.create() is not respected in the API call. The response ignores the guided choices I specify, regardless of which model I use, and produces its usual unconstrained completion. I installed the package in a clean environment.

LLM Engine Version

  • LLM Engine Version: 0.0.0beta39

System Version

  • Python Version: 3.10.12
  • Operating System: Ubuntu 22.04.4 LTS

Timestamp and Request ID

  • timestamp:
  • request_id: ade60281-7feb-48ce-921b-d7d081e2a73d

Minimal Reproducible Example
Steps to reproduce the behavior:

  1. Install LLM Engine:

pip install scale-llm-engine

  2. Make the API call:

from llmengine import api_engine, Completion

api_engine.set_api_key("MASKED")

response = Completion.create(
    model="llama-2-7b",  # the issue persists with all the models I tried
    prompt="Hello, my favourite",
    max_new_tokens=10,
    temperature=0,
    guided_choice=["food", "thing", "car"],
)

print(response.json())

  3. See the error (the response text does not contain any of the options in the guided_choice list):

{"request_id": "ade60281-7feb-48ce-921b-d7d081e2a73d",
 "output": {"text": " time of the year is here again. I love",
            "num_prompt_tokens": null, "num_completion_tokens": 10, "tokens": null}}

Expected behavior
One of the strings from the guided_choice list ("food", "thing", or "car") should have been returned.
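
For concreteness, a check along these lines (reusing the response object and the JSON shape from the repro above; the .strip() is only my assumption about leading whitespace) should pass once guided_choice is honoured, and currently fails:

import json

# Parse the serialized response and verify the completion is one of the allowed options.
result = json.loads(response.json())
completion_text = result["output"]["text"].strip()
assert completion_text in ["food", "thing", "car"], completion_text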

Additional context
Additionally, when I import the package, I get the following message:

A newer version (0.0.0b39) of 'scale-llm-engine' is available. Please upgrade!
To upgrade, run: pip install --upgrade scale-llm-engine
Don't want to see this message? Set the environment variable 'LLM_ENGINE_DISABLE_VERSION_CHECK' to 'true'.

However, I am already using the latest version:

import llmengine
print(llmengine.__version__) # returns 0.0.0beta39
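
As a side note, the warning itself suggests a way to silence the message: setting the environment variable before the import. A sketch of that workaround (untested beyond what the message says; the underlying version mismatch still looks like a bug) would be:

import os

# Set the variable mentioned in the warning before llmengine is imported,
# since the version check appears to run at import time.
os.environ["LLM_ENGINE_DISABLE_VERSION_CHECK"] = "true"

import llmengine
print(llmengine.__version__)  # still reports 0.0.0beta39
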
@Andrea-de-Varda added the bug label Sep 13, 2024
@yixu34 assigned yunfeng-scale and unassigned dmchoiboi Sep 19, 2024
@yixu34 (Member) commented Sep 19, 2024

Hi @Andrea-de-Varda, thanks for filing this issue! We're triaging.

@yixu34 (Member) commented Sep 24, 2024

Hi @Andrea-de-Varda, please see #619. We do have this feature in the code, but we're sunsetting the free demo, which is out of date with what's in GitHub and with what we run internally.

@yixu34 closed this as completed Sep 24, 2024