Right now the Chrome implementation throws if temperature is outside the [0, maxTemperature] range, and similarly for top-K.
This might pose an interop problem, or a compat problem over time. Although ideally developers would check maxTemperature before providing a temperature value, maybe we should make it easier for them.
The proposal is to clamp to the nearest valid value instead of throwing. The developer could detect this clamping by checking the session's temperature (and topK).
This opens up the possibility of people writing code such as
`ai.languageModel.create({ temperature: Infinity })`
to get the max temperature intentionally. Probably that's fine.
I think it's fine to clamp. Is this something worth logging a console warning for, though? While `ai.languageModel.create({ temperature: Infinity })` is definitely intentional, someone trying `ai.languageModel.create({ temperature: 1.0 })` and then `ai.languageModel.create({ temperature: 1.5 })` (assuming a [0.0, 1.0] interval) may expect a change to happen, thinking the model accepts [0.0, 2.0] (as per #41 (comment)) for example.