Disable yet another flaky Ollama test point
It is unclear at this time why this test point is unreliable, but it just started failing in the GitHub CI, possibly following some Ollama update.

We are not explicitly promising this behaviour, and the change was not on our side.
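
The symptom can be checked independently of the MATLAB wrapper by calling Ollama's REST API directly. Below is a minimal sketch of such a probe, not part of the commit; it assumes a local Ollama server on the default port 11434 and a pulled "mistral" model, and the prompt string is illustrative:

    % Send the same greedy-sampling request twice; with top_k = 1 the
    % replies should match, and a mismatch reproduces the flaky behaviour.
    url = "http://localhost:11434/api/generate";
    body = struct("model","mistral", ...
        "prompt","Why is the sky blue?", ...
        "stream",false, ...
        "options",struct("top_k",1));
    opts = weboptions(MediaType="application/json",Timeout=120);
    r1 = webwrite(url,body,opts);
    r2 = webwrite(url,body,opts);
    isequal(r1.response,r2.response)   % expected true when top_k is honored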
ccreutzi committed Jul 19, 2024
1 parent c454452 commit fc1798c
Showing 1 changed file with 5 additions and 0 deletions.
tests/tollamaChat.m (5 additions, 0 deletions)
@@ -47,6 +47,11 @@ function doGenerateUsingSystemPrompt(testCase)
 end
 
 function extremeTopK(testCase)
+    %% This should work, and it does on some computers. On others, Ollama
+    %% receives the parameter, but either Ollama or llama.cpp fails to
+    %% honor it correctly.
+    testCase.assumeTrue(false,"disabled due to Ollama/llama.cpp not honoring parameter reliably");
+
     % setting top-k to k=1 leaves no random choice,
     % so we expect to get a fixed response.
     chat = ollamaChat("mistral",TopK=1);
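
For context on the mechanism: testCase.assumeTrue(false,diagnostic) is MATLAB's assumption filter, so this test point is now reported as Incomplete rather than Failed, and the rest of the suite keeps running in CI. A quick local check (a sketch, assuming it is run from the repository root):

    % Select just the disabled procedure and confirm it filters out.
    results = runtests("tests/tollamaChat.m",ProcedureName="extremeTopK");
    disp([results.Incomplete])   % true: filtered by the assumption, not failed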