Make StreamFun work with ollamaChat and azureChat #77

Workflow file for this run

name: Run MATLAB Tests on GitHub-Hosted Runner
on: [push]
jobs:
  test:
    name: Run MATLAB Tests and Generate Artifacts
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Install Ollama
        run: |
          curl -fsSL https://ollama.com/install.sh | sudo -E sh
      - name: Start serving
        run: |
          # Run in the background; there is no way to daemonise at the moment
          ollama serve &
          # A short pause is required before the HTTP port is opened
          sleep 5
          # This endpoint blocks until ready
          time curl -i http://localhost:11434
      - name: Pull mistral model
        run: |
          ollama pull mistral
      - name: Set up MATLAB
        uses: matlab-actions/setup-matlab@v2
        with:
          products: Text_Analytics_Toolbox
          cache: true
      - name: Run tests and generate artifacts
        env:
          OPENAI_KEY: ${{ secrets.OPENAI_KEY }}
          AZURE_OPENAI_DEPLOYMENT: ${{ secrets.AZURE_DEPLOYMENT }}
          AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_ENDPOINT }}
          AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_KEY }}
        uses: matlab-actions/run-tests@v2
        with:
          test-results-junit: test-results/results.xml
          code-coverage-cobertura: code-coverage/coverage.xml
          source-folder: .
      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          slug: matlab-deep-learning/llms-with-matlab
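
For context, the StreamFun behaviour this run exercises can be illustrated with a short MATLAB sketch. This is hypothetical usage, not taken from the repository's tests; it assumes the ollamaChat constructor accepts a StreamFun name-value argument, as the pull request title implies, and that the mistral model pulled above is being served on localhost:11434.

% Hypothetical example: stream tokens from the local Ollama server as they arrive.
% printToken is called once per streamed token; generate still returns the full text.
printToken = @(token) fprintf("%s", token);
chat = ollamaChat("mistral", StreamFun=printToken);
response = generate(chat, "Why is the sky blue?");

The azureChat tests appear to be configured through the AZURE_OPENAI_DEPLOYMENT, AZURE_OPENAI_ENDPOINT, and AZURE_OPENAI_API_KEY environment variables exported in the test step above rather than through hard-coded credentials.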