Commit

Update chat model examples
ThomasVitale committed Aug 28, 2024
1 parent 7b059f2 commit 272af70
Showing 39 changed files with 208 additions and 48 deletions.
2 changes: 0 additions & 2 deletions 01-chat-models/chat-models-mistral-ai/README.md
@@ -50,8 +50,6 @@ class ChatController {

The application relies on the Mistral AI API for providing LLMs.

### When using Mistral AI

First, make sure you have a [Mistral AI account](https://console.mistral.ai).
Then, define an environment variable with the Mistral AI API Key associated to your Mistral AI account as the value.

2 changes: 0 additions & 2 deletions 01-chat-models/chat-models-multiple-providers/README.md
@@ -10,8 +10,6 @@ This example shows how to use both OpenAI and Mistral AI in the same application

The application relies on OpenAI API and Mistral AI API for providing LLMs.

### When using OpenAI and Mistral AI

First, make sure you have an [OpenAI account](https://platform.openai.com/signup).
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

@@ -56,7 +56,7 @@ String chatWithOpenAiOptions(@RequestParam(defaultValue = "What did Gandalf say
return openAichatClient.prompt()
.user(question)
.options(OpenAiChatOptions.builder()
.withModel("gpt-4-turbo")
.withModel("gpt-4o-mini")
.withTemperature(1.0f)
.build())
.call()
@@ -47,7 +47,7 @@ String chatWithMistralAiOptions(@RequestParam(defaultValue = "What did Gandalf s
@GetMapping("/chat/openai-options")
String chatWithOpenAiOptions(@RequestParam(defaultValue = "What did Gandalf say to the Balrog?") String question) {
return openAiChatModel.call(new Prompt(question, OpenAiChatOptions.builder()
.withModel("gpt-4-turbo")
.withModel("gpt-4o-mini")
.withTemperature(1.0f)
.build()))
.getResult().getOutput().getContent();
4 changes: 1 addition & 3 deletions 01-chat-models/chat-models-openai/README.md
@@ -50,8 +50,6 @@ class ChatController {

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an [OpenAI account](https://platform.openai.com/signup).
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

@@ -86,7 +84,7 @@ The next request is configured with a custom temperature value to obtain a more
http :8080/chat/generic-options question=="Why is a raven like a writing desk? Give a short answer." -b
```

The next request is configured with Open AI-specific customizations.
The next request is configured with OpenAI-specific customizations.

```shell
http :8080/chat/provider-options question=="What can you see beyond what you can see? Give a short answer." -b
@@ -44,9 +44,9 @@ String chatWithProviderOptions(@RequestParam(defaultValue = "What did Gandalf sa
return chatClient.prompt()
.user(question)
.options(OpenAiChatOptions.builder()
.withModel("gpt-4-turbo")
.withModel("gpt-4o-mini")
.withTemperature(0.9f)
.withUser("jon.snow")
.withFrequencyPenalty(1.3f)
.build())
.call()
.content();
@@ -39,9 +39,9 @@ String chatWithGenericOptions(@RequestParam(defaultValue = "What did Gandalf say
@GetMapping("/chat/provider-options")
String chatWithProviderOptions(@RequestParam(defaultValue = "What did Gandalf say to the Balrog?") String question) {
return chatModel.call(new Prompt(question, OpenAiChatOptions.builder()
.withModel("gpt-4-turbo")
.withModel("gpt-4o-mini")
.withTemperature(0.9f)
.withUser("jon.snow")
.withFrequencyPenalty(1.3f)
.build()))
.getResult().getOutput().getContent();
}
2 changes: 0 additions & 2 deletions 02-prompts/prompts-basics-openai/README.md
@@ -6,8 +6,6 @@ Prompting using simple text with LLMs via OpenAI.

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an OpenAI account.
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

2 changes: 0 additions & 2 deletions 02-prompts/prompts-messages-openai/README.md
@@ -6,8 +6,6 @@ Prompting using structured messages and roles with LLMs via OpenAI.

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an OpenAI account.
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

@@ -35,9 +35,9 @@ String chatWithUserMessageTemplate(MusicQuestion question) {
Consider only the musicians that play the {instrument} in that band.
""");
Map<String,Object> model = Map.of("instrument", question.instrument(), "genre", question.genre());
var userMessage = userPromptTemplate.createMessage(model);

var prompt = new Prompt(userMessage);
var prompt = userPromptTemplate.create(model);

var chatResponse = chatModel.call(prompt);
return chatResponse.getResult().getOutput().getContent();
}
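For readers following along, the change above collapses the two-step `createMessage(...)` + `new Prompt(...)` flow into a single `PromptTemplate.create(...)` call. Below is a minimal sketch of how the resulting method might look; the prompt text, class name, and plain string parameters are illustrative stand-ins (the repository's `MusicQuestion` record is not shown in this hunk).

```java
import java.util.Map;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.PromptTemplate;

// Sketch only: prompt text and class name are illustrative.
class PromptTemplateSketch {

    private final ChatModel chatModel;

    PromptTemplateSketch(ChatModel chatModel) {
        this.chatModel = chatModel;
    }

    String chatWithUserMessageTemplate(String genre, String instrument) {
        var userPromptTemplate = new PromptTemplate("""
                Tell me the name of one musician famous for playing in a {genre} band.
                Consider only the musicians that play the {instrument} in that band.
                """);
        Map<String, Object> model = Map.of("instrument", instrument, "genre", genre);

        // create(model) renders the template and wraps it in a Prompt in one step,
        // replacing the previous createMessage(model) + new Prompt(userMessage) pair.
        var prompt = userPromptTemplate.create(model);

        var chatResponse = chatModel.call(prompt);
        return chatResponse.getResult().getOutput().getContent();
    }
}
```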
2 changes: 0 additions & 2 deletions 02-prompts/prompts-templates-openai/README.md
@@ -6,8 +6,6 @@ Prompting using templates with LLMs via OpenAI.

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an OpenAI account.
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

@@ -35,9 +35,9 @@ String chatWithUserMessageTemplate(MusicQuestion question) {
Consider only the musicians that play the {instrument} in that band.
""");
Map<String,Object> model = Map.of("instrument", question.instrument(), "genre", question.genre());
var userMessage = userPromptTemplate.createMessage(model);

var prompt = new Prompt(userMessage);
var prompt = userPromptTemplate.create(model);

var chatResponse = chatModel.call(prompt);
return chatResponse.getResult().getOutput().getContent();
}
@@ -2,6 +2,7 @@

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.converter.ListOutputConverter;
import org.springframework.ai.converter.MapOutputConverter;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.core.convert.support.DefaultConversionService;
@@ -54,7 +55,7 @@ Map<String,Object> chatWithMapOutput(MusicQuestion question) {
.param("instrument", question.instrument())
)
.call()
.entity(new ParameterizedTypeReference<>() {});
.entity(new MapOutputConverter());
}

List<String> chatWithListOutput(MusicQuestion question) {
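The change above swaps the `ParameterizedTypeReference`-based call for an explicit `MapOutputConverter`. A minimal sketch of how the map and list variants might look with the `ChatClient` fluent API follows; the prompt text and class name are illustrative and not taken from the repository.

```java
import java.util.List;
import java.util.Map;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.converter.ListOutputConverter;
import org.springframework.ai.converter.MapOutputConverter;
import org.springframework.core.convert.support.DefaultConversionService;

// Sketch only: prompt text and class name are illustrative.
class StructuredOutputSketch {

    private final ChatClient chatClient;

    StructuredOutputSketch(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    Map<String, Object> chatWithMapOutput(String genre, String instrument) {
        return chatClient.prompt()
                .user(userSpec -> userSpec
                        .text("""
                                Tell me the name of one musician famous for playing the {instrument} in a {genre} band.
                                Return a map with the musician name as key and the band name as value.
                                """)
                        .param("genre", genre)
                        .param("instrument", instrument))
                .call()
                .entity(new MapOutputConverter()); // converts the reply into Map<String,Object>
    }

    List<String> chatWithListOutput(String genre, String instrument) {
        return chatClient.prompt()
                .user(userSpec -> userSpec
                        .text("Tell me the names of three musicians famous for playing the {instrument} in a {genre} band.")
                        .param("genre", genre)
                        .param("instrument", instrument))
                .call()
                .entity(new ListOutputConverter(new DefaultConversionService())); // comma-separated reply -> List<String>
    }
}
```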
2 changes: 0 additions & 2 deletions 03-structured-output/structured-output-openai/README.md
@@ -6,8 +6,6 @@ Converting the LLM output to structured Java objects via OpenAI.

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an OpenAI account.
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

@@ -2,9 +2,9 @@

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.converter.ListOutputConverter;
import org.springframework.ai.converter.MapOutputConverter;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.stereotype.Service;

@@ -55,7 +55,7 @@ Map<String,Object> chatWithMapOutput(MusicQuestion question) {
.param("instrument", question.instrument())
)
.call()
.entity(new ParameterizedTypeReference<>() {});
.entity(new MapOutputConverter());
}

List<String> chatWithListOutput(MusicQuestion question) {
@@ -80,6 +80,7 @@ ArtistInfoVariant chatWithJsonOutput(MusicQuestion question) {
""");
Map<String,Object> model = Map.of("instrument", question.instrument(), "genre", question.genre());
var prompt = userPromptTemplate.create(model, OpenAiChatOptions.builder()
.withModel("gpt-4o-2024-08-06")
.withResponseFormat(new OpenAiApi.ChatCompletionRequest.ResponseFormat(OpenAiApi.ChatCompletionRequest.ResponseFormat.Type.JSON_SCHEMA, outputConverter.getJsonSchema()))
.build());

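The new `withModel` and `withResponseFormat` options enable OpenAI's JSON-schema structured outputs. A minimal sketch of how the schema might be produced and the response converted back, assuming a `BeanOutputConverter` supplies `getJsonSchema()`; the record, prompt text, and class name below are illustrative, since the repository's `ArtistInfoVariant` type and converter setup are not shown in this hunk.

```java
import java.util.Map;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.PromptTemplate;
import org.springframework.ai.converter.BeanOutputConverter;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

// Sketch only: record, prompt text, and class name are illustrative.
class JsonSchemaOutputSketch {

    record ArtistInfo(String name, String band) {}

    private final ChatModel chatModel;

    JsonSchemaOutputSketch(ChatModel chatModel) {
        this.chatModel = chatModel;
    }

    ArtistInfo chatWithJsonOutput(String genre, String instrument) {
        var outputConverter = new BeanOutputConverter<>(ArtistInfo.class);

        var userPromptTemplate = new PromptTemplate("""
                Tell me the name of one musician famous for playing the {instrument} in a {genre} band.
                """);
        Map<String, Object> model = Map.of("instrument", instrument, "genre", genre);

        var prompt = userPromptTemplate.create(model, OpenAiChatOptions.builder()
                // JSON-schema response format requires a model that supports it, e.g. gpt-4o-2024-08-06.
                .withModel("gpt-4o-2024-08-06")
                .withResponseFormat(new OpenAiApi.ChatCompletionRequest.ResponseFormat(
                        OpenAiApi.ChatCompletionRequest.ResponseFormat.Type.JSON_SCHEMA,
                        outputConverter.getJsonSchema()))
                .build());

        var chatResponse = chatModel.call(prompt);
        // The converter parses the JSON reply into the target type.
        return outputConverter.convert(chatResponse.getResult().getOutput().getContent());
    }
}
```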
@@ -4,5 +4,5 @@ spring:
api-key: ${OPENAI_API_KEY}
chat:
options:
model: gpt-4o-mini
model: gpt-4o
temperature: 0.7
2 changes: 0 additions & 2 deletions 04-multimodality/multimodality-openai/README.md
@@ -6,8 +6,6 @@ Multimodality with LLMs via OpenAI.

The application relies on an OpenAI API for providing LLMs.

### When using OpenAI

First, make sure you have an [OpenAI account](https://platform.openai.com/signup).
Then, define an environment variable with the OpenAI API Key associated to your OpenAI account as the value.

8 changes: 6 additions & 2 deletions 05-function-calling/function-calling-mistral-ai/README.md
@@ -6,8 +6,6 @@ Function calling via Mistral AI.

The application relies on the Mistral AI API for providing LLMs.

### When using Mistral AI

First, make sure you have a [Mistral AI account](https://console.mistral.ai).
Then, define an environment variable with the Mistral AI API Key associated to your Mistral AI account as the value.

@@ -35,3 +33,9 @@ Try passing your custom prompt and check the result.
```shell
http :8080/chat/function authorName=="Philip Pullman" -b
```

Try again. This time, the function calling strategy is configured in the call at runtime.

```shell
http :8080/chat/function/explicit authorName=="Philip Pullman" -b
```
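For context, the `/chat/function/explicit` endpoint registers the function directly on the `ChatClient` call rather than as a bean. A condensed sketch of that pattern, mirroring the service method added in this commit; the `Author` and `Book` records and the lookup function below are illustrative stand-ins for the repository's `BookService`.

```java
import java.util.List;
import java.util.function.Function;

import org.springframework.ai.chat.client.ChatClient;

// Sketch only: records and the lookup function are illustrative stand-ins.
class RuntimeFunctionSketch {

    record Author(String name) {}
    record Book(String title, String author) {}

    private final ChatClient chatClient;
    private final Function<Author, List<Book>> booksByAuthor;

    RuntimeFunctionSketch(ChatClient.Builder chatClientBuilder, Function<Author, List<Book>> booksByAuthor) {
        this.chatClient = chatClientBuilder.build();
        this.booksByAuthor = booksByAuthor;
    }

    String getAvailableBooksBy(String authorName) {
        return chatClient.prompt()
                .user(userSpec -> userSpec
                        .text("What books written by {author} are available to read?")
                        .param("author", authorName))
                // Name, description, input type, and implementation are supplied per call
                // instead of being registered as an application-wide bean.
                .function("BooksByAuthor",
                        "Get the list of available books written by the given author",
                        Author.class,
                        booksByAuthor)
                .call()
                .content();
    }
}
```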
@@ -19,7 +19,7 @@ public class BookService {
books.put(5, new Book("The Silmarillion", "J.R.R. Tolkien"));
}

List<Book> getBooksByAuthor(Author author) {
public List<Book> getBooksByAuthor(Author author) {
return books.values().stream()
.filter(book -> author.name().equals(book.author()))
.toList();
@@ -21,4 +21,9 @@ String chat(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksBy(authorName);
}

@GetMapping("/chat/function/explicit")
String chatVariant(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksByWithExplicitFunction(authorName);
}

}
@@ -9,9 +9,11 @@
@Service
class ChatService {

private final BookService bookService;
private final ChatClient chatClient;

ChatService(ChatClient.Builder chatClientBuilder) {
ChatService(BookService bookService, ChatClient.Builder chatClientBuilder) {
this.bookService = bookService;
this.chatClient = chatClientBuilder.build();
}

@@ -27,4 +29,21 @@ String getAvailableBooksBy(String authorName) {
.content();
}

String getAvailableBooksByWithExplicitFunction(String authorName) {
var userPromptTemplate = "What books written by {author} are available to read?";
return chatClient.prompt()
.user(userSpec -> userSpec
.text(userPromptTemplate)
.param("author", authorName)
)
.function(
"BooksByAuthor",
"Get the list of available books written by the given author",
BookService.Author.class,
bookService::getBooksByAuthor
)
.call()
.content();
}

}
@@ -23,4 +23,9 @@ String chat(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksBy(authorName);
}

@GetMapping("/chat/function/explicit")
String chatVariant(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksByWithExplicitFunction(authorName);
}

}
@@ -1,10 +1,13 @@
package com.thomasvitale.ai.spring.model;

import com.thomasvitale.ai.spring.BookService;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.PromptTemplate;
import org.springframework.ai.mistralai.MistralAiChatOptions;
import org.springframework.ai.model.function.FunctionCallbackWrapper;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Map;
import java.util.Set;

@@ -14,9 +17,11 @@
@Service
class ChatModelService {

private final BookService bookService;
private final ChatModel chatModel;

ChatModelService(ChatModel chatModel) {
ChatModelService(BookService bookService, ChatModel chatModel) {
this.bookService = bookService;
this.chatModel = chatModel;
}

@@ -33,4 +38,24 @@ String getAvailableBooksBy(String authorName) {
return chatResponse.getResult().getOutput().getContent();
}

String getAvailableBooksByWithExplicitFunction(String authorName) {
var userPromptTemplate = new PromptTemplate("""
What books written by {author} are available to read?
""");
Map<String,Object> model = Map.of("author", authorName);
var prompt = userPromptTemplate.create(model, MistralAiChatOptions.builder()
.withFunctionCallbacks(List.of(
FunctionCallbackWrapper.builder(bookService::getBooksByAuthor)
.withDescription("Get the list of available books written by the given author")
.withName("BooksByAuthor")
.withInputType(BookService.Author.class)
.withResponseConverter(Object::toString)
.build()
))
.build());

var chatResponse = chatModel.call(prompt);
return chatResponse.getResult().getOutput().getContent();
}

}
@@ -19,7 +19,7 @@ class FunctionCallingMistralAiApplicationTests {
WebTestClient webTestClient;

@ParameterizedTest
@ValueSource(strings = {"/chat/function", "/model/chat/function"})
@ValueSource(strings = {"/chat/function", "/model/chat/function", "/chat/function/explicit", "/model/chat/function/explicit"})
void chat(String path) {
webTestClient
.get()
6 changes: 6 additions & 0 deletions 05-function-calling/function-calling-ollama/README.md
@@ -43,3 +43,9 @@ Try passing your custom prompt and check the result.
```shell
http :8080/chat/function authorName=="Philip Pullman" -b
```

Try again. This time, the function calling strategy is configured in the call at runtime.

```shell
http :8080/chat/function/explicit authorName=="Philip Pullman" -b
```
@@ -19,7 +19,7 @@ public class BookService {
books.put(5, new Book("The Silmarillion", "J.R.R. Tolkien"));
}

List<Book> getBooksByAuthor(Author author) {
public List<Book> getBooksByAuthor(Author author) {
return books.values().stream()
.filter(book -> author.name().equals(book.author()))
.toList();
@@ -21,4 +21,9 @@ String chat(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksBy(authorName);
}

@GetMapping("/chat/function/explicit")
String chatVariant(@RequestParam(defaultValue = "J.R.R. Tolkien") String authorName) {
return chatService.getAvailableBooksByWithExplicitFunction(authorName);
}

}