

Commit message gen: Use user input as starting point #4069

Closed
wants to merge 3 commits

Conversation

estib-vega
Contributor

Related to #4060

Flag to have the commit generation only improve or reword the given user input.

  • The flag is persisted just like the brevity and emoji settings
  • Add typed prompt parameters and directives for improved type guarding and dynamic meta-prompt generation
  • Add directive-specific meta-prompting for the Ollama client.

(Demo: improve-user-input)

Also:

  • Add a branded-types util (this might just as well be moved to typeguards.ts)
  • Gracefully handle network errors in the Ollama client (e.g. when the Ollama server is not running)
(Screenshot: 2024-06-11 at 10:00:33)
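The branded-types util mentioned above could be sketched roughly like this (a minimal sketch; the names `Brand`, `PromptTemplate`, and `asPromptTemplate` are illustrative assumptions, not the PR's actual helpers):

```typescript
// Branded (nominal) types: attach a phantom tag to a structural type so that
// plain strings can't be passed where a validated value is expected.
type Brand<T, B extends string> = T & { readonly __brand: B };

// Example: a string that has been vetted as a prompt template.
type PromptTemplate = Brand<string, "PromptTemplate">;

function asPromptTemplate(raw: string): PromptTemplate {
  // Runtime validation could go here; the brand itself exists only at
  // compile time and does not change the runtime value.
  return raw as PromptTemplate;
}
```

The brand is erased at runtime, so this costs nothing; it only stops the type checker from accepting an arbitrary `string` where a `PromptTemplate` is required.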

@estib-vega
Contributor Author

A bit more on the dynamic meta-prompt generation

To get the smaller LMs to only reword/extend the given user input, they needed to be provided with specific meta-prompts.
I tried simply appending those to the array of PromptMessages, but generation became noticeably slower and less accurate, at least in my limited testing.

That's why I came up with this approach:
depending on the directives given (generate from scratch or edit the user input), dynamically build the array of PromptMessages.

These changes do not interfere with the ability to customize the prompts in the application settings.
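The directive-driven prompt assembly could look roughly like this (a sketch under assumptions: `PromptMessage`, `Directive`, and `buildPrompt` are illustrative names, not the PR's actual API):

```typescript
type MessageRole = "system" | "user" | "assistant";

interface PromptMessage {
  role: MessageRole;
  content: string;
}

// Hypothetical directive union: generate from scratch vs. edit the user's draft.
type Directive = "generate" | "improveUserInput";

function buildPrompt(
  directive: Directive,
  diff: string,
  userInput?: string
): PromptMessage[] {
  const messages: PromptMessage[] = [
    { role: "system", content: "You write concise git commit messages." },
  ];
  if (directive === "improveUserInput" && userInput) {
    // Directive-specific meta-prompt: only appended in this mode, so the
    // "generate" path never pays for instructions it doesn't need.
    messages.push({
      role: "system",
      content: "Only reword or extend the user's draft; do not replace it.",
    });
    messages.push({ role: "user", content: `Draft: ${userInput}\nDiff: ${diff}` });
  } else {
    messages.push({ role: "user", content: `Diff: ${diff}` });
  }
  return messages;
}
```

Building the array per directive, rather than always sending every meta-prompt, keeps the context shorter for the smaller models, which matches the speed/accuracy observation above.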

@krlvi
Member

krlvi commented Jun 11, 2024

hey, thanks for this contribution! I didn't tag #4060 properly at the time - it's probably an interesting idea, and I'm curious to try it out.
I will take a proper look / merge tomorrow; we wanna first make a bugfix release with some performance things tonight

@estib-vega
Contributor Author

> hey, thanks for this contribution! I didn't tag #4060 properly at the time - it's probably an interesting idea, and I'm curious to try it out. I will take a proper look / merge tomorrow; we wanna first make a bugfix release with some performance things tonight

All good, no rush 😉

Type utility for type branding
If a network error is thrown while trying to access the Ollama LLMs, catch it and display a more actionable error.
Flag to have the commit generation only improve or reword the given user input.

- The flag is persisted just like the brevity and emoji settings
- Add typed prompt parameters and directives for improved type guarding and dynamic meta-prompt generation
- Add directive-specific meta-prompting for the Ollama client.
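The graceful network-error handling could look roughly like this (a hedged sketch; the helper name and message wording are assumptions — the relevant fact is that `fetch` rejects with a `TypeError` when the connection itself fails, e.g. when the Ollama server is not running):

```typescript
// Hypothetical helper: map a low-level fetch failure to an actionable error.
// (Name and wording are assumptions, not the PR's actual implementation.)
function toActionableError(err: unknown): Error {
  // fetch rejects with a TypeError on connection failure,
  // e.g. when the Ollama server is not running.
  if (err instanceof TypeError) {
    return new Error(
      "Could not reach the Ollama server. Is it running at localhost:11434?"
    );
  }
  return err instanceof Error ? err : new Error(String(err));
}

// Usage at the call site (localhost:11434 is Ollama's default port):
// try { await fetch("http://localhost:11434/api/generate", { ... }); }
// catch (err) { throw toActionableError(err); }
```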
@estib-vega
Contributor Author

I've made it a bit more strict about how it treats the starting message.
It will now attempt to leave it as is and only add text to it.

It still seems to be very probabilistic with Llama 3 8B. Not sure how it behaves with bigger models
(Demo: commit-improvement)

@estib-vega estib-vega closed this Oct 4, 2024