feat(container)!: Update image quay.io/go-skynet/local-ai to v2 - autoclosed #253

Closed
wants to merge 1 commit

Conversation

renovate[bot]
Contributor

@renovate renovate bot commented Dec 5, 2023

Mend Renovate

This PR contains the following updates:

Package: quay.io/go-skynet/local-ai
Update: major
Change: v1.18.0 -> v2.13.0
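
For context, the change itself is a single image tag bump. As a purely hypothetical illustration (the actual manifest in this repository may use a different format, e.g. Helm values, a docker-compose file, or a Kubernetes custom resource), the update amounts to something like:

# hypothetical container values file; field names are illustrative only
image:
  repository: quay.io/go-skynet/local-ai
  tag: v2.13.0  # previously v1.18.0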

Warning

Some dependencies could not be looked up. Check the Dependency Dashboard for more information.


Release Notes

mudler/LocalAI (quay.io/go-skynet/local-ai)

v2.13.0: 🖼️ v2.13.0 - Model gallery edition

Compare Source

Hello folks, Ettore here - I'm happy to announce the v2.13.0 LocalAI release is out, with many features!

Below is a short breakdown of the hottest features introduced in this release; there are many other improvements as well (especially from the community), so don't miss the changelog!

Check out the full changelog below for an overview of all the changes that went into this release (this one is quite packed).

🖼️ Model gallery

This is the first release with a model gallery in the WebUI: there is now a "Model" button in the WebUI that lands on a selection of models.

You can now choose among stablediffusion, llama3, tts, embeddings models, and more! The gallery is growing steadily and is kept up to date.

The models are simple YAML files hosted in this repository: https://github.com/mudler/LocalAI/tree/master/gallery - you can host your own repository with your own model index, or you can contribute to LocalAI.

If you want to contribute new models, you can do so by opening a PR against the gallery directory: https://github.com/mudler/LocalAI/tree/master/gallery.
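
As an illustration of the format only, here is a minimal sketch of what such a model config file can look like; the name, backend, and model file below are placeholders, and the real, complete entries live in the gallery repository:

# hypothetical model config; values are placeholders
name: my-llama3-model
backend: llama-cpp        # assumed backend name; check existing gallery files for exact values
parameters:
  model: my-model-file.gguf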

Rerankers

I'm excited to introduce a new backend for rerankers. LocalAI now implements the Jina API (https://jina.ai/reranker/#apiform) as a compatibility layer, so you can take existing Jina clients and point them at the LocalAI address. Under the hood, it uses https://github.com/AnswerDotAI/rerankers.

You can test this by using the container images that ship with Python (this does NOT work with the core images) and a model config file like the one below, or by installing cross-encoder from the gallery in the UI:

name: jina-reranker-v1-base-en
backend: rerankers
parameters:
  model: cross-encoder

and test it with:

    curl http://localhost:8080/v1/rerank \
      -H "Content-Type: application/json" \
      -d '{
      "model": "jina-reranker-v1-base-en",
      "query": "Organic skincare products for sensitive skin",
      "documents": [
        "Eco-friendly kitchenware for modern homes",
        "Biodegradable cleaning supplies for eco-conscious consumers",
        "Organic cotton baby clothes for sensitive skin",
        "Natural organic skincare range for sensitive skin",
        "Tech gadgets for smart homes: 2024 edition",
        "Sustainable gardening tools and compost solutions",
        "Sensitive skin-friendly facial cleansers and toners",
        "Organic food wraps and storage solutions",
        "All-natural pet food for dogs with allergies",
        "Yoga mats made from recycled materials"
      ],
      "top_n": 3
    }'

Parler-tts

There is a new backend available for TTS: parler-tts (https://github.com/huggingface/parler-tts). The model can be installed and configured directly from the gallery.
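
As a rough sketch, assuming a parler-tts voice has been installed from the gallery, a TTS request against LocalAI's /tts endpoint could look like the following; the model name is a placeholder for whatever name the gallery entry registers, and the exact fields may differ:

    curl http://localhost:8080/tts \
      -H "Content-Type: application/json" \
      -d '{
      "model": "parler-tts-placeholder",
      "input": "Hello from LocalAI!"
    }' --output out.wav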

🎈 Lots of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced the performance and stability of LocalAI across various modules. From backend optimizations to front-end adjustments, every tweak helps make LocalAI smoother and more robust.

📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @​LocalAI_OSS and @​mudler_it or joining our sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!

What's Changed

Bug fixes 🐛
Exciting New Features 🎉
🧠 Models
📖 Documentation and examples
👒 Dependencies
Other Changes

New Contributors

Full Changelog: mudler/LocalAI@v2.12.4...v2.13.0

v2.12.4

Compare Source

Patch release to include https://github.com/mudler/LocalAI/pull/1985

v2.12.3

Compare Source

I'm happy to announce the v2.12.3 LocalAI release is out!

🌠 Landing page and Swagger

Ever wondered what to do after LocalAI is up and running? Integration with a simple web interface has been started, and you can now see a landing page when hitting the LocalAI front page.

You can also now use Swagger to try out the API calls directly.

🌈 AIO images changes

The default model for CPU images is now https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF, pre-configured for functions and tools API support!
If you are an Intel-GPU owner, the Intel profile for AIO images is now available too!
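
Since the AIO CPU image ships with this model pre-configured, a quick smoke test against the OpenAI-compatible chat endpoint might look like the sketch below; the model name shown is an assumption and depends on how the image registers it, so adjust it to whatever /v1/models reports:

    curl http://localhost:8080/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{
      "model": "hermes-2-pro-mistral",
      "messages": [{"role": "user", "content": "Hello!"}]
    }'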

🚀 OpenVINO and transformers enhancements

There is now support for OpenVINO, and transformers gained token streaming support thanks to @fakezeta!

To try OpenVINO, you can use the example available in the documentation: https://localai.io/features/text-generation/#examples

🎈 Lots of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced several areas:

  • The build time of LocalAI was sped up significantly, thanks to @cryptk for the efforts in enhancing the build system
  • @thiner worked hard to get Vision support for AutoGPTQ
  • ... and much more! See below for a full list, and be sure to star LocalAI and give it a try!

📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @​LocalAI_OSS and @​mudler_it or joining our sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!

What's Changed

Bug fixes 🐛
Exciting New Features 🎉
📖 Documentation and examples
👒 Dependencies
Other Changes

New Contributors

Full Changelog: mudler/LocalAI@v2.11.0...v2.12.3

v2.12.1

Compare Source

I'm happy to announce the v2.12.1 LocalAI release is out!

🌠 Landing page and Swagger

Ever wondered what to do after LocalAI is up and running? Integration with a simple web interface has been started, and you can now see a landing page when hitting the LocalAI front page.

You can also now use Swagger to try out the API calls directly.

🌈 AIO images changes

The default model for CPU images is now https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF, pre-configured for functions and tools API support!
If you are an Intel-GPU owner, the Intel profile for AIO images is now available too!

🚀 OpenVINO and transformers enhancements

There is now support for OpenVINO, and transformers gained token streaming support thanks to @fakezeta!

To try OpenVINO, you can use the example available in the documentation: https://localai.io/features/text-generation/#examples

🎈 Lots of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced several areas:

  • The build time of LocalAI was sped up significantly, thanks to @cryptk for the efforts in enhancing the build system
  • @thiner worked hard to get Vision support for AutoGPTQ
  • ... and much more! See below for a full list, and be sure to star LocalAI and give it a try!

📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @​LocalAI_OSS and @​mudler_it or joining our sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!

What's Changed

Bug fixes 🐛
Exciting New Features 🎉
📖 Documentation and examples
👒 Dependencies
Other Changes

New Contributors

Full Changelog: mudler/LocalAI@v2.11.0...v2.12.1

v2.12.0

Compare Source

I'm happy to announce the v2.12.0 LocalAI release is out!

🌠 Landing page and Swagger

Ever wondered what to do after LocalAI is up and running? Integration with a simple web interface has been started, and you can now see a landing page when hitting the LocalAI front page.

You can also now use Swagger to try out the API calls directly.

🌈 AIO images changes

The default model for CPU images is now https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF, pre-configured for functions and tools API support!
If you are an Intel-GPU owner, the Intel profile for AIO images is now available too!

🚀 OpenVINO and transformers enhancements

There is now support for OpenVINO, and transformers gained token streaming support thanks to @fakezeta!

To try OpenVINO, you can use the example available in the documentation: https://localai.io/features/text-generation/#examples

🎈 Lots of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced several areas:

  • The build time of LocalAI was sped up significantly, thanks to @cryptk for the efforts in enhancing the build system
  • @thiner worked hard to get Vision support for AutoGPTQ
  • ... and much more! See below for a full list, and be sure to star LocalAI and give it a try!

📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @​LocalAI_OSS and @​mudler_it or joining our sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!

What's Changed

Bug fixes 🐛
Exciting New Features 🎉

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Mend Renovate. View repository job log here.

@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 2 times, most recently from 63b4fa6 to d65e090 Compare December 21, 2023 21:54
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 5 times, most recently from 5f27127 to 55cc1d8 Compare December 30, 2023 18:23
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 4 times, most recently from 046aad3 to ae9c7de Compare January 9, 2024 11:35
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 2 times, most recently from 53b96e6 to 94d4b80 Compare January 23, 2024 21:12
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch from 94d4b80 to f56774c Compare January 29, 2024 15:04
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 3 times, most recently from 8ae2e2a to 209d2ed Compare February 16, 2024 03:27
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch from 209d2ed to afbe18e Compare February 24, 2024 15:59
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 2 times, most recently from e33846b to 37c7cfb Compare March 18, 2024 23:41
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch from 37c7cfb to 4562584 Compare March 26, 2024 19:19
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch 4 times, most recently from 780c932 to 4e24149 Compare April 11, 2024 16:59
@renovate renovate bot force-pushed the renovate/quay.io-go-skynet-local-ai-2.x branch from 4e24149 to 6b94d21 Compare April 26, 2024 18:53
@renovate renovate bot changed the title feat(container)!: Update image quay.io/go-skynet/local-ai to v2 feat(container)!: Update image quay.io/go-skynet/local-ai to v2 - autoclosed Apr 28, 2024
@renovate renovate bot closed this Apr 28, 2024
@renovate renovate bot deleted the renovate/quay.io-go-skynet-local-ai-2.x branch April 28, 2024 18:16