Commit 8414832

Update Text-generation tasks page.

Vaibhavs10 committed Nov 22, 2023
1 parent 921533e commit 8414832
Showing 2 changed files with 4 additions and 4 deletions.
6 changes: 3 additions & 3 deletions packages/tasks/src/text-generation/about.md
@@ -26,19 +26,19 @@ A popular variant of Text Generation models predicts the next word given a bunch
- Continue a story given the first sentences.
- Provided a code description, generate the code.

- The most popular models for this task are GPT-based models or [Llama series](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf). These models are trained on data that has no labels, so you just need plain text to train your own model. You can train text generation models to generate a wide variety of documents, from code to stories.
+ The most popular models for this task are GPT-based models, [Mistral](https://huggingface.co/mistralai/Mistral-7B-v0.1) or [Llama series](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf). These models are trained on data that has no labels, so you just need plain text to train your own model. You can train text generation models to generate a wide variety of documents, from code to stories.

### Text-to-Text Generation Models

- These models are trained to learn the mapping between a pair of texts (e.g. translation from one language to another). The most popular variants of these models are [FLAN-T5](https://huggingface.co/google/flan-t5-xxl), and [BART](https://huggingface.co/docs/transformers/model_doc/bart). Text-to-Text models are trained with multi-tasking capabilities; they can accomplish a wide range of tasks, including summarization, translation, and text classification.
+ These models are trained to learn the mapping between a pair of texts (e.g. translation from one language to another). The most popular variants of these models are [NLLB](https://huggingface.co/facebook/nllb-200-distilled-600M), [FLAN-T5](https://huggingface.co/google/flan-t5-xxl), and [BART](https://huggingface.co/docs/transformers/model_doc/bart). Text-to-Text models are trained with multi-tasking capabilities; they can accomplish a wide range of tasks, including summarization, translation, and text classification.
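Text-to-Text models are served through the same 🤗 Transformers `pipeline` API as plain text generation, just under the `text2text-generation` task. A minimal sketch, assuming the `google/flan-t5-small` checkpoint (an illustrative choice made here for its small size, not one prescribed by this page):

```python
from transformers import pipeline

# Sketch only: google/flan-t5-small is an assumed example checkpoint;
# any text2text checkpoint on the Hub should work the same way.
generator = pipeline("text2text-generation", model="google/flan-t5-small")

# FLAN-T5 was trained with multi-tasking instructions, so the same model
# can translate, summarize, etc. depending on the prompt prefix.
out = generator("translate English to German: How old are you?")
print(out[0]["generated_text"])
```

The only difference from the `text-generation` pipeline below is that the input and output are a distinct pair of texts rather than a single text being continued.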

## Inference

You can use the 🤗 Transformers library `text-generation` pipeline to do inference with Text Generation models. It takes an incomplete text and returns multiple outputs with which the text can be completed.

```python
  from transformers import pipeline
- generator = pipeline('text-generation', model = 'gpt2')
+ generator = pipeline('text-generation', model = 'HuggingFaceH4/zephyr-7b-beta')
  generator("Hello, I'm a language model", max_length = 30, num_return_sequences=3)
  ## [{'generated_text': "Hello, I'm a language modeler. So while writing this, when I went out to meet my wife or come home she told me that my"},
  ## {'generated_text': "Hello, I'm a language modeler. I write and maintain software in Python. I love to code, and that includes coding things that require writing"}, ...
2 changes: 1 addition & 1 deletion packages/tasks/src/text-generation/data.ts
@@ -119,7 +119,7 @@ const taskData: TaskDataCustom = {
],
summary:
"Generating text is the task of producing new text. These models can, for example, fill in incomplete text or paraphrase.",
- widgetModels: ["tiiuae/falcon-7b-instruct"],
+ widgetModels: ["HuggingFaceH4/zephyr-7b-beta"],
youtubeId: "Vpjb1lu0MDk",
};

