Commit be7e365
1 Parent(s): cb32f77
Update README.md
README.md CHANGED
@@ -55,8 +55,8 @@ You can use this model directly with a pipeline for text generation.
 >>> from transformers import pipeline

 >>> generator = pipeline('text-generation', model="facebook/opt-350m")
->>> generator("
-[{'generated_text': "
+>>> generator("What are we having for dinner?")
+[{'generated_text': "What are we having for dinner?\nI'm having a steak and a salad.\nI'm"}]
 ```

 By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
@@ -66,8 +66,8 @@ By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.

 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True)
->>> generator("
-[{'generated_text': "
+>>> generator("What are we having for dinner?")
+[{'generated_text': "What are we having for dinner?\n\nWith spring fast approaching, it’s only appropriate"}]
 ```

 ### Limitations and bias
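Read as a standalone script rather than a Python REPL transcript, the greedy-decoding example added in the first hunk amounts to the sketch below; the printed continuation is illustrative and will vary with the model and transformers version.

```python
from transformers import pipeline

# Greedy (deterministic) decoding is the text-generation pipeline's default behaviour.
generator = pipeline("text-generation", model="facebook/opt-350m")
print(generator("What are we having for dinner?"))
# e.g. [{'generated_text': "What are we having for dinner?\nI'm having a steak and a salad.\nI'm"}]
```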
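The second hunk's sampling example relies on a `set_seed` call whose import falls outside the hunk's context window; a self-contained sketch, assuming `set_seed` is imported from `transformers` (which exposes it), would look roughly like this.

```python
from transformers import pipeline, set_seed

# Seed the RNGs so the sampled continuation is repeatable, then enable top-k sampling
# by passing do_sample=True (generation is deterministic by default).
set_seed(32)
generator = pipeline("text-generation", model="facebook/opt-350m", do_sample=True)
print(generator("What are we having for dinner?"))
# e.g. [{'generated_text': "What are we having for dinner?\n\nWith spring fast approaching, it’s only appropriate"}]
```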