RaushanTurganbay (HF Staff) committed
Commit 81a21b8 · verified · 1 Parent(s): 91abc33

Update pipeline example

Files changed (1):
  1. README.md (+24 −1)
README.md CHANGED
@@ -34,7 +34,30 @@ other versions on a task that interests you.
 
 ### How to use
 
-You can load and use the model like following:
+To run the model with the `pipeline`, see the below example:
+
+```python
+from transformers import pipeline
+
+pipe = pipeline("image-text-to-text", model="llava-hf/llama3-llava-next-8b-hf")
+messages = [
+    {
+        "role": "user",
+        "content": [
+            {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"},
+            {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
+        ],
+    },
+]
+
+out = pipe(text=messages, max_new_tokens=20)
+print(out)
+>>> [{'input_text': [{'role': 'user', 'content': [{'type': 'image', 'url': 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg'}, {'type': 'text', 'text': 'What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud'}]}], 'generated_text': 'Lava'}]
+```
+
+
+You can also load and use the model like following:
+
 ```python
 from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
 import torch
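
The diff context cuts the second example off after its imports. For reference, here is a minimal sketch of how the direct-loading usage for this checkpoint typically continues, assuming the standard `LlavaNextProcessor` / `LlavaNextForConditionalGeneration` API in `transformers`; the image URL, question, and generation settings are illustrative and are not part of this commit.

```python
# Hypothetical continuation of the truncated example above; not part of this commit.
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
import torch
from PIL import Image
import requests

model_id = "llava-hf/llama3-llava-next-8b-hf"
processor = LlavaNextProcessor.from_pretrained(model_id)
model = LlavaNextForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Illustrative input; the URL is reused from the pipeline example above.
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"
image = Image.open(requests.get(url, stream=True).raw)

conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
        ],
    },
]

# Build the prompt with the model's chat template, preprocess, and generate.
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=20)
print(processor.decode(output[0], skip_special_tokens=True))
```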