Mungert committed on
Commit 007d802 · verified · 1 Parent(s): 6870074

Update README.md

Files changed (1)
  1. README.md +22 -12
README.md CHANGED
@@ -24,38 +24,48 @@ Build llama.cpp as usual : https://github.com/ggml-org/llama.cpp#building-the-pr

https://huggingface.co/Mungert/gemma-3-4b-it-gguf/tree/main
Choose a gguf file without the mmproj in the name
+ example gguf file : https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/google_gemma-3-4b-it-q4_k_l.gguf

4. **Download the Gemma 3 mmproj file**

https://huggingface.co/Mungert/gemma-3-4b-it-gguf/tree/main
Choose a file with mmproj in the name
+ example mmproj file : https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/google_gemma-3-4b-it-mmproj-bf16.gguf

5. Copy images to the same folder as the gguf files or alter paths appropriately. In the example below the gguf files, images and llama-gemma-cli are in the same folder.

+ get an example image: image https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/car-1.jpg
+
5. **Run the CLI Tool**:
```bash
- llama-gemma3-cli -m google_gemma-3-4b-it-q8.gguf --mmproj google_gemma-3-4b-it-mmproj-q8.gguf
+ llama-gemma3-cli -m google_gemma-3-4b-it-q4_k_l.gguf --mmproj google_gemma-3-4b-it-mmproj-bf16.gguf
```

-
+ ```
Running in chat mode, available commands:
   /image <path> load an image
   /clear clear the chat history
   /quit or /exit exit the program
-
- ```
- > hi
- Hello! How's it going today?

- Is there something specific on your mind, or were you simply saying hi? 😊
+ > /image car-1.jpg
+ Encoding image car-1.jpg
+ Image encoded in 46305 ms
+ Image decoded in 19302 ms
+
+ > what is the image of
+ Here's a breakdown of what's in the image:
+
+ **Subject:** The primary subject is a black Porsche Panamera Turbo driving on a highway.
+
+ **Details:**

- I’m here to chat, answer questions, help with creative tasks, or just listen whatever you need!
+ * **Car:** It's a sleek, modern Porsche Panamera Turbo, identifiable by its distinctive rear design, the "PORSCHE" lettering, and the "Panamera Turbo" badge. The license plate reads "CVC-911".
+ * **Setting:** The car is on a multi-lane highway, with a blurred background of trees, a distant building, and a cloudy sky. The lighting suggests it's either dusk or dawn.
+ * **Motion:** The image captures the car in motion, with a slight motion blur to convey speed.

- > /image ./bliss.png
- Encoding image ./bliss.png
+ **Overall Impression:** The image conveys a sense of speed, luxury, and power. It's a well-composed shot that highlights the car's design and performance.

- > what is that
- That's a beautiful image!
+ Do you want me to describe any specific aspect of the image in more detail, or perhaps analyze its composition?
```
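For reference, a minimal sketch of the end-to-end flow the updated README describes, assembled from the URLs and the command shown in the diff above. The use of wget, the ./ prefix, and the commented CMake build step are assumptions for illustration, not part of this commit; adjust tools and paths to your setup.

```bash
# Build llama.cpp first (standard CMake flow; see the link in the hunk header), e.g.:
#   cmake -B build && cmake --build build --config Release

# Model weights: a gguf file without "mmproj" in the name (example file from the README)
wget https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/google_gemma-3-4b-it-q4_k_l.gguf

# Vision projector: the gguf file with "mmproj" in the name
wget https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/google_gemma-3-4b-it-mmproj-bf16.gguf

# Example image used in the chat session shown in the diff
wget https://huggingface.co/Mungert/gemma-3-4b-it-gguf/resolve/main/car-1.jpg

# Run the CLI from the folder holding the gguf files and the image,
# then load the picture inside the session with: /image car-1.jpg
./llama-gemma3-cli -m google_gemma-3-4b-it-q4_k_l.gguf --mmproj google_gemma-3-4b-it-mmproj-bf16.gguf
```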