Formats: Safetensors · GGUF · llama

Usage example

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Tommi09/MedicalChatBot-7b-test")
tokenizer = AutoTokenizer.from_pretrained("Tommi09/MedicalChatBot-7b-test")

# "气血两虚, 有哪些症状?" — "Qi and blood deficiency: what are the symptoms?"
inputs = tokenizer("气血两虚, 有哪些症状?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
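The snippet above sends the raw question straight to the model. For a multi-turn conversation you would typically join the prior exchanges and the new question into a single prompt string before tokenizing. A minimal sketch (the `问:`/`答:` template here is an illustration only, not necessarily the format the model was trained on):

```python
def build_prompt(history, question):
    """Join prior (question, answer) turns and the new question into one prompt.

    history  -- list of (question, answer) string pairs from earlier turns
    question -- the new user question to append
    """
    lines = []
    for q, a in history:
        lines.append(f"问: {q}")  # "问" = question
        lines.append(f"答: {a}")  # "答" = answer
    lines.append(f"问: {question}")
    lines.append("答:")  # leave the answer slot open for the model to complete
    return "\n".join(lines)

# First turn: no history yet
prompt = build_prompt([], "气血两虚, 有哪些症状?")
print(prompt)
```

The resulting `prompt` can then be passed to `tokenizer(prompt, return_tensors="pt")` in place of the bare question.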
Model size: 6.91B params
Tensor type: FP16

Dataset used to train Tommi09/MedicalChatBot-7b-test

Spaces using Tommi09/MedicalChatBot-7b-test: 1