## Usage Example
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Tommi09/MedicalChatBot-7b-test")
tokenizer = AutoTokenizer.from_pretrained("Tommi09/MedicalChatBot-7b-test")

# Prompt: "气血两虚, 有哪些症状?" ("Qi and blood deficiency: what are the symptoms?")
inputs = tokenizer("气血两虚, 有哪些症状?", return_tensors="pt")

# Cap generation length so the call terminates promptly
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
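Note that for causal LMs, `generate()` returns the input tokens followed by the continuation, so the decoded string begins with the prompt itself. A minimal sketch of a helper that strips the echoed prompt from the decoded output (the helper name `strip_prompt` and the sample response string are illustrative, not part of this model card):

```python
def strip_prompt(prompt: str, decoded: str) -> str:
    # The decoded generate() output usually starts with the prompt text;
    # remove it (and any leading whitespace) to keep only the answer.
    if decoded.startswith(prompt):
        return decoded[len(prompt):].lstrip()
    return decoded

# Demonstration with a mock decoded string (no model download needed):
print(strip_prompt("气血两虚, 有哪些症状?",
                   "气血两虚, 有哪些症状? 常见症状包括乏力、面色苍白。"))
```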