Sifal committed · Commit 482a72f · verified · 1 Parent(s): 8c23fc8

fix example

Files changed (1)
  1. README.md +9 -5
README.md CHANGED

````diff
@@ -76,8 +76,7 @@ Clinical Mosaic was pre-trained on deidentified clinical notes from MIMIC-IV-NOTE
 Install the Hugging Face Transformers library and load the model as follows:
 
 ```python
-from transformers import AutoTokenizer, AutoModel, AutoModelForSequenceClassification, BertTokenizer, BertConfig
-import torch
+from transformers import AutoModelForSequenceClassification, BertTokenizer, BertConfig
 
 tokenizer = BertTokenizer.from_pretrained('bert-base-uncased') # MosaicBERT uses the standard BERT tokenizer
 config = BertConfig.from_pretrained('Sifal/ClinicalMosaic') # the config needs to be passed in
@@ -86,14 +85,19 @@ config = BertConfig.from_pretrained('Sifal/ClinicalMosaic') # the config needs t
 config.num_labels = 4
 config.hidden_size = 768
 
-ClassifierClincalMosaic = AutoModelForSequenceClassification.from_pretrained('Sifal/ClinicalMosaic', config=config, trust_remote_code=True)
+ClassifierClincalMosaic = AutoModelForSequenceClassification.from_pretrained(
+    'Sifal/ClinicalMosaic',
+    config=config,
+    torch_dtype='auto',
+    trust_remote_code=True,
+    device_map="auto"
+)
 
 # Example usage
-
 clinical_text = "..."
 
 inputs = tokenizer(clinical_text, return_tensors="pt")
-embeddings = ClassifierClincalMosaic(**inputs).logits
+logits = ClassifierClincalMosaic(**inputs).logits
 ```
 
 Further instructions and example scripts are provided in the model’s repository.
````
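For reference, the updated example runs end to end roughly as follows. This is a minimal sketch, not part of the committed README: the `model` variable stands in for the README's `ClassifierClincalMosaic`, the `torch.no_grad()` block and the `argmax` step are illustrative additions, and the clinical note is left as a placeholder.

```python
import torch
from transformers import AutoModelForSequenceClassification, BertTokenizer, BertConfig

# MosaicBERT uses the standard BERT tokenizer
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

# The config needs to be passed in so the classification head is sized correctly
config = BertConfig.from_pretrained('Sifal/ClinicalMosaic')
config.num_labels = 4
config.hidden_size = 768

# `model` stands in for the README's ClassifierClincalMosaic
model = AutoModelForSequenceClassification.from_pretrained(
    'Sifal/ClinicalMosaic',
    config=config,
    torch_dtype='auto',      # use the checkpoint's native dtype
    trust_remote_code=True,  # the architecture is defined by custom code in the repo
    device_map='auto',       # place weights on the available device(s)
)

clinical_text = "..."  # placeholder: replace with a deidentified clinical note
inputs = tokenizer(clinical_text, return_tensors="pt").to(model.device)

with torch.no_grad():                 # inference only, no gradients needed
    logits = model(**inputs).logits   # shape: (batch_size, num_labels)

predicted_class = logits.argmax(dim=-1).item()  # illustrative: index of the top-scoring class
print(predicted_class)
```

Note that `device_map="auto"` relies on the `accelerate` package being installed; omit it to load the model on CPU and move it manually.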