Sifal committed
Commit 8c23fc8 · verified · 1 Parent(s): 9671a15

add classification example

Files changed (1)
  1. README.md +17 -3
README.md CHANGED
@@ -76,10 +76,24 @@ Clinical Mosaic was pre-trained on deidentified clinical notes from MIMIC-IV-NOTE
 Install the Hugging Face Transformers library and load the model as follows:
 
 ```python
-from transformers import AutoTokenizer, AutoModel
+from transformers import AutoModelForSequenceClassification, BertTokenizer, BertConfig
+import torch
 
-tokenizer = AutoTokenizer.from_pretrained("path/to/clinical-mosaic")
-model = AutoModel.from_pretrained("path/to/clinical-mosaic")
+tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')  # MosaicBERT uses the standard BERT tokenizer
+config = BertConfig.from_pretrained('Sifal/ClinicalMosaic')  # the custom config must be passed in explicitly
+
+# Configure the classification head: four labels on top of the 768-dim encoder
+config.num_labels = 4
+config.hidden_size = 768
+
+ClassifierClinicalMosaic = AutoModelForSequenceClassification.from_pretrained('Sifal/ClinicalMosaic', config=config, trust_remote_code=True)
+
+# Example usage
+
+clinical_text = "..."
+inputs = tokenizer(clinical_text, return_tensors="pt")
+with torch.no_grad():
+    outputs = ClassifierClinicalMosaic(**inputs)  # outputs.logits holds the classification scores
 ```
 
 Further instructions and example scripts are provided in the model’s repository.
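
For reference, here is a minimal sketch of turning the classifier's output into a predicted class, reusing `ClassifierClinicalMosaic` and `inputs` from the snippet above. The softmax/argmax step is the standard pattern for sequence-classification heads; the label names are hypothetical placeholders, since the commit does not state what the four classes are.

```python
import torch

# Hypothetical placeholder names; the actual four classes depend on the
# downstream task this classification head is trained for.
label_names = ["class_0", "class_1", "class_2", "class_3"]

with torch.no_grad():
    logits = ClassifierClinicalMosaic(**inputs).logits  # shape: (batch_size, 4)

probs = torch.softmax(logits, dim=-1)  # normalize the raw scores to probabilities
predicted = label_names[int(probs.argmax(dim=-1))]  # highest-probability class
print(predicted)
```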