EhrBERT is a BioBERT-based model fine-tuned on 500K and 1M EHR notes, developed by the UMass BioNLP team. It is currently not available due to the risk of privacy leakage. If you are interested in a clinical version of BERT, we recommend ClinicalBERT (https://github.com/EmilyAlsentzer/clinicalBERT), which was trained on public MIMIC data.
- Download the Hugging Face implementation of BERT.
- Download EhrBERT and unzip it into `<bert_directory>`.
- Install the environment that Hugging Face requires.
- Taking classification as an example, you only need one line to load the model:

  ```python
  model = BertForSequenceClassification.from_pretrained("<bert_directory>")
  ```
- For more information, you can refer to these examples.
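As a sketch of how the loading step above fits into a full forward pass with the Hugging Face `transformers` library: since the EhrBERT checkpoint itself is not released, this example builds a small randomly initialised `BertForSequenceClassification` from a `BertConfig` instead of calling `from_pretrained`, so it runs without any downloaded weights. The config sizes and token ids are illustrative, not from the original model.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Tiny illustrative config; with the real checkpoint you would instead write:
# model = BertForSequenceClassification.from_pretrained("<bert_directory>")
config = BertConfig(
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=2,  # binary classification head
)
model = BertForSequenceClassification(config)
model.eval()

# Toy token ids standing in for a tokenized clinical note
input_ids = torch.tensor([[101, 2054, 2003, 102]])
with torch.no_grad():
    logits = model(input_ids).logits

print(logits.shape)  # torch.Size([1, 2])
```

The returned logits have one row per input sequence and one column per label; applying a softmax over the last dimension gives class probabilities.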