DeepClassifier is an easy-to-use, general-purpose text classification package built on PyTorch. You can install it with `pip install -U deepclassifier`.
If you want to know more about DeepClassifier, please see the documentation. Let's get started!🤩
If you find DeepClassifier useful, please star and fork it to motivate continued maintenance!🤩 It's my pleasure if DeepClassifier is helpful to you!🥰
Like other Python packages, DeepClassifier can be installed through pip: `pip install -U deepclassifier`.
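After installation, a quick sanity check is to import the package and the classes used later in this README (this only verifies the install; it trains nothing):

```python
# Verify the installation: these imports should succeed without errors.
import deepclassifier
from deepclassifier.models import BertTextCNN
from deepclassifier.trainers import Trainer
```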
Here is a list of models that have been integrated into DeepClassifier; more will be added in the future. Welcome to join us!🤩
- TextCNN: Convolutional Neural Networks for Sentence Classification, EMNLP 2014
- RCNN: Recurrent Convolutional Neural Networks for Text Classification, AAAI 2015
- DPCNN: Deep Pyramid Convolutional Neural Networks for Text Categorization, ACL 2017
- HAN: Hierarchical Attention Networks for Document Classification, NAACL 2016
- BERT: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, NAACL 2019
- BertTextCNN: BERT+TextCNN
- BertRCNN: BERT+RCNN
- BertDPCNN: BERT+DPCNN
- BertHAN: BERT+HAN
- ...
I will show you how to use DeepClassifier below.🥰 Click [here] to see the complete code.
You can define a model like this (taking the BertTextCNN model as an example):👇
```python
import torch.nn as nn
import torch.optim as optim
from deepclassifier.models import BertTextCNN

# parameters of the model
embedding_dim = 768  # if you use BERT, the default is 768
dropout_rate = 0.2
num_class = 2
bert_path = "/Users/codewithzichao/Desktop/bert-base-uncased/"

my_model = BertTextCNN(embedding_dim=embedding_dim,
                       dropout_rate=dropout_rate,
                       num_class=num_class,
                       bert_path=bert_path)

optimizer = optim.Adam(my_model.parameters())
loss_fn = nn.CrossEntropyLoss()
```
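The trainer below consumes ordinary PyTorch DataLoaders (`train_loader`, `dev_loader`, `test_loader`), which are not defined in the snippet above. Here is a minimal sketch of how such a loader might be built with the Hugging Face tokenizer; the `TextDataset` class, the `(input_ids, attention_mask, label)` item format, and the toy data are illustrative assumptions, not part of the DeepClassifier API, so adapt them to whatever format your model's forward pass expects.

```python
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import BertTokenizer

class TextDataset(Dataset):
    """Illustrative dataset; the exact item format DeepClassifier expects may differ."""
    def __init__(self, texts, labels, bert_path, max_length=128):
        self.tokenizer = BertTokenizer.from_pretrained(bert_path)
        self.texts = texts
        self.labels = labels
        self.max_length = max_length

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        encoded = self.tokenizer(self.texts[idx],
                                 padding="max_length",
                                 truncation=True,
                                 max_length=self.max_length,
                                 return_tensors="pt")
        input_ids = encoded["input_ids"].squeeze(0)
        attention_mask = encoded["attention_mask"].squeeze(0)
        label = torch.tensor(self.labels[idx], dtype=torch.long)
        return input_ids, attention_mask, label

# Hypothetical toy data; replace with your own corpus and splits.
train_texts, train_labels = ["a great movie", "a terrible movie"], [1, 0]
train_loader = DataLoader(TextDataset(train_texts, train_labels, bert_path),
                          batch_size=2, shuffle=True)
# dev_loader and test_loader would be built the same way from their own splits.
```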
After defining the model, you can train/test/predict with it like this:👇
```python
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter
from deepclassifier.trainers import Trainer

model_name = "berttextcnn"
save_path = "best.ckpt"
writer = SummaryWriter("logfile/1")
max_norm = 0.25
eval_step_interval = 20

my_trainer = Trainer(model_name=model_name, model=my_model,
                     train_loader=train_loader, dev_loader=dev_loader,
                     test_loader=test_loader, optimizer=optimizer,
                     loss_fn=loss_fn, save_path=save_path, epochs=1,
                     writer=writer, max_norm=max_norm,
                     eval_step_interval=eval_step_interval)

# training
my_trainer.train()
# print the best F1 value on the dev set
print(my_trainer.best_f1)

# testing
p, r, f1 = my_trainer.test()
print(p, r, f1)

# predicting
pred_data = DataLoader(pred_data, batch_size=1)
pred_label = my_trainer.predict(pred_data)
print(pred_label)
```
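After training, the best weights are written to `save_path`. Below is a minimal sketch of reloading them for standalone inference; it assumes the checkpoint is a plain state_dict saved with `torch.save()`, which may not match how Trainer actually serializes the model, so inspect the file and adapt as needed.

```python
import torch

# Assumption: the file at save_path holds a state_dict; adjust if Trainer saves
# a different structure.
my_model.load_state_dict(torch.load(save_path, map_location="cpu"))
my_model.eval()  # switch to inference mode before predicting
```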
If you have any questions about DeepClassifier, feel free to open an issue or submit a pull request! You are also welcome to contact me at [email protected].🥳
If DeepClassifier is helpful to your work, please cite it as follows:

```
@misc{zichao2020deepclassifier,
  author = {Zichao Li},
  title = {DeepClassifier: a user-friendly and flexible package of NLP-based text classification models},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub Repository},
  howpublished = {\url{https://github.com/codewithzichao/DeepClassifier}},
}
```