
Inst2Vec Model

Using HuggingFace Transformers to train a BERT model with dynamic masking for Assembly Language from scratch. We name it Inst2Vec because it is designed to generate vectors for assembly instructions.
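
In HuggingFace Transformers, dynamic masking is typically implemented with DataCollatorForLanguageModeling, which re-samples the [MASK] positions every time a batch is assembled, so each epoch sees a different masking of the same instruction (as in RoBERTa) instead of a fixed mask baked into the dataset. A minimal sketch of that behavior, assuming a trained assembly tokenizer saved at ./assembly_tokenizer (hypothetical path):

from transformers import BertTokenizerFast, DataCollatorForLanguageModeling

# ./assembly_tokenizer is a hypothetical path to a trained assembly tokenizer
tokenizer = BertTokenizerFast.from_pretrained("./assembly_tokenizer")

# mlm_probability=0.15 is the standard BERT masking rate; mask positions are
# re-drawn on every call, which is what makes the masking "dynamic"
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

batch = collator([tokenizer("mov eax , dword ptr [ ebp + 8 ]")])
print(batch["input_ids"])  # some token ids replaced by the [MASK] id
print(batch["labels"])     # -100 everywhere except the masked positions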

It is part of the model introduced in the ICONIP 2021 paper A Hierarchical Graph-based Neural Network for Malware Classification.

The preprocessing procedure can be found in process_data.
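
Concretely, assembly corpora are usually normalized before tokenization, for example by rewriting literal addresses and immediates to a placeholder so the vocabulary stays manageable. The rules below are an illustrative assumption, not the repository's actual process_data code:

import re

def normalize_instruction(inst: str) -> str:
    """Illustrative normalization; the real rules live in process_data."""
    inst = inst.lower()
    inst = re.sub(r"\b0x[0-9a-f]+\b", "IMM", inst)  # hex immediates/addresses
    inst = re.sub(r"\b\d+\b", "IMM", inst)          # decimal immediates
    return inst

print(normalize_instruction("MOV EAX, [EBP+0x8]"))  # -> mov eax, [ebp+IMM]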

You can simply run python train_my_tokenizer.py to obtain a tokenizer for assembly instructions.
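
A minimal sketch of what such a tokenizer-training script can look like with the tokenizers library, assuming a corpus with one preprocessed instruction per line at ./corpus/instructions.txt (hypothetical path; the actual train_my_tokenizer.py may differ):

from tokenizers import BertWordPieceTokenizer

tokenizer = BertWordPieceTokenizer(lowercase=True)
tokenizer.train(
    files=["./corpus/instructions.txt"],  # hypothetical corpus location
    vocab_size=30000,                     # illustrative vocabulary size
    special_tokens=["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"],
)
tokenizer.save_model("./assembly_tokenizer")  # directory reused above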

The script I use to train the Inst2Vec model is as follows:

python my_run_mlm_no_trainer.py \
    --per_device_train_batch_size 8192 \
    --per_device_eval_batch_size 16384 \
    --num_warmup_steps 4000 --output_dir ./ \
    --seed 1234 --preprocessing_num_workers 32 \
    --max_train_steps 150000 \
    --eval_every_steps 1000
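
Once training finishes, the checkpoint in --output_dir can be loaded to produce the instruction vectors the model is named for. A minimal sketch, assuming the checkpoint was saved to ./ as in the command above; mean-pooling the last hidden state is one common pooling choice, not necessarily the paper's:

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./")  # --output_dir from above
model = AutoModel.from_pretrained("./")
model.eval()

inputs = tokenizer("xor eax , eax", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)
vector = hidden.mean(dim=1).squeeze(0)          # one vector per instruction
print(vector.shape)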
