# Pet projects with transformer architectures for NLP tasks


## Implemented projects

  1. Transformer model implementation:
    • Implementation of a vanilla transformer from scratch: encoder and decoder classes, positional encoding, self-attention, multi-head attention, feed-forward networks, residual connections, and layer normalization.
    • Machine translation on the WMT 2014 English-German and English-French datasets.
    • Analysis of label smoothing in the transformer with KL-divergence loss.
    • Machine translation on the Tatoeba Russian-English dataset.
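The building blocks named in the first project can be illustrated in a few lines. This is a minimal NumPy sketch, not the project's actual code: the function names, the `eps` default, and the toy shapes are assumptions; the formulas follow "Attention Is All You Need".

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding:
    # PE(pos, 2i) = sin(pos / 10000^(2i/d_model)), PE(pos, 2i+1) = cos(...).
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model)[None, :]            # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

def scaled_dot_product_attention(q, k, v):
    # softmax(Q K^T / sqrt(d_k)) V — the core of self-attention.
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def label_smoothing(one_hot, eps=0.1):
    # Soften hard targets: the true class keeps 1 - eps, the rest share eps.
    # The KL-divergence loss is then computed against these smoothed targets.
    return one_hot * (1.0 - eps) + eps / one_hot.shape[-1]

x = np.random.randn(5, 8)                      # 5 tokens, d_model = 8
x = x + positional_encoding(5, 8)              # inject position information
out = scaled_dot_product_attention(x, x, x)    # self-attention: Q = K = V
```

Multi-head attention simply runs this computation in parallel over several projected subspaces and concatenates the results.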
  2. BERT analysis:
    • Working with Byte-Pair Encoding and WordPiece tokenizers.
    • Visualization of token relationships inside a BERT model using bertviz.
    • Classification of restaurant reviews with a fine-tuned ruBert-base model.
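To make the tokenizer bullet concrete, here is a sketch of BERT-style greedy longest-match-first WordPiece tokenization. It is a self-contained toy, not the project's code: the tiny vocabulary and function name are assumptions, and a real tokenizer (e.g. Hugging Face `tokenizers`) handles casing, punctuation, and a full learned vocabulary.

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    # Greedily take the longest vocabulary piece at each position;
    # continuation pieces carry the '##' prefix, as in BERT.
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:        # no piece matches: emit [UNK] for the word
            return [unk]
        tokens.append(piece)
        start = end
    return tokens

vocab = {"un", "##aff", "##able", "##b", "##le"}
print(wordpiece_tokenize("unaffable", vocab))  # ['un', '##aff', '##able']
```

Byte-Pair Encoding differs in how the vocabulary is learned (merging the most frequent symbol pairs), but segmentation at inference time is similar in spirit.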
  3. NER problem with BERT: