Releases: Aananda-giri/GPT2-Nepali

GPT2-Pretrain, Inference

14 Jan 15:59
  • Complete pretraining code
  • Obtained the pretrained model
  • Load the model from Hugging Face
  • Inference through a Hugging Face Space
  • Added the tokenizer along with the model code on Hugging Face
  • Code cleanup

GPT2: sebastian-gutenberg (500 MB chunk)

13 Nov 11:22
# For multiple chunks, the loop order should be changed so that each
# epoch iterates over all chunks:
for epoch in epochs:
    for chunk in chunks:
        ...

# instead of the current ordering, which runs all epochs on one chunk
# before moving to the next:
for chunk in chunks:
    for epoch in epochs:
        ...

# TODO: will do in the future.
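A minimal sketch of the intended ordering, using toy data; `train_on` and the chunk names are hypothetical stand-ins for the actual training step and the 500 MB dataset shards:

```python
chunks = ["chunk_0", "chunk_1", "chunk_2"]  # stand-ins for dataset shards
epochs = range(2)

schedule = []  # records the order in which (epoch, chunk) pairs are trained

def train_on(chunk, epoch):
    # Placeholder for the real training step on one chunk.
    schedule.append((epoch, chunk))

# Desired ordering: every epoch sees all chunks before any chunk repeats.
for epoch in epochs:
    for chunk in chunks:
        train_on(chunk, epoch)

print(schedule)
```

With this ordering, the model sees the full dataset once per epoch, rather than overfitting on one chunk for all epochs before ever seeing the next.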