
Commit

added the rest of the chapters/sections in chapter 1
lazarzivanovicc committed Aug 31, 2024
1 parent 501a2b0 commit b2c6fa9
Showing 11 changed files with 415 additions and 394 deletions.
38 changes: 19 additions & 19 deletions chapters/sr/_toctree.yml
@@ -7,25 +7,25 @@
  sections:
  - local: chapter1/1
    title: Uvod
- # - local: chapter1/2
- #   title: Natural Language Processing
- # - local: chapter1/3
- #   title: Transformers, what can they do?
- # - local: chapter1/4
- #   title: How do Transformers work?
- # - local: chapter1/5
- #   title: Encoder models
- # - local: chapter1/6
- #   title: Decoder models
- # - local: chapter1/7
- #   title: Sequence-to-sequence models
- # - local: chapter1/8
- #   title: Bias and limitations
- # - local: chapter1/9
- #   title: Summary
- # - local: chapter1/10
- #   title: End-of-chapter quiz
- #   quiz: 1
+ - local: chapter1/2
+   title: Obrada prirodnog jezika
+ - local: chapter1/3
+   title: Transformeri, šta mogu da urade?
+ - local: chapter1/4
+   title: Kako Transformeri rade?
+ - local: chapter1/5
+   title: Enkoder modeli
+ - local: chapter1/6
+   title: Dekoder modeli
+ - local: chapter1/7
+   title: Sequence-to-sequence modeli
+ - local: chapter1/8
+   title: Pristrastnost i limitacije
+ - local: chapter1/9
+   title: Rezime
+ - local: chapter1/10
+   title: End-of-chapter quiz
+   quiz: 1

# - title: 2. Using 🤗 Transformers
# sections:
8 changes: 4 additions & 4 deletions chapters/sr/chapter1/1.mdx
@@ -31,7 +31,7 @@ Nakon što završite ovaj kurs, preporučujemo da pogledate specijalizaciju Deep

## Ko smo mi?

- About the authors:
+ O autorima:

[**Abubakar Abid**](https://huggingface.co/abidlabs) završio je doktorat na Stanfordu iz primenjenog mašinskog učenja. Tokom svojih doktorskih studija, osnovao je [Gradio](https://github.com/gradio-app/gradio), open-source Python biblioteku koja je korišćena za izradu preko 600,000 demoa mašinskog učenja. Gradio je kasnije akviziran od strane Hugging Face, gde Abubakar sada radi kao vođa tima za mašinsko učenje.

@@ -98,6 +98,6 @@ Jupyter sveske koje sadrže sav kod iz kursa se nalaze u repozitorijumu [`huggin

Da li ste spremni? U ovom poglavlju naučićete:

- * How to use the `pipeline()` function to solve NLP tasks such as text generation and classification
- * About the Transformer architecture
- * How to distinguish between encoder, decoder, and encoder-decoder architectures and use cases
+ * Kako da koristite `pipeline()` funkciju da rešite NLP zadatke poput generisanja teksta i klasifikacije
+ * O Transformerskoj arhitekturi
+ * Kako da razlikujete između enkoder, dekoder i enkoder-dekoder arhitektura i slučajeva kada se koriste
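
The learning objectives added above reference the `pipeline()` function. As a quick illustration only — not part of the committed translation — here is a minimal sketch of that usage, assuming the 🤗 `transformers` package is installed and relying on whatever default checkpoints the library ships for each task:

```python
# Minimal sketch of the pipeline() usage referenced in the chapter objectives.
# Assumes `pip install transformers` plus a backend such as PyTorch.
from transformers import pipeline

# Text classification: the default sentiment-analysis model is downloaded on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("I've been waiting for a HuggingFace course my whole life."))

# Text generation: likewise uses the library's default generation model.
generator = pipeline("text-generation")
print(generator("In this course, we will teach you how to", max_length=30))
```

Running this downloads the default models on first use and returns plain Python lists of dictionaries, which is the behavior the translated chapter goes on to explain.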