Bump release #668

Merged
merged 795 commits into from
Jan 29, 2024
Commits
795 commits
b374b42
translate no.24
maybenotime Feb 19, 2023
aaee39f
review 06 cn translations
iCell Feb 19, 2023
f001423
review 07 cn translations
iCell Feb 19, 2023
bf69b15
Update 23_what-is-dynamic-padding.srt
nuass Feb 20, 2023
ca1ee22
Update 23_what-is-dynamic-padding.srt
nuass Feb 20, 2023
9ac39b0
Update 23_what-is-dynamic-padding.srt
nuass Feb 20, 2023
eaf6336
Update subtitles/zh-CN/23_what-is-dynamic-padding.srt
nuass Feb 21, 2023
593070d
Update subtitles/zh-CN/23_what-is-dynamic-padding.srt
nuass Feb 21, 2023
7731b66
add blank
maybenotime Feb 22, 2023
3b15333
Review No. 11, No. 12
bon-qi Feb 24, 2023
af22e13
Review No. 13
bon-qi Feb 24, 2023
73f5a98
Review No. 12
bon-qi Feb 24, 2023
33d4748
Review No. 14
bon-qi Feb 24, 2023
ace04ee
Merge pull request #512 from nuass/main
xianbaoqian Feb 27, 2023
d757b64
Merge pull request #508 from iCell/shawn/review-07
xianbaoqian Feb 27, 2023
dfcf449
Merge pull request #509 from maybenotime/my_translate
xianbaoqian Feb 27, 2023
df3e0cf
Merge pull request #506 from iCell/shawn/review-06
xianbaoqian Feb 27, 2023
e1593ee
Merge pull request #505 from iCell/shawn/review-05
xianbaoqian Feb 27, 2023
35b49aa
finished review
tyisme614 Feb 27, 2023
0077839
optimized translation
tyisme614 Feb 28, 2023
3d45b8b
optimized translation
tyisme614 Feb 28, 2023
5bf5512
docs(zh-cn): Reviewed No. 29 - Write your training loop in PyTorch
FYJNEVERFOLLOWS Mar 1, 2023
5e9a4dd
Review 15
bon-qi Mar 1, 2023
edcd017
Review 16
bon-qi Mar 2, 2023
8c09b21
Review 17
bon-qi Mar 2, 2023
7ff227a
Review 18
bon-qi Mar 2, 2023
adb4834
Review ch 72 translation
zhangchaosd Mar 5, 2023
b5202b6
Update 72 cn translation
zhangchaosd Mar 5, 2023
22ae117
To be reviewed No.42-No.54
bon-qi Mar 6, 2023
b7f0272
No.11 check-out
bon-qi Mar 6, 2023
91658cf
No.12 check-out
bon-qi Mar 6, 2023
ef56aa2
No. 13 14 check-out
bon-qi Mar 6, 2023
a37f421
No. 15 16 check-out
bon-qi Mar 6, 2023
c2bace6
No. 17 18 check-out
bon-qi Mar 6, 2023
97c8493
Add note for "token-*"
bon-qi Mar 6, 2023
c68f8d9
Reviewed No.8, 9, 10
bon-qi Mar 6, 2023
0c8cb97
Reviewed No.42
bon-qi Mar 7, 2023
f9678cc
Review No.43
bon-qi Mar 7, 2023
3818b03
finished review
tyisme614 Mar 8, 2023
0f69de8
optimized translation
tyisme614 Mar 8, 2023
6cb8ba2
finished review
tyisme614 Mar 9, 2023
e295e89
optimized translation
tyisme614 Mar 9, 2023
b1b5794
Review 44(need refine)
bon-qi Mar 10, 2023
7849039
Review 45(need refine)
bon-qi Mar 10, 2023
8589c1a
Review No. 46 (need refine)
bon-qi Mar 10, 2023
e7adb34
Review No.47
bon-qi Mar 10, 2023
c8045f7
Review No.46
bon-qi Mar 10, 2023
373fe71
Review No.45
bon-qi Mar 10, 2023
ccfb507
Review No.44
bon-qi Mar 10, 2023
acab389
Review No.48
bon-qi Mar 10, 2023
3c72012
Review No.49
bon-qi Mar 10, 2023
254cf77
Review No.50
bon-qi Mar 10, 2023
2d03f08
Modify Ko chapter2 8.mdx (#465)
nsbg Mar 10, 2023
146bdea
Fixed typo (#471)
tkburis Mar 10, 2023
3842502
fixed subtitle errors (#474)
tyisme614 Mar 10, 2023
96aa135
Fixed a typo (#475)
gdacciaro Mar 10, 2023
ca81c80
Update 3.mdx (#526)
carlos-aguayo Mar 10, 2023
32bfdff
[zh-TW] Added chapters 1-9 (#477)
ateliershen Mar 10, 2023
cff8856
finished review
tyisme614 Mar 10, 2023
1d92a90
Explain why there are more tokens, than reviews (#476)
pavel-nesterov Mar 10, 2023
92671bc
[RU] Subtitles for Chapter 1 of the video course (#489)
artyomboyko Mar 10, 2023
305f0b1
Review No.52
bon-qi Mar 10, 2023
5dfcc95
[ru] Added the glossary and translation guide (#490)
501Good Mar 10, 2023
d229ff7
[ru] Chapters 0 and 1 proofreading, updating and translating missing …
501Good Mar 10, 2023
33ace99
Review No.51
bon-qi Mar 11, 2023
40d54e0
Review No.53
bon-qi Mar 11, 2023
25c44d3
Review No.54
bon-qi Mar 11, 2023
b0c60c8
finished review
tyisme614 Mar 11, 2023
b81337f
modified translation
tyisme614 Mar 11, 2023
2435416
modified translation
tyisme614 Mar 11, 2023
ac8a0ab
modified subtitle
tyisme614 Mar 11, 2023
f1500d8
Merge branch 'main' of github.com:FYJNEVERFOLLOWS/huggingface-course
FYJNEVERFOLLOWS Mar 13, 2023
722dd7e
translated
FYJNEVERFOLLOWS Mar 13, 2023
0bc7dc0
Fix typo (#532)
jybarnes Mar 17, 2023
0ffbef4
review chapter4/2
gxy-gxy Mar 19, 2023
46ae77c
review chapter4/2
gxy-gxy Mar 19, 2023
2c4bea8
review chapter4/2
gxy-gxy Mar 19, 2023
b6a9632
Review 75
bon-qi Mar 22, 2023
3f6515a
Review No.20, need review some
bon-qi Mar 22, 2023
72c1f04
docs(zh-cn): Reviewed Chapter 7/1
jinyouzhi Mar 22, 2023
b77a8c5
Update 1.mdx
jinyouzhi Mar 22, 2023
7501ef4
Review No.22
bon-qi Mar 22, 2023
b1623c6
Review No.21 (need refinement)
bon-qi Mar 22, 2023
3c56c95
Review No.30, need review: 26 27 28 30 73 74
bon-qi Mar 23, 2023
007c858
Review 30 (good)
bon-qi Mar 23, 2023
e4f6434
Review 20
bon-qi Mar 23, 2023
c346c80
Review 21 (refine)
bon-qi Mar 23, 2023
996dc89
Review 21
bon-qi Mar 24, 2023
f832a4d
Review 22
bon-qi Mar 24, 2023
fe4c7d7
Review 26
bon-qi Mar 24, 2023
4140a21
Review 27
bon-qi Mar 24, 2023
238196c
Review 28
bon-qi Mar 24, 2023
0626ce6
Review 30
bon-qi Mar 24, 2023
071272c
Review 73
bon-qi Mar 24, 2023
1f3ab61
Review 74
bon-qi Mar 24, 2023
cfc456b
Fix typo
vhch Mar 28, 2023
0d13c12
Review 26-28, 42-54, 73-75
bon-qi Apr 1, 2023
21e6e6b
The GPT2 link is broken
tsureshkumar Apr 3, 2023
14067bb
typo in `Now your turn!` section
feeeper Apr 4, 2023
1d5471a
`chunk_size` should be used instead of `block_size`
feeeper Apr 9, 2023
98218c6
Merge pull request #542 from Vermillion-de/main
xianbaoqian Apr 10, 2023
f381a75
Merge pull request #534 from gxy-gxy/main
xianbaoqian Apr 10, 2023
af00e34
Merge pull request #536 from jinyouzhi/course_review
xianbaoqian Apr 10, 2023
9254fb6
Merge branch 'main' into 0313
xianbaoqian Apr 10, 2023
1db1185
Merge pull request #531 from FYJNEVERFOLLOWS/0313
xianbaoqian Apr 10, 2023
097427c
Merge pull request #529 from tyisme614/review_ep41
xianbaoqian Apr 10, 2023
632ac11
Merge pull request #528 from tyisme614/review_ep40
xianbaoqian Apr 10, 2023
acba72a
Merge pull request #527 from tyisme614/review_ep39
xianbaoqian Apr 10, 2023
a19892b
Merge pull request #525 from tyisme614/review_ep38
xianbaoqian Apr 10, 2023
17df4bf
Merge pull request #520 from zhangchaosd/main
xianbaoqian Apr 10, 2023
5b077d8
Merge pull request #522 from tyisme614/review_ep37
xianbaoqian Apr 10, 2023
f631679
Merge pull request #515 from tyisme614/review_ep35
xianbaoqian Apr 11, 2023
aca889a
Merge pull request #530 from tyisme614/optimize_en_ep41
xianbaoqian Apr 11, 2023
de2cb27
refactor: rephrase text to improve clarity and specificity
Pranav-Bobde Apr 24, 2023
e5a8fcf
Demo link fixes (#562)
MKhalusova May 10, 2023
cccc2c9
Bump release (#566)
lewtun May 10, 2023
9c44804
Revert "Bump release (#566)" (#567)
lewtun May 10, 2023
86f4396
updated documentation links
nnoboa Jun 4, 2023
0018bb4
[doc build] Use secrets (#581)
mishig25 Jun 9, 2023
ceef93d
docs: fix broken links
vipulaSD Jun 15, 2023
f6ded40
changed 'perspires' to 'persists' in chapter 1 quiz
abzdel Jun 22, 2023
333d7fe
Update 4.mdx
JieShenAI Jun 24, 2023
b3cfa9c
Update 4.mdx : Fix Typo
Aug 5, 2023
47427cf
Fix chapter1/5 old documentation links
osanseviero Aug 6, 2023
77df728
fix link
dawoodkhan82 Sep 19, 2023
2af4831
Update 2.mdx
Sookeyy-12 Sep 20, 2023
b88cae1
Update 2.mdx
Sookeyy-12 Sep 20, 2023
52ec6a9
Update 2.mdx
Sookeyy-12 Sep 20, 2023
812f00a
Update 2.mdx
Sookeyy-12 Sep 20, 2023
5a99fa2
Update 2.mdx
Sookeyy-12 Sep 20, 2023
0e84926
Update 2.mdx
Sookeyy-12 Sep 20, 2023
82e991b
Update 2.mdx
Sookeyy-12 Sep 20, 2023
7c0ee67
Update 2.mdx
Sookeyy-12 Sep 20, 2023
5cb46d9
Update 2.mdx
Sookeyy-12 Sep 20, 2023
9093131
Update 2.mdx
Sookeyy-12 Sep 20, 2023
e003d52
Update 2.mdx
Sookeyy-12 Sep 20, 2023
b299376
Update 2.mdx
Sookeyy-12 Sep 20, 2023
02e8159
Fix syntax in vi/chapter7/7.mdx
mishig25 Sep 24, 2023
f1345db
Merge pull request #618 from huggingface/mishig25-patch-3
mishig25 Sep 24, 2023
d1ff989
Merge pull request #614 from huggingface/blocks-events-link
mishig25 Sep 24, 2023
df6be57
Fixed the broken link to the loading datasets page
osanseviero Sep 25, 2023
adf9b1a
Remove `get_lr()` from logs which refers to nonexistent function
bwindsor22 Oct 3, 2023
e7d45fb
Update 4.mdx
paschembri Oct 15, 2023
5c0b7bb
Update en-version
paschembri Oct 15, 2023
12558d7
fix: remove useless token
rtrompier Oct 19, 2023
3423281
fix: remove useless token (#635)
rtrompier Oct 19, 2023
5828ea1
Translate Chapter 3 to Spanish (#510)
mariagrandury Oct 23, 2023
1983457
Translating Chapter 6 to Spanish (#523)
datacubeR Oct 23, 2023
0916d0a
Update 5.mdx
k3ybladewielder Oct 24, 2023
ce2b5e7
Merge pull request #586 from abzdel/patch-1
merveenoyan Nov 22, 2023
e6fadfa
Merge pull request #587 from JieShenAI/patch-2
merveenoyan Nov 22, 2023
1447b5b
Merge pull request #598 from SingularityGuy/SingularityGuy-patch-1
merveenoyan Nov 22, 2023
b004f50
Merge pull request #540 from vhch/main
merveenoyan Nov 22, 2023
df2fa29
Merge pull request #543 from tsureshkumar/patch-1
merveenoyan Nov 22, 2023
3d53e6f
Merge pull request #264 from kambizG/main
merveenoyan Nov 22, 2023
632b16e
Merge pull request #558 from Pranav-Bobde/patch-1
merveenoyan Nov 22, 2023
7aab892
Update doc CI (#643)
lewtun Dec 5, 2023
3def036
Committing current results.
artyomboyko Dec 12, 2023
2857120
Fix translation
osanseviero Dec 14, 2023
cfb9e62
Removed judgmental arguments
osanseviero Dec 14, 2023
1970e1e
Remove get_lr() from logs which refers to nonexistent function from b…
osanseviero Dec 14, 2023
5097c17
Committing the current state.
artyomboyko Dec 16, 2023
ab2a8b5
Fixing the transfer results for today.
artyomboyko Dec 17, 2023
11f30d1
Translated files 3b and partially 4. Fixing the result.
artyomboyko Dec 18, 2023
6450802
Fixing today's translation.
artyomboyko Dec 19, 2023
be9d376
fix typos in Spanish translation (#511)
mariagrandury Dec 20, 2023
198d352
Fixing today's translation. Files: 6.mdx, 7.mdx and half of 8.mdx.
artyomboyko Dec 20, 2023
16f9009
Merge pull request #544 from feeeper/patch-1
osanseviero Dec 20, 2023
60f7702
Merge pull request #551 from feeeper/patch-2
osanseviero Dec 20, 2023
4c8adfa
Merge pull request #582 from vipulaSD/patch-2
osanseviero Dec 20, 2023
c27def4
Merge branch 'huggingface:main' into main
artyomboyko Dec 21, 2023
3113713
The translation of chapter 6 has been completed.
artyomboyko Dec 21, 2023
2be3db1
Delete chapters/en/.ipynb_checkpoints/_toctree-checkpoint.yml
artyomboyko Dec 21, 2023
32c9ad0
Delete chapters/en/chapter5/.ipynb_checkpoints/8-checkpoint.mdx
artyomboyko Dec 21, 2023
72e6779
Delete chapters/en/chapter6/.ipynb_checkpoints/1-checkpoint.mdx
artyomboyko Dec 21, 2023
c2f871b
Delete chapters/en/chapter6/.ipynb_checkpoints/2-checkpoint.mdx
artyomboyko Dec 21, 2023
ce3ac4d
Delete chapters/en/chapter6/.ipynb_checkpoints/8-checkpoint.mdx
artyomboyko Dec 21, 2023
8f7520a
Delete chapters/en/chapter6/.ipynb_checkpoints/9-checkpoint.mdx
artyomboyko Dec 21, 2023
73855b0
Delete chapters/ru/.ipynb_checkpoints/TRANSLATING-checkpoint.txt
artyomboyko Dec 21, 2023
02395c1
Delete chapters/ru/.ipynb_checkpoints/_toctree-checkpoint.yml
artyomboyko Dec 21, 2023
849d5dd
Delete chapters/ru/chapter5/.ipynb_checkpoints/8-checkpoint.mdx
artyomboyko Dec 21, 2023
be33220
Update 10.mdx
artyomboyko Dec 21, 2023
e9552b0
Update 10.mdx
artyomboyko Dec 21, 2023
5cffa31
Update 10.mdx
artyomboyko Dec 21, 2023
d11fc34
Update chapters/ru/chapter6/4.mdx
artyomboyko Dec 22, 2023
ccbae71
Update chapters/ru/chapter6/4.mdx
artyomboyko Dec 22, 2023
22bde78
Update chapters/ru/chapter6/3.mdx
artyomboyko Dec 22, 2023
eaafdc5
Update chapters/ru/chapter6/3.mdx
artyomboyko Dec 22, 2023
ea57588
Update chapters/ru/chapter6/3b.mdx
artyomboyko Dec 22, 2023
0d34014
Update chapters/ru/chapter6/3.mdx
artyomboyko Dec 22, 2023
8a9bbbc
Update 3.mdx
artyomboyko Dec 22, 2023
b5b2da8
Update 7.mdx
artyomboyko Dec 22, 2023
c67bdb0
Update 3.mdx
artyomboyko Dec 22, 2023
4b4f711
Update chapters/ru/chapter6/3b.mdx
artyomboyko Dec 22, 2023
f00418f
Update chapters/ru/chapter6/5.mdx
artyomboyko Dec 25, 2023
2c733c2
Merge pull request #647 from blademoon/main
MKhalusova Jan 8, 2024
2bf5df9
Completed the translation of the first part of Chapter 7 into Russian.
artyomboyko Jan 9, 2024
3fda9eb
After run python utils/code_formatter.py
artyomboyko Jan 9, 2024
c624701
Update chapters/ru/chapter7/1.mdx
artyomboyko Jan 10, 2024
37cd612
Update chapters/ru/chapter7/2.mdx
artyomboyko Jan 10, 2024
7a87311
Update chapters/ru/chapter7/2.mdx
artyomboyko Jan 10, 2024
c437945
Update chapters/ru/chapter7/2.mdx
artyomboyko Jan 10, 2024
2626bc5
Update chapters/ru/chapter7/5.mdx
artyomboyko Jan 10, 2024
cb440a6
Update chapters/ru/chapter7/5.mdx
artyomboyko Jan 10, 2024
068217e
Update chapters/ru/chapter7/5.mdx
artyomboyko Jan 10, 2024
fa12024
Update 5.mdx
artyomboyko Jan 10, 2024
07373e6
Update chapters/ru/chapter7/4.mdx
artyomboyko Jan 10, 2024
95bda7c
Update 2.mdx
artyomboyko Jan 10, 2024
913a9b1
Update 2.mdx
artyomboyko Jan 10, 2024
5f5d4aa
Update 3.mdx
artyomboyko Jan 10, 2024
30ca446
Update 3.mdx
artyomboyko Jan 10, 2024
8c57d55
Update chapters/ru/chapter7/3.mdx
artyomboyko Jan 10, 2024
e1993b5
Update chapters/ru/chapter7/3.mdx
artyomboyko Jan 10, 2024
c2ad2f6
Update chapters/ru/chapter7/3.mdx
artyomboyko Jan 10, 2024
1a19575
Update 3.mdx
artyomboyko Jan 10, 2024
743ea97
Update chapters/ru/chapter7/3.mdx
artyomboyko Jan 10, 2024
184cceb
Update chapters/ru/chapter7/4.mdx
artyomboyko Jan 10, 2024
5eb4f4b
Update 4.mdx
artyomboyko Jan 10, 2024
e7c59a5
Update 5.mdx
artyomboyko Jan 10, 2024
e09613a
Update 5.mdx
artyomboyko Jan 11, 2024
489dcd7
Merge pull request #653 from blademoon/main
MKhalusova Jan 11, 2024
efbd59f
fixed links to other chapters
MKhalusova Jan 11, 2024
0b03f5c
fixed links to chapters' intros
MKhalusova Jan 11, 2024
8f0e044
I added myself to the Languages and translations table.
artyomboyko Jan 13, 2024
5da07e8
Deleted unnecessary folder automatically created by JupyterLab.
artyomboyko Jan 13, 2024
bc2832e
Merge pull request #658 from blademoon/main
MKhalusova Jan 15, 2024
4966980
Fix links to HF docs
mariosasko Jan 16, 2024
2898462
Merge pull request #660 from huggingface/fix-doc-links
mariosasko Jan 17, 2024
fd85628
Finalizing the translation of chapter 7.
artyomboyko Jan 18, 2024
0afa500
Update 6.mdx
artyomboyko Jan 18, 2024
4a5a73d
Update 7.mdx
artyomboyko Jan 18, 2024
386f429
Merge pull request #656 from MKhalusova/links-fix
MKhalusova Jan 19, 2024
818c74c
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
ee2e1ce
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
45a369c
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
5863024
Update chapters/ru/chapter7/7.mdx
artyomboyko Jan 19, 2024
074d4c5
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
3e9107e
Update chapters/ru/chapter7/7.mdx
artyomboyko Jan 19, 2024
8e74a57
Update chapters/ru/chapter7/7.mdx
artyomboyko Jan 19, 2024
18b0fdf
Update chapters/ru/chapter7/8.mdx
artyomboyko Jan 19, 2024
5bfa31b
Update 7.mdx
artyomboyko Jan 19, 2024
9af6080
Update 6.mdx
artyomboyko Jan 19, 2024
f667bea
Update chapters/ru/chapter7/7.mdx
artyomboyko Jan 19, 2024
844825d
Update 6.mdx
artyomboyko Jan 19, 2024
991c4ad
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
9ca1013
Update chapters/ru/chapter7/7.mdx
artyomboyko Jan 19, 2024
3540d5f
Update chapters/ru/chapter7/6.mdx
artyomboyko Jan 19, 2024
6c1f1f8
Merge pull request #661 from blademoon/main
MKhalusova Jan 19, 2024
9be58ed
Merge branch 'release' into bump_release
lewtun Jan 29, 2024
fac4d13
Fix style
lewtun Jan 29, 2024
2 changes: 1 addition & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -20,7 +20,7 @@ This repo contains the content that's used to create the **[Hugging Face course]
| [Japanese](https://huggingface.co/course/ja/chapter1/1) (WIP) | [`chapters/ja`](https://github.com/huggingface/course/tree/main/chapters/ja) | [@hiromu166](https://github.com/@hiromu166), [@younesbelkada](https://github.com/@younesbelkada), [@HiromuHota](https://github.com/@HiromuHota) |
| [Korean](https://huggingface.co/course/ko/chapter1/1) (WIP) | [`chapters/ko`](https://github.com/huggingface/course/tree/main/chapters/ko) | [@Doohae](https://github.com/Doohae), [@wonhyeongseo](https://github.com/wonhyeongseo), [@dlfrnaos19](https://github.com/dlfrnaos19), [@nsbg](https://github.com/nsbg) |
| [Portuguese](https://huggingface.co/course/pt/chapter1/1) (WIP) | [`chapters/pt`](https://github.com/huggingface/course/tree/main/chapters/pt) | [@johnnv1](https://github.com/johnnv1), [@victorescosta](https://github.com/victorescosta), [@LincolnVS](https://github.com/LincolnVS) |
-| [Russian](https://huggingface.co/course/ru/chapter1/1) (WIP) | [`chapters/ru`](https://github.com/huggingface/course/tree/main/chapters/ru) | [@pdumin](https://github.com/pdumin), [@svv73](https://github.com/svv73) |
+| [Russian](https://huggingface.co/course/ru/chapter1/1) (WIP) | [`chapters/ru`](https://github.com/huggingface/course/tree/main/chapters/ru) | [@pdumin](https://github.com/pdumin), [@svv73](https://github.com/svv73), [@blademoon](https://github.com/blademoon) |
| [Thai](https://huggingface.co/course/th/chapter1/1) (WIP) | [`chapters/th`](https://github.com/huggingface/course/tree/main/chapters/th) | [@peeraponw](https://github.com/peeraponw), [@a-krirk](https://github.com/a-krirk), [@jomariya23156](https://github.com/jomariya23156), [@ckingkan](https://github.com/ckingkan) |
| [Turkish](https://huggingface.co/course/tr/chapter1/1) (WIP) | [`chapters/tr`](https://github.com/huggingface/course/tree/main/chapters/tr) | [@tanersekmen](https://github.com/tanersekmen), [@mertbozkir](https://github.com/mertbozkir), [@ftarlaci](https://github.com/ftarlaci), [@akkasayaz](https://github.com/akkasayaz) |
| [Vietnamese](https://huggingface.co/course/vi/chapter1/1) | [`chapters/vi`](https://github.com/huggingface/course/tree/main/chapters/vi) | [@honghanhh](https://github.com/honghanhh) |
2 changes: 1 addition & 1 deletion chapters/de/chapter1/3.mdx
@@ -68,7 +68,7 @@ Wenn du einen Text an eine Pipeline übergibst, gibt es drei wichtige Schritte:
3. Die Vorhersagen des Modells werden so nachverarbeitet, sodass du sie nutzen kannst.


-Einige der derzeit [verfügbaren Pipelines](https://huggingface.co/transformers/main_classes/pipelines.html) sind:
+Einige der derzeit [verfügbaren Pipelines](https://huggingface.co/transformers/main_classes/pipelines) sind:

- `feature-extraction` (Vektordarstellung eines Textes erhalten)
- `fill-mask`
10 changes: 5 additions & 5 deletions chapters/de/chapter1/5.mdx
@@ -15,8 +15,8 @@ Rein Encoder-basierte Modelle eignen sich am besten für Aufgaben, die ein Verst

Zu dieser Modellfamilie gehören unter anderem:

-- [ALBERT](https://huggingface.co/transformers/model_doc/albert.html)
-- [BERT](https://huggingface.co/transformers/model_doc/bert.html)
-- [DistilBERT](https://huggingface.co/transformers/model_doc/distilbert.html)
-- [ELECTRA](https://huggingface.co/transformers/model_doc/electra.html)
-- [RoBERTa](https://huggingface.co/transformers/model_doc/roberta.html)
+- [ALBERT](https://huggingface.co/transformers/model_doc/albert)
+- [BERT](https://huggingface.co/transformers/model_doc/bert)
+- [DistilBERT](https://huggingface.co/transformers/model_doc/distilbert)
+- [ELECTRA](https://huggingface.co/transformers/model_doc/electra)
+- [RoBERTa](https://huggingface.co/transformers/model_doc/roberta)
6 changes: 3 additions & 3 deletions chapters/de/chapter1/6.mdx
@@ -15,7 +15,7 @@ Diese Modelle sind am besten für Aufgaben geeignet, bei denen es um die Generie

Zu dieser Modellfamilie gehören unter anderem:

-- [CTRL](https://huggingface.co/transformers/model_doc/ctrl.html)
+- [CTRL](https://huggingface.co/transformers/model_doc/ctrl)
- [GPT](https://huggingface.co/docs/transformers/model_doc/openai-gpt)
-- [GPT-2](https://huggingface.co/transformers/model_doc/gpt2.html)
-- [Transformer XL](https://huggingface.co/transformers/model_doc/transformerxl.html)
+- [GPT-2](https://huggingface.co/transformers/model_doc/gpt2)
+- [Transformer XL](https://huggingface.co/transformers/model_doc/transformerxl)
8 changes: 4 additions & 4 deletions chapters/de/chapter1/7.mdx
@@ -15,7 +15,7 @@ Sequence-to-Sequence-Modelle eignen sich am besten für Aufgaben, bei denen es d

Vertreter dieser Modellfamilie sind u. a.:

-- [BART](https://huggingface.co/transformers/model_doc/bart.html)
-- [mBART](https://huggingface.co/transformers/model_doc/mbart.html)
-- [Marian](https://huggingface.co/transformers/model_doc/marian.html)
-- [T5](https://huggingface.co/transformers/model_doc/t5.html)
+- [BART](https://huggingface.co/transformers/model_doc/bart)
+- [mBART](https://huggingface.co/transformers/model_doc/mbart)
+- [Marian](https://huggingface.co/transformers/model_doc/marian)
+- [T5](https://huggingface.co/transformers/model_doc/t5)
2 changes: 1 addition & 1 deletion chapters/de/chapter3/2.mdx
@@ -235,7 +235,7 @@ tokenized_dataset = tokenizer(

Das funktioniert gut, hat aber den Nachteil, dass ein Dictionary zurückgegeben wird (mit unseren Schlüsselwörtern `input_ids`, `attention_mask` und `token_type_ids` und Werten aus Listen von Listen). Es funktioniert auch nur, wenn du genügend RAM hast, um den gesamten Datensatz während der Tokenisierung zu im RAM zwischen zu speichern (während die Datensätze aus der Bibliothek 🤗 Datasets [Apache Arrow](https://arrow.apache.org/) Dateien sind, die auf der Festplatte gespeichert sind, sodass nur die gewünschten Samples im RAM geladen sind).

-Um die Daten als Datensatz zu speichern, verwenden wir die Methode [`Dataset.map()`](https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map). Dies gewährt uns zusätzliche Flexibilität, wenn wir zusätzliche Vorverarbeitung als nur die Tokenisierung benötigen. Die `map()`-Methode funktioniert, indem sie eine Funktion auf jedes Element des Datensatzes anwendet, also definieren wir eine Funktion, die unsere Inputs tokenisiert:
+Um die Daten als Datensatz zu speichern, verwenden wir die Methode [`Dataset.map()`](https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.map). Dies gewährt uns zusätzliche Flexibilität, wenn wir zusätzliche Vorverarbeitung als nur die Tokenisierung benötigen. Die `map()`-Methode funktioniert, indem sie eine Funktion auf jedes Element des Datensatzes anwendet, also definieren wir eine Funktion, die unsere Inputs tokenisiert:

```py
def tokenize_function(example):
4 changes: 2 additions & 2 deletions chapters/de/chapter4/2.mdx
@@ -65,7 +65,7 @@ tokenizer = CamembertTokenizer.from_pretrained("camembert-base")
model = CamembertForMaskedLM.from_pretrained("camembert-base")
```

-Dennoch empfehlen wir, dass man die [`Auto*` classes](https://huggingface.co/transformers/model_doc/auto.html?highlight=auto#auto-classes) stattdessen benutzt, da diese architekturunabhängig sind. Das vorherige Code-Beispiel gilt nur für Checkpoints, die in die CamemBERT Architektur zu laden sind, aber mit den `Auto*` Klassen kann man Checkpoints ziemlich einfach tauschen:
+Dennoch empfehlen wir, dass man die [`Auto*` classes](https://huggingface.co/transformers/model_doc/auto?highlight=auto#auto-classes) stattdessen benutzt, da diese architekturunabhängig sind. Das vorherige Code-Beispiel gilt nur für Checkpoints, die in die CamemBERT Architektur zu laden sind, aber mit den `Auto*` Klassen kann man Checkpoints ziemlich einfach tauschen:

```py
from transformers import AutoTokenizer, AutoModelForMaskedLM
@@ -81,7 +81,7 @@ tokenizer = CamembertTokenizer.from_pretrained("camembert-base")
model = TFCamembertForMaskedLM.from_pretrained("camembert-base")
```

-Hier empfehlen wir auch, dass man stattdessen die [`TFAuto*` classes](https://huggingface.co/transformers/model_doc/auto.html?highlight=auto#auto-classes) benutzt, da diese architekturunabhängig sind. Das vorherige Code-Beispiel gilt nur für Checkpoints, die in die CamemBERT Architektur zu laden sind, aber mit den `TFAuto*` Klassen kann man Checkpoints einfach tauschen:
+Hier empfehlen wir auch, dass man stattdessen die [`TFAuto*` classes](https://huggingface.co/transformers/model_doc/auto?highlight=auto#auto-classes) benutzt, da diese architekturunabhängig sind. Das vorherige Code-Beispiel gilt nur für Checkpoints, die in die CamemBERT Architektur zu laden sind, aber mit den `TFAuto*` Klassen kann man Checkpoints einfach tauschen:

```py
from transformers import AutoTokenizer, TFAutoModelForMaskedLM
2 changes: 1 addition & 1 deletion chapters/en/chapter1/3.mdx
@@ -68,7 +68,7 @@ There are three main steps involved when you pass some text to a pipeline:
3. The predictions of the model are post-processed, so you can make sense of them.


-Some of the currently [available pipelines](https://huggingface.co/transformers/main_classes/pipelines.html) are:
+Some of the currently [available pipelines](https://huggingface.co/transformers/main_classes/pipelines) are:

- `feature-extraction` (get the vector representation of a text)
- `fill-mask`
6 changes: 3 additions & 3 deletions chapters/en/chapter1/6.mdx
@@ -15,7 +15,7 @@ These models are best suited for tasks involving text generation.

Representatives of this family of models include:

-- [CTRL](https://huggingface.co/transformers/model_doc/ctrl.html)
+- [CTRL](https://huggingface.co/transformers/model_doc/ctrl)
- [GPT](https://huggingface.co/docs/transformers/model_doc/openai-gpt)
-- [GPT-2](https://huggingface.co/transformers/model_doc/gpt2.html)
-- [Transformer XL](https://huggingface.co/transformers/model_doc/transfo-xl.html)
+- [GPT-2](https://huggingface.co/transformers/model_doc/gpt2)
+- [Transformer XL](https://huggingface.co/transformers/model_doc/transfo-xl)
8 changes: 4 additions & 4 deletions chapters/en/chapter1/7.mdx
@@ -15,7 +15,7 @@ Sequence-to-sequence models are best suited for tasks revolving around generatin

Representatives of this family of models include:

-- [BART](https://huggingface.co/transformers/model_doc/bart.html)
-- [mBART](https://huggingface.co/transformers/model_doc/mbart.html)
-- [Marian](https://huggingface.co/transformers/model_doc/marian.html)
-- [T5](https://huggingface.co/transformers/model_doc/t5.html)
+- [BART](https://huggingface.co/transformers/model_doc/bart)
+- [mBART](https://huggingface.co/transformers/model_doc/mbart)
+- [Marian](https://huggingface.co/transformers/model_doc/marian)
+- [T5](https://huggingface.co/transformers/model_doc/t5)
2 changes: 1 addition & 1 deletion chapters/en/chapter3/2.mdx
@@ -235,7 +235,7 @@ tokenized_dataset = tokenizer(

This works well, but it has the disadvantage of returning a dictionary (with our keys, `input_ids`, `attention_mask`, and `token_type_ids`, and values that are lists of lists). It will also only work if you have enough RAM to store your whole dataset during the tokenization (whereas the datasets from the 🤗 Datasets library are [Apache Arrow](https://arrow.apache.org/) files stored on the disk, so you only keep the samples you ask for loaded in memory).

-To keep the data as a dataset, we will use the [`Dataset.map()`](https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map) method. This also allows us some extra flexibility, if we need more preprocessing done than just tokenization. The `map()` method works by applying a function on each element of the dataset, so let's define a function that tokenizes our inputs:
+To keep the data as a dataset, we will use the [`Dataset.map()`](https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.map) method. This also allows us some extra flexibility, if we need more preprocessing done than just tokenization. The `map()` method works by applying a function on each element of the dataset, so let's define a function that tokenizes our inputs:

```py
def tokenize_function(example):
4 changes: 2 additions & 2 deletions chapters/en/chapter4/2.mdx
@@ -65,7 +65,7 @@ tokenizer = CamembertTokenizer.from_pretrained("camembert-base")
model = CamembertForMaskedLM.from_pretrained("camembert-base")
```

-However, we recommend using the [`Auto*` classes](https://huggingface.co/transformers/model_doc/auto.html?highlight=auto#auto-classes) instead, as these are by design architecture-agnostic. While the previous code sample limits users to checkpoints loadable in the CamemBERT architecture, using the `Auto*` classes makes switching checkpoints simple:
+However, we recommend using the [`Auto*` classes](https://huggingface.co/transformers/model_doc/auto?highlight=auto#auto-classes) instead, as these are by design architecture-agnostic. While the previous code sample limits users to checkpoints loadable in the CamemBERT architecture, using the `Auto*` classes makes switching checkpoints simple:

```py
from transformers import AutoTokenizer, AutoModelForMaskedLM
@@ -81,7 +81,7 @@ tokenizer = CamembertTokenizer.from_pretrained("camembert-base")
model = TFCamembertForMaskedLM.from_pretrained("camembert-base")
```

-However, we recommend using the [`TFAuto*` classes](https://huggingface.co/transformers/model_doc/auto.html?highlight=auto#auto-classes) instead, as these are by design architecture-agnostic. While the previous code sample limits users to checkpoints loadable in the CamemBERT architecture, using the `TFAuto*` classes makes switching checkpoints simple:
+However, we recommend using the [`TFAuto*` classes](https://huggingface.co/transformers/model_doc/auto?highlight=auto#auto-classes) instead, as these are by design architecture-agnostic. While the previous code sample limits users to checkpoints loadable in the CamemBERT architecture, using the `TFAuto*` classes makes switching checkpoints simple:

```py
from transformers import AutoTokenizer, TFAutoModelForMaskedLM
2 changes: 1 addition & 1 deletion chapters/en/chapter4/3.mdx
@@ -178,7 +178,7 @@ Click on the "Files and versions" tab, and you should see the files visible in t

</Tip>

-As you've seen, the `push_to_hub()` method accepts several arguments, making it possible to upload to a specific repository or organization namespace, or to use a different API token. We recommend you take a look at the method specification available directly in the [🤗 Transformers documentation](https://huggingface.co/transformers/model_sharing.html) to get an idea of what is possible.
+As you've seen, the `push_to_hub()` method accepts several arguments, making it possible to upload to a specific repository or organization namespace, or to use a different API token. We recommend you take a look at the method specification available directly in the [🤗 Transformers documentation](https://huggingface.co/transformers/model_sharing) to get an idea of what is possible.

The `push_to_hub()` method is backed by the [`huggingface_hub`](https://github.com/huggingface/huggingface_hub) Python package, which offers a direct API to the Hugging Face Hub. It's integrated within 🤗 Transformers and several other machine learning libraries, like [`allenlp`](https://github.com/allenai/allennlp). Although we focus on the 🤗 Transformers integration in this chapter, integrating it into your own code or library is simple.

4 changes: 2 additions & 2 deletions chapters/en/chapter5/2.mdx
@@ -128,7 +128,7 @@ This is exactly what we wanted. Now, we can apply various preprocessing techniqu

<Tip>

-The `data_files` argument of the `load_dataset()` function is quite flexible and can be either a single file path, a list of file paths, or a dictionary that maps split names to file paths. You can also glob files that match a specified pattern according to the rules used by the Unix shell (e.g., you can glob all the JSON files in a directory as a single split by setting `data_files="*.json"`). See the 🤗 Datasets [documentation](https://huggingface.co/docs/datasets/loading.html#local-and-remote-files) for more details.
+The `data_files` argument of the `load_dataset()` function is quite flexible and can be either a single file path, a list of file paths, or a dictionary that maps split names to file paths. You can also glob files that match a specified pattern according to the rules used by the Unix shell (e.g., you can glob all the JSON files in a directory as a single split by setting `data_files="*.json"`). See the 🤗 Datasets [documentation](https://huggingface.co/docs/datasets/loading#local-and-remote-files) for more details.

</Tip>

@@ -160,7 +160,7 @@ This returns the same `DatasetDict` object obtained above, but saves us the step

<Tip>

-✏️ **Try it out!** Pick another dataset hosted on GitHub or the [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/index.php) and try loading it both locally and remotely using the techniques introduced above. For bonus points, try loading a dataset that's stored in a CSV or text format (see the [documentation](https://huggingface.co/docs/datasets/loading.html#local-and-remote-files) for more information on these formats).
+✏️ **Try it out!** Pick another dataset hosted on GitHub or the [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/index.php) and try loading it both locally and remotely using the techniques introduced above. For bonus points, try loading a dataset that's stored in a CSV or text format (see the [documentation](https://huggingface.co/docs/datasets/loading#local-and-remote-files) for more information on these formats).

</Tip>
