mii/tokenizers
mirror of https://github.com/mii443/tokenizers.git synced 2025-08-25 01:29:23 +00:00
1,437 Commits 1 Branch 0 Tags
57200144ca2a5940a156d751ee37fe8fd1dcd85c
Commit Graph

4 Commits

Author SHA1 Message Date
Anthony MOI
d94fa220b6 Python - Add train_from_iterator to implementations 2021-01-07 09:02:20 -05:00
Nicolas Patry
98a30eead1 Temp work to make the APIs uniform (build from memory by default). 2020-09-24 08:57:02 +02:00
Anthony MOI
aa3b39f692 Python - Tests for parallelism with multiprocessing (Co-authored-by: Evan Pete Walsh <epwalsh10@gmail.com>) 2020-06-23 11:25:39 -04:00
Anthony MOI
837791ee1f Python - Test BertWordPieceTokenizer 2020-04-01 17:25:56 -04:00
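
The commits above track the Python bindings of the tokenizers library; commit d94fa220b6 adds train_from_iterator to the tokenizer implementations, so a vocabulary can be built from an in-memory iterable instead of files on disk. A minimal sketch of how that API is typically used with BertWordPieceTokenizer (the example corpus and keyword-argument values are illustrative and may differ by library version):

    from tokenizers import BertWordPieceTokenizer

    # Any iterable of raw text works; here a small in-memory corpus.
    corpus = [
        "Tokenizers can be trained directly from an iterator.",
        "No intermediate files are required.",
    ]

    tokenizer = BertWordPieceTokenizer(lowercase=True)

    # Build the WordPiece vocabulary from the iterator; vocab_size and
    # min_frequency are example values, not defaults from the source.
    tokenizer.train_from_iterator(corpus, vocab_size=1000, min_frequency=1)

    print(tokenizer.encode("Tokenizers can be trained from memory.").tokens)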