mii/tokenizers (mirror of https://github.com/mii443/tokenizers.git)
tokenizers/bindings/python/py_src/tokenizers @ 1a6f4b52042eec72d008d80187fb66c107a8978d

Latest commit: 1a6f4b5204 "Allow initial_alphabet on UnigramTrainer" by Anthony MOI, 2020-10-26 10:57:29 -04:00
Name              Last commit message                                                  Last commit date
decoders/         Black pre-commit after rebase.                                       2020-09-24 08:57:02 +02:00
implementations/  Fixed BPE.read_files -> BPE.read_file in SentencePieceBPETokenizer  2020-10-26 10:57:14 -04:00
models/           Upgrading to black 20.8b1                                            2020-09-24 09:27:30 -04:00
normalizers/      Black pre-commit after rebase.                                       2020-09-24 08:57:02 +02:00
pre_tokenizers/   Finish exposing the UnicodeScripts PreTokenizer                      2020-10-21 11:01:54 -04:00
processors/       Update __init__.pyi                                                  2020-09-29 10:09:10 -04:00
trainers/         Allow initial_alphabet on UnigramTrainer                             2020-10-26 10:57:29 -04:00
__init__.py       Python - Update CHANGELOG and bump to 0.9.2 for release              2020-10-15 10:14:58 -04:00
__init__.pyi      Upgrading to black 20.8b1                                            2020-09-24 09:27:30 -04:00
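
The subdirectories above correspond to the public submodules of the tokenizers Python package. The following is a rough, non-authoritative sketch of how they fit together around the time of this commit; the corpus file, vocabulary size, and special tokens are placeholders, and the exact train() signature varies slightly across releases.

# Sketch only: module names match the directories listed above;
# corpus.txt, vocab_size, and special_tokens are placeholder values.
from tokenizers import Tokenizer, decoders, models, normalizers, pre_tokenizers, trainers

tokenizer = Tokenizer(models.Unigram())               # models/: Unigram, BPE, WordPiece, ...
tokenizer.normalizer = normalizers.NFKC()             # normalizers/: text normalization
tokenizer.pre_tokenizer = pre_tokenizers.Metaspace()  # pre_tokenizers/: e.g. Metaspace, UnicodeScripts
tokenizer.decoder = decoders.Metaspace()              # decoders/: turn pieces back into text

# trainers/: UnigramTrainer gained the initial_alphabet option in the
# commit shown at the top of this listing.
trainer = trainers.UnigramTrainer(
    vocab_size=8000,
    special_tokens=["<unk>"],
    initial_alphabet=["a", "b", "c"],
)

tokenizer.train(files=["corpus.txt"], trainer=trainer)
tokenizer.save("unigram.json")

# implementations/ bundles ready-made recipes such as SentencePieceBPETokenizer,
# and processors/ holds post-processors (e.g. TemplateProcessing) that add
# special tokens around encoded sequences.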