Repository: mii/tokenizers
Mirror of https://github.com/mii443/tokenizers.git (synced 2025-12-07 13:18:31 +00:00)
Path: tokenizers/bindings/python/tokenizers
Commit: 2a84ef12cf877f40a2e2ef9f39b11625c62124af
Latest commit: Anthony MOI, 2a84ef12cf, "Python - Add missing get_vocab from BaseTokenizer" (2020-04-01 11:32:54 -04:00)
Name            | Last commit                                                                                      | Date
decoders        | Python - Bindings for Wordpiece decoder's cleanup                                                | 2020-02-13 17:50:37 -05:00
implementations | Python - Add missing get_vocab from BaseTokenizer                                                | 2020-04-01 11:32:54 -04:00
models          | TokenizedSequence / TokenizedSequenceWithOffsets needs to be declared in .py files not only .pyi | 2020-03-26 15:42:45 -04:00
normalizers     | Python - Update bindings                                                                         | 2020-03-09 18:37:03 -04:00
pre_tokenizers  | Python - Update bindings                                                                         | 2020-03-09 18:37:03 -04:00
processors      | Keep default values as true                                                                      | 2020-03-10 12:58:53 -04:00
trainers        | Python - Add bindings for new AddedToken options                                                 | 2020-03-24 20:58:45 -04:00
__init__.py     | Bump version for Python release                                                                  | 2020-03-31 14:25:47 -04:00
__init__.pyi    | Python - expost get_vocab on Tokenizer                                                           | 2020-03-27 11:53:18 -04:00
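The submodules listed above make up the Python binding package: implementations holds the high-level tokenizer classes, while decoders, models, normalizers, pre_tokenizers, processors, and trainers bind the corresponding pipeline components. As a minimal sketch of how this layout is typically used (not taken from this repository; it assumes the tokenizers package is installed and that a local vocab.txt file exists), the get_vocab method referenced in the recent commit messages can be called on one of the implementation classes:

    # Minimal usage sketch. Assumptions: the `tokenizers` package is installed
    # and "vocab.txt" is a local WordPiece vocabulary file.
    from tokenizers import BertWordPieceTokenizer  # re-exported from tokenizers.implementations

    tokenizer = BertWordPieceTokenizer("vocab.txt", lowercase=True)

    # get_vocab() (the subject of the latest commits above) returns the
    # token-to-id mapping known to the tokenizer.
    vocab = tokenizer.get_vocab()
    print(len(vocab))

    # Encoding runs through the pipeline implemented by the sibling
    # submodules (normalizers, pre_tokenizers, models, processors).
    encoding = tokenizer.encode("Hello, tokenizers!")
    print(encoding.tokens, encoding.ids)

The class name, file name, and options here are illustrative only; they show the general shape of the API exposed by this directory rather than anything specific to the commit listed above.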