Rust & Python - Update CHANGELOGs
@@ -3,6 +3,7 @@
Fixes:

- Some default tokens were missing from `BertWordPieceTokenizer` (cf [#160](https://github.com/huggingface/tokenizers/issues/160))

# v0.5.2

- Do not open all files directly while training ([#163](https://github.com/huggingface/tokenizers/issues/163))

## Fixes:

- We introduced a bug related to the saving of the WordPiece model in 0.5.2: The `vocab.txt` file was named
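
For the `BertWordPieceTokenizer` entry above ([#160](https://github.com/huggingface/tokenizers/issues/160)), a minimal sanity check is sketched below. It is not part of this commit; it assumes the 0.x Python bindings, a local BERT-style `vocab.txt`, and the `token_to_id`/`encode` helpers exposed by the implementation wrappers.

```python
# Hypothetical sanity check for the #160 fix; not part of this commit.
# Assumes the tokenizers 0.x Python bindings and a local BERT-style vocab.txt.
from tokenizers import BertWordPieceTokenizer

tokenizer = BertWordPieceTokenizer("vocab.txt", lowercase=True)

# With the default special tokens restored, each of these should map to an id.
for token in ["[UNK]", "[SEP]", "[CLS]", "[PAD]", "[MASK]"]:
    print(token, "->", tokenizer.token_to_id(token))  # expect an integer, not None

# The BERT post-processing should also wrap a plain encode in [CLS] ... [SEP].
print(tokenizer.encode("hello world").tokens)
```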
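
The training entry ([#163](https://github.com/huggingface/tokenizers/issues/163)) concerns how input files are consumed during `train`. The sketch below only illustrates that call path; the file paths are made up, and the exact `train`/`save` argument names vary across 0.x releases, so treat them as assumptions.

```python
# Hypothetical training sketch around the #163 change; file paths are made up.
# The train()/save() argument names are assumptions based on the 0.x bindings.
from tokenizers import BertWordPieceTokenizer

tokenizer = BertWordPieceTokenizer(lowercase=True)
tokenizer.train(
    files=["data/part-000.txt", "data/part-001.txt"],  # hypothetical corpus shards
    vocab_size=30000,
    min_frequency=2,
)

# Saving the trained WordPiece model is where the 0.5.2 vocab-file naming bug
# surfaced; the directory/name arguments here are assumptions.
tokenizer.save("out", "bert-wordpiece")
```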