Mirror of https://github.com/mii443/tokenizers.git, synced 2025-08-22 16:25:30 +00:00
Typo in README.md
an -> a
````diff
@@ -59,7 +59,7 @@ Encoding(num_tokens=13, attributes=[ids, type_ids, tokens, offsets, attention_ma
 '😁'
 ```

-And training an new vocabulary is just as easy:
+And training a new vocabulary is just as easy:

 ```python
 # You can also train a BPE/Byte-levelBPE/WordPiece vocabulary on your own files
````
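The hunk above sits in the part of the README that introduces training a vocabulary on your own files. As illustration only, a training call in that spirit might look like the sketch below; it assumes the HuggingFace `tokenizers` Python package, and the corpus file name and vocabulary size are made up for the example rather than taken from the README:

```python
# Minimal sketch, assuming the `tokenizers` Python package is installed and
# that `my_corpus.txt` (an illustrative file name) is a plain-text corpus.
from tokenizers import ByteLevelBPETokenizer

tokenizer = ByteLevelBPETokenizer()

# Train a byte-level BPE vocabulary on your own files.
tokenizer.train(files=["my_corpus.txt"], vocab_size=20_000, min_frequency=2)

# Encode a sentence with the freshly trained vocabulary.
encoding = tokenizer.encode("Hello, y'all! How are you 😁 ?")
print(encoding.tokens)
```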