Python - Replace last BPETokenizer occurrences

Anthony MOI
2020-02-18 16:25:59 -05:00
parent f263d7651f
commit 5daf1eea86
4 changed files with 8 additions and 8 deletions

@@ -42,7 +42,7 @@ Start using in a matter of seconds:
 ```python
 # Tokenizers provides ultra-fast implementations of most current tokenizers:
 >>> from tokenizers import (ByteLevelBPETokenizer,
-                            BPETokenizer,
+                            CharBPETokenizer,
                             SentencePieceBPETokenizer,
                             BertWordPieceTokenizer)
 # Ultra-fast => they can encode 1GB of text in ~20sec on a standard server's CPU
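For readers following the rename, here is a minimal sketch of the renamed class in use, assuming the tokenizers Python API of this era; the corpus path and vocabulary size below are hypothetical placeholders:

```python
from tokenizers import CharBPETokenizer  # formerly imported as BPETokenizer

# Instantiate a character-level BPE tokenizer and train it from scratch
# (the file path and vocab_size are placeholder values)
tokenizer = CharBPETokenizer()
tokenizer.train(["path/to/corpus.txt"], vocab_size=30000)

# Encode a sentence and inspect the resulting tokens
encoding = tokenizer.encode("Tokenizers are fast!")
print(encoding.tokens)
```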