Update README.md - Broken link (#1272)
* Update README.md - Broken link

  Fixed the "python documentation" link

* Update README.md

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>

---------

Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
Diff of `README.md`:

````diff
@@ -70,6 +70,7 @@ print(output.tokens)
 # ["Hello", ",", "y", "'", "all", "!", "How", "are", "you", "[UNK]", "?"]
 ```
 
-Check the [python documentation](https://huggingface.co/docs/tokenizers/python/latest) or the
+Check the [python documentation](https://huggingface.co/docs/tokenizers/index) or the
+
 [python quicktour](https://huggingface.co/docs/tokenizers/python/latest/quicktour.html) to learn
 more!
````
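For context, the README snippet surrounding this hunk prints the tokens produced by encoding a sentence. A minimal sketch of how that output is obtained, assuming a tokenizer trained and saved as in the quicktour (the file name `tokenizer-wiki.json` is the quicktour's example, not something this commit defines):

```python
from tokenizers import Tokenizer

# Load a tokenizer previously trained and saved following the quicktour.
tokenizer = Tokenizer.from_file("tokenizer-wiki.json")

output = tokenizer.encode("Hello, y'all! How are you 😁 ?")
print(output.tokens)
# The emoji has no learned token, so it maps to the unknown token:
# ["Hello", ",", "y", "'", "all", "!", "How", "are", "you", "[UNK]", "?"]
```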