mirror of https://github.com/mii443/tokenizers.git
synced 2025-08-23 00:35:35 +00:00
fix filelink (#1610)
@@ -28,7 +28,7 @@ versatility.
 - Does all the pre-processing: Truncate, Pad, add the special tokens your model needs.
 
 ## Performances
 
-Performances can vary depending on hardware, but running the [~/bindings/python/benches/test_tiktoken.py](https://github.com/huggingface/tokenizers/bindings/python/benches/test_tiktoken.py) should give the following on a g6 aws instance:
+Performances can vary depending on hardware, but running the [~/bindings/python/benches/test_tiktoken.py](bindings/python/benches/test_tiktoken.py) should give the following on a g6 aws instance:
 
 ![image](https://github.com/user-attachments/assets/2b913d4b-e488-4cbc-b542-f90a6c40643d)
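The README context touched by this commit claims the tokenizer "does all the pre-processing: Truncate, Pad, add the special tokens your model needs." A minimal sketch of that behavior, assuming the `tokenizers` Python package is installed (the tiny training corpus, token names, and lengths here are illustrative, not from the source):

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Train a throwaway BPE tokenizer on a two-line corpus (illustrative only).
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(special_tokens=["[UNK]", "[PAD]"])
tokenizer.train_from_iterator(["hello world", "hello tokenizers"], trainer)

# Pre-processing handled by the tokenizer itself: truncate long inputs,
# pad short ones to a fixed length with a special [PAD] token.
tokenizer.enable_truncation(max_length=8)
tokenizer.enable_padding(
    pad_id=tokenizer.token_to_id("[PAD]"), pad_token="[PAD]", length=8
)

out = tokenizer.encode("hello world")
print(out.tokens)          # real tokens followed by [PAD] entries
print(out.attention_mask)  # 1 for real tokens, 0 for padding
```

The encoded output always has the configured length, so batches can be stacked into a tensor without any extra padding logic in user code.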