[remove black] And use ruff (#1436)
* nits
* Fixing deps.
* Ruff update.
* Import order matters.
* Fix.
* Revert ruff fix.
* Visualizer.
* Putting back the imports.

---------

Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
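As context for "Import order matters": ruff ships isort-compatible import sorting (its `I` rule group), so adopting it in place of black also normalizes import order. A minimal sketch of the grouping convention those rules enforce, using hypothetical module names:

```python
# isort-style grouping: standard library first, then third-party,
# then local/relative imports, alphabetized within each group.
# Module names below are illustrative, not taken from this commit.
import os
import sys

import pytest

from .helpers import load_fixture
```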
```diff
@@ -1,5 +1,3 @@
 import pytest
-
 from tokenizers import BertWordPieceTokenizer
-
 from ..utils import bert_files, data_dir, multiprocessing_with_parallelism
```
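For reference, the import block after this change reads as follows (a sketch reconstructed from the hunk above; the test file's name is not shown in this excerpt):

```python
import pytest
from tokenizers import BertWordPieceTokenizer
from ..utils import bert_files, data_dir, multiprocessing_with_parallelism
```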