Mirror of https://github.com/mii443/tokenizers.git, synced 2025-08-22 16:25:30 +00:00
[MINOR:TYPO] Fix docstrings (#1653)
* [MINOR:TYPO] Update pre_tokenizers.rs
* [MINOR:TYPO] Update __init__.pyi
committed by GitHub
parent 5e223ceb48
commit 57884ebaa2
@@ -421,7 +421,7 @@ class Split(PreTokenizer):

     Args:
         pattern (:obj:`str` or :class:`~tokenizers.Regex`):
-            A pattern used to split the string. Usually a string or a a regex built with `tokenizers.Regex`.
+            A pattern used to split the string. Usually a string or a regex built with `tokenizers.Regex`.
             If you want to use a regex pattern, it has to be wrapped around a `tokenizer.Regex`,
             otherwise we consider is as a string pattern. For example `pattern="|"`
             means you want to split on `|` (imagine a csv file for example), while
@@ -334,7 +334,7 @@ impl PyWhitespaceSplit {
 ///
 /// Args:
 ///     pattern (:obj:`str` or :class:`~tokenizers.Regex`):
-///         A pattern used to split the string. Usually a string or a a regex built with `tokenizers.Regex`.
+///         A pattern used to split the string. Usually a string or a regex built with `tokenizers.Regex`.
 ///         If you want to use a regex pattern, it has to be wrapped around a `tokenizer.Regex`,
 ///         otherwise we consider is as a string pattern. For example `pattern="|"`
 ///         means you want to split on `|` (imagine a csv file for example), while
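For context, the docstring being fixed describes how `Split` interprets its `pattern` argument: a plain string is matched literally, while a regular expression must be wrapped in `tokenizers.Regex`. A minimal sketch of that distinction, assuming the `tokenizers` Python package with its `pre_tokenizers.Split` and `Regex` bindings:

# Sketch only: illustrates the string-vs-Regex distinction the docstring describes.
from tokenizers import Regex
from tokenizers.pre_tokenizers import Split

# A plain string pattern is taken literally: split on the "|" character.
literal_split = Split(pattern="|", behavior="removed")
print(literal_split.pre_tokenize_str("a|b|c"))
# [('a', (0, 1)), ('b', (2, 3)), ('c', (4, 5))]

# To split with a regular expression, wrap it in tokenizers.Regex;
# here "1|2" matches either digit rather than the literal text "1|2".
regex_split = Split(pattern=Regex("1|2"), behavior="removed")
print(regex_split.pre_tokenize_str("a1b2c"))
# [('a', (0, 1)), ('b', (2, 3)), ('c', (4, 5))]

Here behavior="removed" simply drops the matched delimiter; the expected outputs are shown as comments.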