mirror of https://github.com/mii443/tokenizers.git (synced 2025-08-22 16:25:30 +00:00)
Add @property docs
@@ -5,6 +5,11 @@
## AddedToken[[tokenizers.AddedToken]]

[[autodoc]] tokenizers.AddedToken
    - content
    - lstrip
    - normalized
    - rstrip
    - single_word
</python>

<rust>

The Rust API Reference is available directly on the [Docs.rs](https://docs.rs/tokenizers/latest/tokenizers/) website.
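The commit title is "Add @property docs": each hunk adds a class's `@property` members to an `[[autodoc]]` directive so the doc generator renders them. As a rough stdlib sketch of what such a generator collects (the `AddedToken` stand-in class and the `list_properties` helper are hypothetical illustrations, not the library's code):

```python
import inspect


class AddedToken:
    """Hypothetical stand-in for tokenizers.AddedToken."""

    def __init__(self, content, lstrip=False):
        self._content = content
        self._lstrip = lstrip

    @property
    def content(self):
        """The text content of the token."""
        return self._content

    @property
    def lstrip(self):
        """Whether left-hand whitespace is stripped when matching."""
        return self._lstrip


def list_properties(cls):
    # Collect (name, docstring) pairs for every @property on the class;
    # this is the information an autodoc member list renders.
    return [
        (name, member.__doc__)
        for name, member in inspect.getmembers(cls)
        if isinstance(member, property)
    ]


for name, doc in list_properties(AddedToken):
    print(f"- {name}: {doc}")
```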
@@ -5,6 +5,18 @@
## Encoding[[tokenizers.Encoding]]

[[autodoc]] tokenizers.Encoding
    - all
    - attention_mask
    - ids
    - n_sequences
    - offsets
    - overflowing
    - sequence_ids
    - special_tokens_mask
    - tokens
    - type_ids
    - word_ids
    - words
</python>

<rust>

The Rust API Reference is available directly on the [Docs.rs](https://docs.rs/tokenizers/latest/tokenizers/) website.
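A minimal sketch of a few of the `Encoding` properties listed above, assuming the `tokenizers` package is installed; the toy word-level vocabulary is purely illustrative.

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel
from tokenizers.pre_tokenizers import Whitespace

# Tiny illustrative vocabulary; real tokenizers are trained or loaded.
vocab = {"hello": 0, "world": 1, "[UNK]": 2}
tokenizer = Tokenizer(WordLevel(vocab, unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

enc = tokenizer.encode("hello world")
print(enc.tokens)          # the produced tokens
print(enc.ids)             # their vocabulary ids
print(enc.attention_mask)  # 1 for every real token here (no padding)
print(enc.n_sequences)     # a single input sequence
```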
@@ -5,6 +5,14 @@
## Tokenizer[[tokenizers.Tokenizer]]

[[autodoc]] tokenizers.Tokenizer
    - all
    - decoder
    - model
    - normalizer
    - padding
    - post_processor
    - pre_tokenizer
    - truncation
</python>

<rust>

The Rust API Reference is available directly on the [Docs.rs](https://docs.rs/tokenizers/latest/tokenizers/) website.
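Several of the `Tokenizer` members above are read-only views of configuration, not methods; `padding` and `truncation`, for instance, are `None` until the corresponding `enable_*` call. A minimal sketch, again assuming the `tokenizers` package is installed:

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel

tokenizer = Tokenizer(WordLevel({"[UNK]": 0}, unk_token="[UNK]"))

print(tokenizer.padding)     # None until enable_padding() is called
print(tokenizer.truncation)  # None until enable_truncation() is called

tokenizer.enable_truncation(max_length=4)
print(tokenizer.truncation)  # now a dict describing the strategy
```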