mirror of
https://github.com/mii443/tokenizers.git
synced 2025-08-22 16:25:30 +00:00
Update the CHANGELOG.
@@ -4,6 +4,10 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.11.5]
- [#895] Build `python 3.10` wheels.
## [0.11.4]
- [#884] Fixing bad deserialization following inclusion of a default for Punctuation
@@ -351,6 +355,7 @@ delimiter (Works like `.split(delimiter)`)
- Fix a bug that was causing crashes in Python 3.5
[#895]: https://github.com/huggingface/tokenizers/pull/895
[#884]: https://github.com/huggingface/tokenizers/pull/884
[#882]: https://github.com/huggingface/tokenizers/pull/882
[#868]: https://github.com/huggingface/tokenizers/pull/868
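The second hunk's context line describes the delimiter pre-tokenizer as working "like `.split(delimiter)`". As a minimal, purely illustrative sketch of that splitting behavior in plain Python (the real tokenizers pre-tokenizer also tracks character offsets, which this sketch omits):

```python
# Sketch of delimiter-based splitting, mirroring Python's str.split
# as the changelog's context line describes. `pretokenize` is a
# hypothetical helper name, not part of the tokenizers API.
def pretokenize(text: str, delimiter: str) -> list:
    """Split `text` on `delimiter`, exactly like text.split(delimiter)."""
    return text.split(delimiter)

print(pretokenize("the quick brown fox", " "))
# → ['the', 'quick', 'brown', 'fox']
```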