mirror of
https://github.com/mii443/tokenizers.git
synced 2025-08-22 16:25:30 +00:00
Preparing for 0.11.0 Re-release. (#856)
* Starting from master again. CI fixes along the way:
  - Upgrade libssl everywhere on quay (the "extra" job is Ubuntu-based, running quay in a container).
  - Make only the extra job run, and attempt to fix the SSL update; use newer OpenSSL versions and pass `-y`.
  - Use checkout@v2 and remove `-` from the environment name.
  - Debug the conda release: attempt to use the `base` env; Python 3.7 requires `activate-environment: true`.
  - macOS and Windows don't run on manylinux; remove yum on those platforms.
  - Miniconda no longer seems to accept manylinux2014, so attempt a different approach for manylinux + conda: use wget, fix an extra bracket, execute $filename.
  - Activate the env on every step that requires it (including version extraction); install openssl-devel.
  - Retest all workflows; manylinux2010 requires checkout@v1; run on tag for extra and conda again.
* Putting back into deploy state.
* Adding links in CHANGELOG.
* Remove clippy from changelog.
@@ -6,10 +6,18 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [0.11.0]

### Fixed

- [#585] Conda version should now work on old CentOS
- [#844] Fixing interaction between `is_pretokenized` and `trim_offsets`.
- [#851] Doc links
### Added

- [#657]: Add SplitDelimiterBehavior customization to Punctuation constructor
- [#845]: Documentation for `Decoders`.
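As a rough illustration of what [#657] exposes: the tokenizers crate's `SplitDelimiterBehavior` controls what happens to the delimiter itself when a pre-tokenizer splits on it. The sketch below is a hypothetical, pure-Python re-implementation of the common variants (it is not the tokenizers API; the function name and single-character-delimiter restriction are assumptions for illustration):

```python
def split_with_behavior(text, delim, behavior):
    """Split `text` on the single-character `delim`, keeping or dropping
    the delimiter according to a SplitDelimiterBehavior-like mode."""
    pieces = []
    current = ""
    for ch in text:
        if ch != delim:
            current += ch
            continue
        if behavior == "removed":
            # Delimiter is dropped entirely.
            if current:
                pieces.append(current)
            current = ""
        elif behavior == "isolated":
            # Delimiter becomes its own piece.
            if current:
                pieces.append(current)
            pieces.append(ch)
            current = ""
        elif behavior == "merged_with_previous":
            # Delimiter is appended to the piece before it.
            pieces.append(current + ch)
            current = ""
        elif behavior == "merged_with_next":
            # Delimiter starts the following piece.
            if current:
                pieces.append(current)
            current = ch
    if current:
        pieces.append(current)
    return pieces
```

For example, splitting `"a.b.c"` on `"."` yields `["a", ".", "b", ".", "c"]` with `"isolated"` but `["a.", "b.", "c"]` with `"merged_with_previous"`.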
### Changed

- [#850]: Added a feature gate to enable disabling `http` features
- [#718]: Fix `WordLevel` tokenizer determinism during training
- [#762]: Add a way to specify the unknown token in `SentencePieceUnigramTokenizer`
- [#770]: Improved documentation for `UnigramTrainer`
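Entry [#850] describes a Cargo feature gate. A hedged sketch of how a downstream crate might opt out, assuming the feature is called `http` and is in the crate's default feature set (both assumptions; check the crate's `Cargo.toml` for the actual names):

```toml
# Hypothetical downstream Cargo.toml fragment: depend on tokenizers
# without its default features, so the (assumed) `http` feature is off.
[dependencies]
tokenizers = { version = "0.11.0", default-features = false }
```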
@@ -333,6 +341,11 @@ delimiter (Works like `.split(delimiter)`)

- Fix a bug that was causing crashes in Python 3.5

[#850]: https://github.com/huggingface/tokenizers/pull/850
[#844]: https://github.com/huggingface/tokenizers/pull/844
[#845]: https://github.com/huggingface/tokenizers/pull/845
[#851]: https://github.com/huggingface/tokenizers/pull/851
[#585]: https://github.com/huggingface/tokenizers/pull/585
[#793]: https://github.com/huggingface/tokenizers/pull/793
[#780]: https://github.com/huggingface/tokenizers/pull/780
[#770]: https://github.com/huggingface/tokenizers/pull/770