Requirements

To generate the documentation, you need a Python environment with the following packages installed:

pip install sphinx sphinx_rtd_theme setuptools_rust
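
For example, a minimal sketch using a fresh virtual environment (the environment name .venv is an arbitrary choice):

# create and activate an isolated environment, then install the documentation tooling
python -m venv .venv
source .venv/bin/activate
pip install sphinx sphinx_rtd_theme setuptools_rust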

The tokenizers library must also be installed in this same environment, so that Sphinx can generate the API Reference and all links properly. If you want to preview the documentation with local modifications to the Python bindings, make sure you build them from source.
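
If you need to build the Python bindings from source, one possible way, assuming the bindings live under bindings/python as in the tokenizers repository and a Rust toolchain is installed, is:

# build and install the Python bindings in editable mode (requires Rust)
cd bindings/python
pip install -e .
cd ../..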

Building the documentation

Once everything is set up, you can build the documentation for all the languages at once using the following command in the /docs folder:

make html_all
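
To check the result, you can serve the generated HTML with Python's built-in server. The exact output folder depends on the Makefile; build/html below is an assumption based on the common Sphinx default:

# serve the built docs locally at http://localhost:8000 (output path is an assumption)
python -m http.server --directory build/html 8000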

If you want to build only for a specific language, you can use:

make html O="-t python"

(replacing python with the target language: rust, node, or python)
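
For example, to build only the Rust documentation:

make html O="-t rust"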

NOTE

If you are making structural changes to the documentation, it is recommended to clean the build directory before rebuilding:

make clean && make html_all