We use 19.10b0 not 20 here...

Nicolas Patry
2020-09-23 11:58:15 +02:00
parent 4705fa8a00
commit 9672995a56
9 changed files with 16 additions and 50 deletions


@@ -32,10 +32,7 @@ if not files:
     # Initialize an empty tokenizer
     tokenizer = BertWordPieceTokenizer(
-        clean_text=True,
-        handle_chinese_chars=True,
-        strip_accents=True,
-        lowercase=True,
+        clean_text=True, handle_chinese_chars=True, strip_accents=True, lowercase=True,
    )
     # And then train
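
The call being reformatted above can be exercised end to end. A minimal sketch assuming the Hugging Face `tokenizers` package; the corpus contents, file name, and `vocab_size` are illustrative and not taken from the commit:

```python
from tokenizers import BertWordPieceTokenizer

# Tiny throwaway corpus so the sketch is runnable end to end (illustrative only).
with open("corpus.txt", "w") as f:
    f.write("hello world\n" * 100)

# Initialize an empty tokenizer with the same options as in the diff.
tokenizer = BertWordPieceTokenizer(
    clean_text=True, handle_chinese_chars=True, strip_accents=True, lowercase=True,
)

# And then train a WordPiece vocabulary from the text file
# (vocab_size here is an assumption, not a value from the commit).
tokenizer.train(files=["corpus.txt"], vocab_size=100)

print(tokenizer.encode("hello world").tokens)
```

Note that the diff only changes formatting (the four keyword arguments collapsed onto one line), not behavior; the commit message suggests the code is kept in the style produced by formatter version 19.10b0 rather than 20.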