Remove Tokenizer::normalize

This is a legacy function that no longer really makes sense and has become difficult to maintain, so we remove it.
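
For code that previously called Tokenizer.normalize, the same result is still reachable through the normalizer attached to the tokenizer. A minimal sketch, assuming a version of the Python bindings where Tokenizer.normalizer exposes a Normalizer with a normalize_str method (none of the names below come from this commit):

from tokenizers import Tokenizer
from tokenizers.models import WordLevel
from tokenizers.normalizers import Sequence, NFD, Lowercase, StripAccents

# A toy tokenizer with an explicit normalizer attached.
tokenizer = Tokenizer(WordLevel({"[UNK]": 0, "hello": 1, "world": 2}, unk_token="[UNK]"))
tokenizer.normalizer = Sequence([NFD(), Lowercase(), StripAccents()])

# Instead of the removed tokenizer.normalize(sequence),
# call the attached normalizer directly.
print(tokenizer.normalizer.normalize_str("Héllo, World!"))
# -> "hello, world!"

The explicit normalizer assignment is only for the sketch; a tokenizer loaded from a saved configuration already has its normalizer set, so the normalize_str call is all that replaces the removed method.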
Author: Anthony MOI
Date: 2020-08-18 17:52:25 -04:00
Committed by: Anthony MOI
Parent: 18e3799b1d
Commit: 504d8c85d8
8 changed files with 25 additions and 104 deletions


@@ -459,17 +459,6 @@ class Tokenizer:
             if the padding is enabled.
         """
         pass
-    def normalize(self, sequence: str) -> str:
-        """ Normalize the given sequence
-
-        Args:
-            sequence: str:
-                The sequence to normalize
-
-        Returns:
-            The normalized string
-        """
-        pass
     def encode(
         self,
         sequence: InputSequence,