bn:17765289n
Noun Concept
EN
Lexical tokenization is the conversion of a text into meaningful lexical tokens belonging to categories defined by a "lexer" program. (Wikipedia)
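To make this definition concrete, the following is a minimal sketch of such a lexer in Python. The token categories (NUMBER, IDENT, OP, SKIP) and their regular expressions are illustrative assumptions chosen for this example, not part of the entry.

import re

# Illustrative token categories for a tiny arithmetic language;
# the names and patterns below are assumptions, not a fixed standard.
TOKEN_SPEC = [
    ("NUMBER", r"\d+(?:\.\d+)?"),  # integer or decimal literal
    ("IDENT",  r"[A-Za-z_]\w*"),   # identifier
    ("OP",     r"[+\-*/=]"),       # single-character operator
    ("SKIP",   r"\s+"),            # whitespace, discarded
]

# One combined pattern with a named group per category.
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Convert a text into (category, lexeme) pairs."""
    for match in MASTER_RE.finditer(text):
        category = match.lastgroup
        if category != "SKIP":
            yield category, match.group()

# Example: list(tokenize("x = 40 + 2")) yields
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '40'), ('OP', '+'), ('NUMBER', '2')]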
Relations
PRODUCT OR MATERIAL PRODUCED