bn:13890904n
Noun Concept
Categories: Tasks of natural language processing, Computing stubs
EN
tokenization · tokenisation
In lexical analysis, tokenization is the process of breaking up a stream of text into words, phrases, symbols, or other meaningful elements called tokens. (Wikipedia)
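A minimal sketch of what such a process can look like in practice, assuming a simple regular-expression scheme in Python; the function name `tokenize` and the pattern are illustrative only, not taken from any particular library:

```python
import re

def tokenize(text: str) -> list[str]:
    # Emit runs of word characters as tokens, and each remaining
    # non-whitespace character (punctuation, symbols) as its own token.
    # Deliberately simple: real tokenizers also handle contractions,
    # hyphenation, URLs, and language-specific rules.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization breaks text into tokens, doesn't it?"))
# ['Tokenization', 'breaks', 'text', 'into', 'tokens', ',', 'doesn', "'", 't', 'it', '?']
```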
Sources
Wikidata
Wikidata Alias