bn:03450085n
Noun Concept
EN
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. (Wikipedia)
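The substitution described above can be sketched as a minimal vault-based tokenizer. This is an illustrative assumption, not a standard implementation: the `TokenVault` class, its methods, and the sample card number are all hypothetical, and a real system would persist the vault in a hardened, access-controlled store.

```python
import secrets


class TokenVault:
    """Hypothetical sketch: maps random tokens to sensitive values."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive: str) -> str:
        # The token is random, so it carries no intrinsic or
        # exploitable relationship to the original data.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the vault can recover the value.
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # sample (non-real) card number
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Because the token is generated independently of the sensitive value, compromising a system that holds only tokens yields nothing usable; the mapping lives solely in the vault.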