Data Tokenization
Data tokenization is a process in which sensitive data is replaced by a non-sensitive placeholder value known as a token. The token is mapped back to the original sensitive data by an external tokenization system, which stores the mapping securely. Data can be tokenized and de-tokenized as often as needed, provided the caller has approved access to that system.
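To make the idea concrete, below is a minimal sketch of a tokenization flow in Python. It is an illustration only, not a production design: the TokenVault class, its in-memory dictionaries, and the use of secrets.token_hex are assumptions chosen for clarity, whereas a real tokenization system would keep the token-to-value mapping in a hardened, access-controlled store and enforce authorization on de-tokenization.

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault (hypothetical, not a real product API).

    Holds the mapping between tokens and the original sensitive values.
    """

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a non-sensitive token."""
        # Reuse the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random token with no mathematical relationship to the value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; in practice this requires approved access."""
        return self._token_to_value[token]


if __name__ == "__main__":
    vault = TokenVault()
    # Example sensitive value (a made-up card number).
    token = vault.tokenize("4111-1111-1111-1111")
    print("token:", token)                    # safe to store or pass downstream
    print("original:", vault.detokenize(token))  # only possible via the vault
```

Because the token is generated randomly rather than derived from the data, it reveals nothing about the original value on its own; only a lookup in the tokenization system can reverse it.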