Tokenization definition
Tokenization, in data security, is the substitution of a sensitive data element with a non-sensitive equivalent, called a token. The token has no intrinsic or exploitable value, which reduces the risk of a data breach. Banks use tokenization frequently, replacing bank account numbers with tokens that can represent the user in a database or provide a form of authentication during transactions. Tokenization protects your privacy because permanent identity numbers and other PII are replaced with valueless equivalents, so the permanent numbers are never stored during transactions and stay safe.
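The idea can be shown in a minimal sketch. This assumes an in-memory dictionary as the token vault; a real system would use a hardened, access-controlled data store, and the class and method names here are illustrative only.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to sensitive values."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, with no mathematical relation to the input,
        # so the token itself has no exploitable value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value.
        return self._token_to_value[token]

vault = TokenVault()
account = "DE89370400440532013000"  # sample IBAN
token = vault.tokenize(account)
assert token != account                    # the token reveals nothing
assert vault.detokenize(token) == account  # only the vault can reverse it
```

Note that, unlike encryption, the token cannot be reversed by any computation on the token alone; the mapping exists only inside the vault.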
See also: security token, data breach
Types of tokenization
- Front-end tokenization. This type of tokenization happens when a user creates a token as part of an online service and later uses it in transactions instead of the original identifier, such as a bank account number. It requires the user to have the technical knowledge and skills to understand why and how the token is created.
- Back-end tokenization. This type of tokenization happens when the token provider tokenizes identifiers before sharing them with other systems. For example, a bank can issue a user a token containing their signature or an identifier that authorizes transactions. Because this happens without user intervention, the user needs no technical skills or knowledge.
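The back-end flow can be sketched as follows. This is a simplified illustration, assuming a plain dictionary as the provider's internal vault; the function and field names are hypothetical, not a real banking API.

```python
import secrets

def tokenize_record(record: dict, vault: dict) -> dict:
    """Provider-side (back-end) tokenization: swap the account number
    for a token before the record is shared with another system."""
    token = secrets.token_hex(16)
    vault[token] = record["account_number"]  # mapping stays with the provider
    shared = dict(record)
    shared["account_number"] = token         # partner only ever sees the token
    return shared

vault = {}
original = {"name": "A. Customer", "account_number": "12345678"}
shared = tokenize_record(original, vault)
assert shared["account_number"] != "12345678"
assert vault[shared["account_number"]] == "12345678"
```

The user never participates: the provider performs the substitution automatically, and the real account number never leaves its systems.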