Tokenization: Securing data with random numbers

With more people using cards for purchases, added security is a must for card issuers. Tokenization is an innovative system that makes the data in credit and debit cards more secure, offering peace of mind to both the cardholder and the card issuer.


Random number security

Tokenization replaces the sensitive data on your card with a string of random numbers – the token – that can only be used by the system that created it. Tokens are simply reference codes that link back to a secure point-to-point encryption (P2PE) transaction, usually a card payment. Because the token is generated by secure methods, it is useless to anyone who does not control the tokenization system, and that system must itself be secured for the storage and encryption/decryption of the sensitive data.
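
As a rough illustration, here is a minimal sketch in Python of how a token vault could work. It is purely hypothetical: the class name, token format and in-memory dictionary are invented for this example, and a real tokenization system would keep the mapping in a hardened, access-controlled store rather than in process memory.

import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens back to card numbers."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, so it has no mathematical relationship to the
        # card number; the only way back to the PAN is a lookup in this vault.
        token = "tok_" + secrets.token_hex(16)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. tok_9f2c... – safe to store or log
print(vault.detokenize(token))  # only the vault can recover the card number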

Replacing sensitive data with a token reduces its exposure and helps prevent accidental compromise or unauthorized access. Merchant terminals and retail card readers can use the token, instead of the actual card data, to process a refund for a customer or to research transactions in cases of fraud.
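
In the same illustrative spirit, the hypothetical refund flow below shows a merchant submitting only the token it stored from the original sale; the names and values are invented, and the lookup to the real card number is assumed to happen inside the processor's tokenization system.

# Processor-side mapping created at the time of the original sale.
processor_vault = {"tok_3b1f9a77": "4111111111111111"}

def refund_by_token(token: str, amount_cents: int) -> str:
    # The merchant never handles the card number; the processor resolves
    # the token internally and issues the refund against the real card.
    pan = processor_vault[token]
    return f"Refunded {amount_cents / 100:.2f} to card ending in {pan[-4:]}"

print(refund_by_token("tok_3b1f9a77", 1999))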

The main use of tokenization is for credit card transactions through a merchant terminal or card reader. The security of the token relies on the fact that it cannot be reversed to reveal the original sensitive data: because the token is random rather than derived from the card number, there is nothing in it to decrypt. Tokenization is also used in other secure systems for things like bank accounts, medical records, voter registration and other personally identifiable information (PII).

The original concept of tokens

Tokens have been used to replace money for centuries. Using a token reduces the risk of fraud and theft by keeping the actual currency out of high-value transactions. Notable examples are casino chips, fairground ride tokens, subway tokens and promissory notes from financial institutions. Such substitution methods have been used in the financial industry since the 1970s to prevent data from being used fraudulently.

The first use of tokens in the payment card industry was developed by the Shift4 Corporation and released during the 2005 Security Summit in Nevada, USA. It was designed to prevent the theft of PII and other sensitive data in storage. Tokenization is defined as: “the concept of using a non-decryptable piece of data to represent, by reference, sensitive or secret data. In payment card industry (PCI) context, tokens are used to reference cardholder data that is managed in a tokenization system, application or off-site secure facility.”

Combined security makes data safer

By combining these different security methods when making card transactions, cardholders and issuers can be assured of the safest and most secure transaction available. For full data protection, tokenization is combined with both P2PE and end-to-end encryption (E2EE), which ensure the secure transmission of the data to the tokenization system.
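
The sketch below outlines one way that combination might fit together, with the third-party Python cryptography package standing in for the encrypted channel; the key handling shown here is greatly simplified compared with a real P2PE or E2EE deployment, and the vault is again just an illustrative dictionary.

import secrets
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Key injected into the terminal ahead of time (simplified stand-in for P2PE).
terminal_key = Fernet.generate_key()
channel = Fernet(terminal_key)

# 1. The terminal encrypts the card number before it leaves the device.
ciphertext = channel.encrypt(b"4111111111111111")

# 2. The tokenization system decrypts it and immediately swaps it for a token.
pan = channel.decrypt(ciphertext).decode()
vault = {}
token = "tok_" + secrets.token_hex(16)
vault[token] = pan

# 3. Everything downstream (merchant records, receipts) sees only the token.
print(token)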

When used correctly, with the required systems in place, tokenization can make it significantly more difficult for an attacker to gain access to the sensitive data. It can also simplify compliance with the Payment Card Industry Data Security Standard (PCI DSS), since fewer systems store the sensitive data in its original form.

The PCI Council is in full support of using tokenization, in conjunction with P2PE and E2EE, to reduce the risk of a breach of sensitive data.

Source: https://en.wikipedia.org/wiki/Tokenization_(data_security)