The most recent PCI standards have introduced new complications for organizations handling digital payment systems. PCI Requirement 3 prohibits storing unsecured credit card information, and failing to comply can expose an organization to fines of $500,000 or more.

Meeting PCI Requirement 3 isn’t just a matter of avoiding regulatory sanctions. Organizations that satisfy the requirement are significantly less likely to experience security breaches. One survey found that only 32.7 percent of companies met Requirement 3 standards, yet those firms accounted for just 18.2 percent of security breaches.

The reasons behind the discrepancy are difficult to pinpoint. Bolstering security to comply with Requirement 3 could make it more difficult for hackers to penetrate the system. Meeting Requirement 3 could also be a deterrent for would-be cybercriminals. The most likely explanation is a combination of the two.

However, many companies fail to meet PCI Requirement 3. New tokenization standards are facilitating compliance and minimizing the risks of data theft. It is important to understand the basic code and architecture behind these tokenization services. Here is a quick overview.

Overview of web tokenizers

In order to understand the benefits of tokenizers, it is necessary to understand traditional encryption protocols. Prior to the development of tokenization, SAP relied entirely on application encryption, under which all data is encrypted when it is written to an SAP database and decrypted whenever it is read back.
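
To make the application-encryption model concrete, here is a minimal sketch in Node.js using the built-in crypto module. The encryptForDb and decryptFromDb names are placeholders invented for illustration, not part of any SAP API; the point is simply that the application itself must encrypt on every write and decrypt on every read.

// Minimal sketch of application-level encryption (Node.js built-in crypto).
const crypto = require('crypto');

const key = crypto.randomBytes(32); // in practice, loaded from a key store

function encryptForDb(plaintext) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  // The IV and auth tag are stored alongside the ciphertext so the value can be decrypted later.
  return { iv, tag: cipher.getAuthTag(), encrypted };
}

function decryptFromDb(record) {
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, record.iv);
  decipher.setAuthTag(record.tag);
  return Buffer.concat([decipher.update(record.encrypted), decipher.final()]).toString('utf8');
}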

Tokenization services eliminate the need for SAP to handle encryption directly. The tokenization script is stored on a third-party server, which is responsible for all encryption.
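
Conceptually, the exchange with such a service looks something like the following sketch. The endpoint URL and response shape here are hypothetical; every vendor defines its own API, but the pattern of sending the card number out and storing only the returned token is the same. (The fetch call assumes a browser or Node.js 18+.)

// Hypothetical flow: the card number is sent to the tokenization provider,
// and only the returned token is ever stored in SAP.
async function tokenizeCard(cardNumber) {
  const response = await fetch('https://tokenizer.example.com/v1/tokens', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ pan: cardNumber })
  });
  const { token } = await response.json(); // e.g. 'tok_4f9a...' — contains no card data
  return token; // this token, not the card number, is written to the SAP database
}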

‘Traditional encryption is outdated and unequipped to deliver the security solutions modern SAP platforms depend on,’ said Christine Lum, a senior analyst with FraudLabs Pro. ‘Meeting PCI compliance is infinitely easier with tokenization. I expect 80% of merchants subject to PCI requirements will use tokenization within the next five years.’

An SAP tokenizer takes a piece of sensitive data and separates it into multiple token segments. This is most commonly done with credit card information. Paymetric reports that there are a number of benefits to this approach.

‘This approach offers significant benefits over traditional database encryption. First, exposure to unencrypted credit card numbers is removed from the SAP application and database at all times, which boosts security. Secondly, cryptographic processing is completely removed from SAP servers, which enhances application and database performance.’

Sensitive credit card information is converted into encrypted tokens, which are stored in the SAP system. Inputs aren’t transferred directly to the field itself, so they cannot be easily decoded.
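
To illustrate, a stored record might look roughly like this; the field names and token format are invented for the example:

// Illustrative only: the SAP record holds a surrogate token, never the raw PAN.
const storedRecord = {
  customerId: '10042',
  cardToken: 'tok_0009_ABCDEF', // surrogate value; useless to an attacker on its own
  lastFour: '4242'              // non-sensitive display value
};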

Web tokenizers are frequently coded in C, but can also be written in JavaScript and other languages with C-like syntax. Here is an excerpt from a web tokenizer script provided by one of my colleagues on GitHub:

// Handler invoked whenever tokens are added or removed; a minimal stub is defined here
// so the excerpt runs on its own.
var fEventWriter = function (oEvent) {
  console.log("Token change: " + oEvent.getParameter("type"));
};

// Tokenizer holding twelve tokens, each with a display text and a unique key.
var oTokenizer1 = new sap.m.Tokenizer("editableTokenizer", {
  tokens: [
    new sap.m.Token({ text: "Token 1", key: "0001" }),
    new sap.m.Token({ text: "Token 2", key: "0002" }),
    new sap.m.Token({ text: "Token 3", key: "0003" }),
    new sap.m.Token({ text: "Token 4 - long text example", key: "0004" }),
    new sap.m.Token({ text: "Token 5", key: "0005" }),
    new sap.m.Token({ text: "Token 6", key: "0006" }),
    new sap.m.Token({ text: "Token 7", key: "0007" }),
    new sap.m.Token({ text: "Token 8", key: "0008" }),
    new sap.m.Token({ text: "Token 9 - ABCDEF", key: "0009" }),
    new sap.m.Token({ text: "Token 10 - ABCDEFGHIKL", key: "0010" }),
    new sap.m.Token({ text: "Token 11", key: "0011" }),
    new sap.m.Token({ text: "Token 12", key: "0012" })
  ],
  tokenChange: fEventWriter
});

This script populates the tokenizer with 12 tokens. It is possible to achieve greater levels of security, and a higher likelihood of meeting PCI Requirement 3 compliance, by using 20 or more tokens. This may not be necessary, though, because the existing code offers sufficient security for most SAP-based payment systems.
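
If a higher token count is desired, the token list does not need to be written out by hand. A loop using the same sap.m API as the excerpt above can generate any number of tokens; the sketch below builds a tokenizer with 20:

// Sketch: build a tokenizer with 20 generated tokens instead of a hand-written list.
var oTokenizer2 = new sap.m.Tokenizer("largeTokenizer");
for (var i = 1; i <= 20; i++) {
  var sKey = ("000" + i).slice(-4); // zero-padded key, e.g. "0001"
  oTokenizer2.addToken(new sap.m.Token({ text: "Token " + i, key: sKey }));
}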

There are typically at least six token variables in a script. Some of these tokens are editable, while others carry read-only permissions.
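
The editable flag on sap.m.Token controls this behavior. The sketch below, modeled on the earlier example, mixes an editable token with a read-only one:

// Sketch: one editable token and one read-only token in the same tokenizer.
var oMixedTokenizer = new sap.m.Tokenizer("mixedTokenizer", {
  tokens: [
    new sap.m.Token({ text: "Editable token", key: "0001" }), // editable defaults to true
    new sap.m.Token({ text: "Read-only token", key: "0002", editable: false }) // cannot be deleted by the user
  ]
});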

While the skeletal framework for the tokenization service is relatively straightforward, creating the infrastructure to handle it is much more complex. The organization would need either a Web Services Interface or an RFC to handle the protocols. Most organizations lack the in-house expertise to properly deploy such a service.
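
For the RFC route, an integration might look roughly like the sketch below, which calls into SAP from Node.js via SAP’s node-rfc connector. The function module Z_TOKENIZE_CARD and its parameters are hypothetical placeholders for whatever a vendor actually exposes:

// Hedged sketch of an RFC call using SAP's node-rfc connector.
// Z_TOKENIZE_CARD, IV_PAN and EV_TOKEN are hypothetical placeholders.
const { Client } = require('node-rfc');

async function tokenizeViaRfc(cardNumber) {
  const client = new Client({ dest: 'MME' }); // destination defined in sapnwrfc.ini
  await client.open();
  const result = await client.call('Z_TOKENIZE_CARD', { IV_PAN: cardNumber });
  await client.close();
  return result.EV_TOKEN; // only the token flows back into the application
}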

Even if an organization can develop its own tokenization service, doing so is not recommended. The environment may lack the diversity needed to provide adequate encryption: if the token encryption layers are too similar to other encryption code in the environment, the risk of a security breach rises.

These problems can all be addressed by using a third-party tokenization service. While a formal, industry-wide audit has yet to confirm my speculations, I suspect that brands using third-party tokenization will be more likely to meet PCI Requirement 3 compliance.

Hilary Welter of Hardware Retailing explains that there are additional benefits of tokenization, such as improved customer convenience.

‘It’s easier for repeat customers to pay online or by phone if their payment information is on file. But, retailers don’t want to store credit card data because it makes a business an attractive target for hackers and subjects the business to higher PCI standards. Business owners can use tokenization to store payment information securely at a third-party vendor. That way, a token is stored in the retailer’s system instead of sensitive card data. Tokenization can also be used in-store for customers to pay using a card on file,’ Welter states.