Tokenization: Definitions Matter in the Fight to Keep Payment Data Secure
July 18, 2016
Tokenization technology was introduced to the payments industry in 2005. It stems from the concept of arcade tokens, which only work within their designated arcades, as opposed to a quarter, which has universal value and can be used anywhere. Like the arcade token, tokenization exchanges a credit or debit card number for a token that has value only within specific parameters and locations. This helps secure post-authorization card data for long-term storage, which, at the time tokenization was introduced, was the point at which payment data was most vulnerable to compromise.
Adoption was rapid and far-reaching, to the point that tokenization is now considered a staple in the payments industry. Despite tokenization’s ubiquity, there has not been a standard established that mandates how tokenization is deployed — let alone what defines a token itself. EMVCo (the body that manages and maintains EMV specifications) and the Payment Card Industry Security Standards Council have both attempted to standardize tokenization over the past few years, but neither has gotten far enough to even define what the term should mean.
Definitions Are Important
Today, the word tokenization is being used far too broadly to describe a variety of payment security methods that perform different functions. “EMVCo tokenization” has become a source of much debate in the payments industry. It can refer both to consumer-based mobile payment tokenization (à la Apple Pay) and to card-based tokenization used by merchants. I even saw one article recently that discussed the “benefits” of point-to-point encryption as a tokenization solution — this comparison borders on blasphemy for those of us who know what tokenization truly is and the specific problem that it solves.
It goes beyond the question of semantics; such confusion is a threat to merchants because it may lead them astray from the very tokenization solutions they need to secure their business. Point-to-point encryption as a token? Absolutely not. Tokenization was specifically designed not to be encrypted data, because by definition, encrypted data is potentially decryptable.
The Difference Is Significant
To prevent data breaches, tokenization generates a globally unique, random, alphanumeric value that replaces payment card data after bank authorization, so the data stored in merchant systems has zero value outside of that environment. Tokenization works differently from encryption because each individual token is independently random, with no mathematical pattern to be unlocked. Tokens were designed never to maintain a one-to-one relationship with a card (although additional secure technologies were later built that allow tokenized merchants to still track card usage for analytics). This ensures that tokens aren’t predictable and cannot be reversed or decrypted. Tokens were also designed to be alphanumeric, meaning there are enough possible permutations that they need never repeat, even within the largest payment ecosystems.
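The mechanism described above can be sketched in a few lines. This is a minimal, hypothetical illustration, assuming a simple in-memory vault; a real vault is a hardened, access-controlled service on the provider's side, and the function names here are my own, not any vendor's API:

```python
import secrets
import string

# Hypothetical in-memory vault for illustration only.
# Maps token -> card number (PAN); the mapping never leaves the vault.
_VAULT = {}

_ALPHABET = string.ascii_letters + string.digits  # alphanumeric token space

def tokenize(pan: str, length: int = 16) -> str:
    """Replace a card number with a random alphanumeric token.

    The token is drawn from a cryptographically secure random source,
    so it has no mathematical relationship to the PAN: there is nothing
    to decrypt, only a vault lookup.
    """
    while True:
        token = "".join(secrets.choice(_ALPHABET) for _ in range(length))
        if token not in _VAULT:  # enforce uniqueness within the vault
            _VAULT[token] = pan
            return token

def detokenize(token: str) -> str:
    """Only the vault can map a token back to the original card number."""
    return _VAULT[token]
```

Note that calling `tokenize` twice on the same card produces two different tokens, which is exactly the "no one-to-one relationship" property the original definition requires.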
Tokenization, by its original definition, requires that a token be created to reference a single card number only for a single transaction, but not linked to the card as a constant. This differs from recent discussions that mistake tokenization for security features driven by mobile wallets and credit or debit cards. Even though they are referred to as tokenization, these services aren’t truly tokenization at all. Instead, they are consumer-based token services that seek to protect the cardholder — not the merchant. This is a noble undertaking, but slightly misguided, since having a token that references the same universally-accepted card number has done nothing more than create a new card number that is just as vulnerable to attack as the original data; this is not what tokenization was designed to do.
Why True Tokenization Matters
Business needs dictate that some merchants store transactional information after the initial payment is processed to allow for returns, incremental authorizations, recurring billing, etc. For example, hotels typically store customer data from the time an initial reservation is made until after the final checkout. Before tokenization, this meant keeping hundreds — if not thousands — of card numbers on file, which led to a higher risk of experiencing a breach. Tokenization was created to protect merchants in these scenarios and keep their sensitive data out of the reach of hackers. And it proved that sensitive, vulnerable card data doesn’t actually need to be stored, even in card-on-file environments.
Using a consumer-based token model that any retailer can accept gives that token universal value — and therefore universal risk. If one of these consumer-tokenization providers released their full list of tokens tomorrow, you can bet there would be an instant increase in fraud among merchants that accept them. Conversely, if a comprehensive list of the billions of transactions tokenized according to the original definition were released, hackers would be no closer to breaching a merchant’s systems.
The consumer-based tokens that Apple, PayPal, Samsung and other companies have been successfully assigning for years do offer a certain level of protection to cardholders at the point of purchase and have, thus far, been relatively effective in preventing mass-scale breaches. My point isn’t to tear them down; my contention with these technologies is simply that they should not be called tokenization. They are much closer to an encryption or cryptographic hash than they are to the arcade token of old.
Fortunately, merchants aren’t forced to choose one or the other. These technologies can and do work together to accomplish greater security. Tokenization — according to the original definition — tokenizes the consumer tokens that are received from a mobile wallet or other payment instrument.
Today’s digital consumers appreciate the ease and convenience of being able to pay with their mobile wallet. It’s a boost to business, and with tokenization and today’s other security technologies, merchants can make sure that the technology they are using isn’t putting their customers’ card data at risk.
About the Author
J.D. Oder II serves as Shift4’s CTO and SVP – R&D. J.D. is a Certified Network Engineer with more than 15 years of experience. He leads Shift4’s systems operations and development efforts as well as the security and compliance teams. J.D. is the architect of the DOLLARS ON THE NET® payment gateway solution. He is credited with introducing tokenization to the industry in 2005 and was also an early adopter/member of the PCI Security Standards Council.
Edited by Peter Bernstein