Streamlined Tokenization and Dynamic Data Masking

Now you can hide sensitive data in plain sight

Tokenization & Data Masking

Vormetric Tokenization with Dynamic Data Masking

Vormetric Tokenization with Dynamic Data Masking dramatically reduces the cost and effort required to comply with security policies and regulatory mandates like PCI DSS, while also making it simple to protect other sensitive data, including Personally Identifiable Information (PII). Dynamic Data Masking protects data in use, while tokenization protects data at rest. You can efficiently address your objectives for securing and anonymizing sensitive assets—whether they reside in data center, big data, container or cloud environments. Beyond performing data tokenization, the Tokenization Server centralizes all tokenization configuration, with a graphical user interface for creating templates for both tokenization and data masking. Simplicity results from the ability to tokenize or detokenize with dynamic data masking using as few as one line of code inserted into applications.

Efficiently Reduce PCI DSS Compliance Scope

Remove cardholder data from PCI DSS scope with minimal cost and effort, and significantly reduce the expense of complying with the industry standard with Vormetric Tokenization with Dynamic Data Masking.

Conveniently Protect Personally Identifiable Information

Modern IT architectures require both the use and the protection of personally identifiable information (PII). Tokenization with Dynamic Data Masking enables both—one line of code for protection and just one more for dynamically masked use of PII. Better still, protection requires no encryption key management on the part of the software developer.

Foster Innovation Without Introducing Risk

Tokenize data and maintain control and compliance when moving data to cloud or big data environments.

Scale Globally

Deploy the Vormetric Tokenization Server globally without concerns about token synchronization or performance. Server clustering enables easy scalability.

Tokenization Choices

Vormetric Tokenization combines the scalability and availability benefits of a vaultless solution with business-centric options for protecting data: both format-preserving and random tokenization. Format-preserving tokenization enables data protection without changing database schemas and offers irreversible tokens. Random tokenization offers high-performance, convenient data protection. Date-specific tokenization supporting the full range of international date formats helps ensure PII and transaction security.

Dynamic Data Masking

Administrators can establish policies to return an entire field tokenized or to dynamically mask parts of a field. For example, a security team could establish policies so that a user with customer service representative credentials receives a credit card number with only the last four digits visible, while a customer service supervisor can access the full credit card number in the clear. Looking for static data masking? Vormetric Tokenization offers it, but for the broad range of static data masking use cases, consider Vormetric Batch Data Transformation.
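The last-four-digits policy above can be sketched in a few lines. This is an illustrative example of the masking behavior, not the product's API; the function name and parameters are hypothetical.

```python
def mask_pan(pan: str, visible: int = 4, mask_char: str = "*") -> str:
    """Illustrative sketch: mask all but the last `visible` digits
    of a card number, as a dynamic-masking policy might return it
    to a customer service representative."""
    digits = pan.replace(" ", "").replace("-", "")
    return mask_char * (len(digits) - visible) + digits[-visible:]

# A representative would see the masked value; a supervisor's policy
# would return the full number in the clear instead.
print(mask_pan("4111 1111 1111 1111"))  # ************1111
```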

Multi-tenancy Support

The Tokenization Server supports multi-tenancy through tokenization groups, which are centrally managed and ensure that data tokenized by one group cannot be detokenized by another.

Centralized Tokenization Templates

At the core of the programming simplicity of Vormetric Tokenization is the tokenization template.


Because tokenization configuration is centralized, a tokenization request needs to contain only the tokenization group name, the template name, and the data to tokenize (along with a username, password, and the URL of the Tokenization Server). All tokenization work is then performed centrally on behalf of the software engineer.
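A request of that shape could be assembled as below. This is a minimal sketch assuming a JSON-over-HTTPS interface; the field names (`tokengroup`, `tokentemplate`, `data`) and endpoint path are illustrative assumptions, not the documented Vormetric API.

```python
import json

def build_tokenize_request(group: str, template: str, value: str) -> str:
    """Build the JSON body for a hypothetical tokenization call.
    The field names are illustrative; consult the product's API
    reference for the actual schema."""
    return json.dumps({
        "tokengroup": group,     # which tenant's tokens to use
        "tokentemplate": template,  # centrally defined template name
        "data": value,           # the value to tokenize
    })

# The body would be POSTed to the Tokenization Server's URL over
# HTTPS with the caller's username and password for authentication.
body = build_tokenize_request("cc-group", "ccn-template", "4111111111111111")
```

Note that the application supplies only names and data: which tokenization mechanism, format, and masking rules apply is resolved centrally from the template.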

Simple, Non-Disruptive Implementation

Tokenization mechanisms, methods and dynamic data masking rules are defined in a centralized, friendly graphical user interface (GUI). This dramatically reduces programming required for data protection. In addition, a range of format-preserving tokenization mechanisms are available to reduce requirements for changing the database schema. The Tokenization Server’s virtual appliance form factor enables fast scaling.

Tokenization Server Dashboard

Once deployed, the Vormetric Tokenization Server becomes a mission-critical part of the data security infrastructure. In support of that, the server presents an information-rich dashboard upon login, showing users and use of the server with exportable data.


Tokenization capabilities:

Alphanumeric format-preserving (FF1/FF3) or random tokenization up to 128 KB; date tokenization

Dynamic data masking capabilities:

Alpha/numeric, custom mask character

Data validation:

Luhn check

Deployment options:

Open Virtualization Format (.ova), International Organization for Standardization (.iso), Microsoft Hyper-V, Microsoft Azure Marketplace, Amazon Machine Image (.ami), Google Cloud Platform

Application integration:


Authentication integration:

Lightweight Directory Access Protocol (LDAP); Active Directory (AD); Client Certificate; OAuth2

  • Over 1 million tokenization transactions per second per tokenization server
  • Clustering for redundant, geographically dispersed, or scale-up tokenization servers

Solution Brief : Vormetric Tokenization with Dynamic Data Masking

Tokenization and data masking – anonymizing data for security and compliance. The Vormetric Data Security Platform features tokenization capabilities that can dramatically reduce the cost and effort associated with complying with security policies and regulatory mandates like the Payment Card Industry Data Security Standard (PCI DSS).


Research and WhitePaper : Vormetric Tokenization

For too many IT organizations, complying with the Payment Card Industry Data Security Standard (PCI DSS) and corporate security policies has been far too costly, complex, and time consuming. Now, Thales eSecurity offers a better way. Vormetric Tokenization with Dynamic Data Masking helps your security team address its compliance objectives while gaining breakthroughs in operational efficiency.


Research and WhitePaper : Evaluation of the Thales eSecurity Token Server

A Fortrex Qualified Security Assessor (QSA) evaluated the Thales eSecurity Token Server and determined that, when properly implemented and configured within a secured cardholder environment, it can reduce the number of systems included in the scope of a PCI DSS assessment. Fortrex also confirmed that the solution can be leveraged to tokenize other sensitive data within a corporate environment. The evaluation process is detailed in their white paper, Evaluation of the Thales eSecurity Token Server.


White Paper : Complying with PCI DSS 3.0 Encryption Rules

This white paper outlines how to use Vormetric Transparent Encryption to meet PCI DSS 3.0 Requirements with Data-at-Rest Encryption, Access Control and Data Access Audit Logs in traditional server, virtual, cloud and big data environments. The paper maps PCI DSS requirements 3, 7, 8, 9 and 10 that can be addressed with Vormetric Transparent Encryption.
