The PCI Europe Community meeting was held at the beautiful Marriott in Old Town Prague last week, and even though there were fewer attendees than at the meeting in Vegas, there was no shortage of intensity and well-researched questions.

One individual asked about mandating the use of Data Discovery tools to assist in the scoping of PCI assessments. Imagine, as a QSA, walking into a customer's environment, running a tool, and knowing EXACTLY the scope of the PCI assessment you need to perform! There would be little chance that you under- or over-scoped it, and all those little nooks and crannies that scare the bejeebus out of a QSA would be documented right there for review.

escargot / snail, by OliBac


If you are the entity being assessed, imagine being able to definitively scope out sections of your network by proving to your QSA that cardholder data does not exist there! No more endless expeditions down rabbit holes by your third-party assessor. Your security team will love it!

Regardless, there was interesting discussion by the TWG on this topic. Some forensic investigators are using data discovery tools after breaches to understand how far cardholder data propagates through an organization’s infrastructure, then using that data to perform their investigations more quickly. My guess is that these are home-grown tools or custom applications of existing commercial products. My experience says that very few products are up to the task of operating solely as a data discovery tool (and not as a component of a full DLP suite).

In order to know how to apply PCI DSS to your environment, you have to know where the data lives and where it goes.

The issue with most data discovery tools lies in the platforms they are written for, and in the poor state of most IT environments, which have evolved out of necessity and lack of budget rather than forethought. Only now am I seeing more and more companies look to ITIL or other standardized methods for administering large populations of workstations that are largely homogeneous. If you have a fairly consistent build and software process, find a tool that works in your environment and go.

Unix can be scripted; just get fancy with grep. Operating systems supporting Java may be able to take shortcuts and use existing Java-based data discovery tools. But moving outside of that homogeneous environment can cause major software headaches that will lead you to drink.
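As a rough sketch of the "get fancy with grep" approach: the one-liner below sweeps a directory tree for strings shaped like 16-digit primary account numbers. The path and the regex are illustrative assumptions, not a vetted discovery method, and a plain-text pattern match like this will miss encoded, compressed, or encrypted data while flagging plenty of false positives (serial numbers, order IDs) for manual review.

```shell
# Hypothetical sweep for PAN-like strings: four groups of four digits,
# optionally separated by spaces or dashes.
#   -r  recurse into subdirectories
#   -I  skip binary files
#   -n  show line numbers for follow-up review
#   -E  use extended regular expressions
grep -rInE '[0-9]{4}[ -]?[0-9]{4}[ -]?[0-9]{4}[ -]?[0-9]{4}' /var/data \
  | head -20   # sample the first hits rather than flooding the terminal
```

Anything this turns up still needs human validation (a Luhn check at minimum) before you treat it as in-scope cardholder data.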

Should the Council mandate their use? Today, no way. The technology is not nearly mature enough to have wide applicability across the entire spectrum of users that must comply with PCI DSS. Once it is mature, mandating data discovery tools would be an excellent move by the Council, as it would give assessors and their customers a chance to correctly scope the environment and avoid embarrassing breaches of unknown (or poorly documented) data stores.

ITIL: http://en.wikipedia.org/wiki/Information_Technology_Infrastructure_Library

This post originally appeared on BrandenWilliams.com.
