Hey look, it’s the first of ten posts with a detailed analysis on a PCI Requirement! While this one isn’t specifically a numbered requirement, I do find that sampling is troubling. I’ve written about it before, and we used to have all kinds of fun in the assessment process with sampling. From the reader:

Sampling methodology. The QSA has to validate that the sampled infrastructure is compliant with the requirements. However, time costs the client money, which they don't want to pay. They always go with the lowest price / proposal. How can the QSA convince the client that the sampling methodology used is aligned with the RoC reporting instructions? How can one QSAC propose 30 days to complete a final assessment and the next QSAC only 20 days for the same environment?

Folderol_Follies_011, by basurablancaphoto

Sampling is something that has been debated ever since it first showed up in the CISP and SDP programs. In the old days, the training and documentation described something called selective sampling. I am sure you can find tons of definitions for this term, but the way I like to describe it is this: sample enough systems that you have enough feel-goodery to call them representative of the population. Many things influence this, including the economics of the assessment (a lower price can mean fewer systems sampled), the personal experience of the assessor, and frankly, how they feel that particular day. I have seen some QSAs look at one or two systems and call them representative of a population of 1,000, some ignore the population altogether, some insist on a fixed percentage, and others want to look at ten systems, on average, to reach their conclusion. Obviously this is flawed, and that was the subject of the above-linked blog post.

PCI DSS 2.0 says that there are three things each QSA must do to explain their methodology each time sampling is used. Those are:

  1. Describe the rationale behind the technique used and the sample size.
  2. Validate the standardized PCI DSS processes and controls used to determine sample size (More details and flowchart contained in Appendix D). Essentially this means that if the company has controls in place to ensure consistency across the population, the sample size can be smaller.
  3. Describe rationale for the sample size chosen and how it represents the larger population.

So now instead of requiring one particular kind of sampling, it's up to the assessor to figure out which approach they want to use. This is one of many reasons why you could be seeing the variance in your quotes, but realize that some QSAs may have processes that make their assessment more efficient than others, thus passing those savings on to you.

But what if we wanted to actually borrow from the audit world and get something a bit more defensible? Statistically valid sampling has four elements that can be used to build a sample size for a given population. First, the size of the population itself clearly drives the sample size. Second, the confidence level the QSA wishes to have that the sample matches the population. Here's where you can get into some really interesting numbers: say QSA 1 is OK with 90% and QSA 2 requires 95% confidence. There will be a significant difference in sample size between those two. Third, the margin of error the QSA is willing to accept. And fourth, the expected variability in the population (the conservative assumption of maximum variability produces the largest sample). Combine all of these together and you end up with what your sample size should be. There are tables where you can simply look these values up and see the sample size, but the shocking thing you will find is how large those sample sizes become. Sampling 300 systems out of a population of 1,000 seems insurmountable in some cases! And as the populations get smaller, you are essentially reviewing upwards of 50% of them.
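If you'd rather not dig through a lookup table, the standard formula behind those tables is easy to compute. Here's a minimal sketch using the common sample-size formula for a proportion (maximum variability, p = 0.5) with a finite population correction; the function name and the 5% margin of error are my own choices for illustration:

```python
import math

# Two-tailed z-scores for common confidence levels.
Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def sample_size(population, confidence=0.95, margin=0.05, p=0.5):
    """Sample size for a proportion, with finite population correction.

    p=0.5 is the most conservative (largest-sample) assumption
    about variability in the population.
    """
    z = Z[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)       # finite population correction
    return math.ceil(n)

print(sample_size(1000, confidence=0.95))  # 278 -- roughly the "300 of 1,000"
print(sample_size(1000, confidence=0.90))  # 214 -- 90% vs 95% matters a lot
print(sample_size(100, confidence=0.95))   # 80  -- small populations approach 100%
```

Note how dropping from 95% to 90% confidence cuts dozens of systems from the sample, and how a population of 100 demands reviewing 80 of them, which is exactly why nobody mandates statistically valid sampling for a PCI assessment.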

Bad-Boys, by davidsonscott15

You can see why the Council doesn’t require statistically valid samples, but with the levels of IT automation that companies are deploying in their infrastructure, it isn’t as big of an issue to have someone review a larger number of systems. Chances are, if you do PCI DSS you probably also have IT auditors around checking for compliance with other areas or controls. If this is the case, ask them what they use for their sampling methodology. You will learn a ton about the assumptions they use, and you could carry those same assumptions into your sampling for PCI DSS. Just don’t get shocked when you are sampling fifty systems instead of four.

How should a sample size be determined? Certainly it's up to your QSA, but here's what the Council says. First, if you have good IT controls and good IT automation (or similar equivalents in the physical world for those samples), you should choose a small sample and validate that the systems are consistent and compliant. If at any point either of those is not true, the sample size should increase until you can determine whether you have a massive compliance issue or a massive consistency issue. Experienced QSAs who understand the underlying technologies driving the environment will be able to size things up efficiently, whereas new QSAs who don't know a PIX from an ASA will need more time.
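The start-small-then-expand approach above can be sketched in a few lines. This is a hypothetical illustration, not anything the Council publishes: the function, the doubling factor, and the `is_compliant` check are all assumptions made up for this example.

```python
import random

def sample_with_expansion(systems, initial_size, is_compliant, grow_factor=2):
    """Start with a small sample; each failure widens the net.

    Reviews systems one at a time. Any non-compliant (or inconsistent)
    system grows the target sample size, up to the full population.
    Returns (reviewed, failures).
    """
    remaining = list(systems)
    random.shuffle(remaining)
    size = min(initial_size, len(remaining))
    reviewed, failures = [], []
    while remaining and len(reviewed) < size:
        system = remaining.pop()
        reviewed.append(system)
        if not is_compliant(system):
            failures.append(system)
            # An issue found: expand the sample, capped at the population.
            size = min(size * grow_factor, len(systems))
    return reviewed, failures
```

With good controls and a clean population, this stops after the initial handful; the moment inconsistencies show up, it keeps pulling systems until the assessor can tell whether the problem is isolated or systemic.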

To more directly answer your question: how do you convince the client that the higher price is better? This is one of those free-market issues that will always exist. The reality is that merchants and service providers have pushed the price of PCI assessments down because of the economic environment coupled with the large supply of QSAs. Procurement departments purchasing solely on price will never understand anything more than "How much $$$ to get me the piece of paper that keeps the fines off my back?" I've successfully sold around this numerous times, because these types of services shouldn't be bought on a cost-leadership strategy. Good PCI assessments are differentiated, meaning you get lots of value for your money, and there is plenty of opportunity for automation to reduce costs.

Thanks for the question! Hopefully this analysis served its purpose. If you have your own requirement you want analyzed, go to this post and add a comment!

This post originally appeared on BrandenWilliams.com.