Subjectivity, Scrutiny, and Simpletons


By CWNP On 02/09/2010

Every few months, we see press from vendors pointing to tests that show why they’re the best. Best throughput, best coverage, best capacity, best multimedia performance, best whatever. Sometimes the test is done by a contracted third party, and other times we’re graced with simple marketing tests conducted by the “best” vendor itself. Before I go any further, let me be loud and clear that this article has nothing to do with the latest two or three tests that have hit the news. It is not aimed at any specific vendor, nor is it intended to say how a product should or shouldn’t be marketed, or even that a given test is or isn’t reliable. This article is here to say how, IMHO, a competitive test should be conducted, whether by a vendor, a biased (or unbiased) third party, or anyone else.

I have a few personal qualms with almost every competitive Wi-Fi test I’ve ever seen. On the personal side, I’m a bit of a perfectionist, so I expect testers to do their due diligence to represent the most objective testing scenario possible. I seriously doubt that objectivity is ever the top priority when money is involved. The problem is that vendors who do the testing themselves are biased by definition; it’s hard to imagine they would run a test and then release the data if they didn’t come out on top. They could readily run 10 different tests against the top 10 competing vendors and release only the data that shows them favorably, and only against those vendors with whom they compare favorably. Or worse, they could build non-commercial code for their own products, use unreliable or poorly performing code on competitors’ products, or manipulate the test in many other ways that aren’t published. I’m not suggesting that this is done, just that it could be. Who would know?

Vendors are smart, so when they want a performance review to pass vendor-neutral scrutiny, they hire a third party to do the testing for them under the guise of objectivity. Again, the problem is that the third party was paid to do the test for that specific vendor. These aren’t so much sponsored neutral tests as they are paid third-party marketing. We often call these testers 'hired guns.'

Now, on the other hand, what third party is going to go through extensive, exhausting testing to get verifiable data if there’s no money on tap? Even if one did, we all know that vendors who don’t fare well will shout 'malfeasance' (or perhaps something more normal) from the rooftops.

If you’ve ever read some classic philosophy, you might remember that Socrates was always poking holes in other philosophies and never positing a philosophy of his own. I don’t want to be a Socrates here, so here’s a test approach that would pass muster for me. There are a few requisite test reporting procedures that lend significant credibility to any test.

1. Equal representation. Each vendor’s system must be well represented. “Well represented” means that the vendor (or a reliable representative, such as a qualified reseller) must select:

a) the best hardware for the given test scenario

b) the best software for the given test scenario

c) the best configuration for the given test scenario

These three pieces of the puzzle are important because they are the common points of contention after the results are in. Vendors are quick to point out that the wrong hardware was implemented, a less-than-best code revision was used, or the configuration wasn’t applied properly. There are problems aplenty when the tester selects these criteria haphazardly. Test the best of the best from each vendor.

2. Full disclosure. For the test to hold water, everything must be disclosed to the scrutinous eye of the intended audience. This means publishing the precise testing methodology, including (at a minimum) the items below; a sketch of what such a disclosure might look like follows the list:

a) version and config of test software and hardware (e.g., Veriwave or Chariot)

b) exact scripts used for testing

c) vendor hardware and software versions (which must be publicly available!)

d) client adapter types, driver, and utility software, etc.
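
To make full disclosure concrete, here is a minimal sketch of what a machine-readable disclosure manifest might look like, published alongside the results. Every product name, version, and file path below is hypothetical, invented purely for illustration; the point is simply that each item in the list above lands in one reproducible, publishable artifact.

    import json

    # Hypothetical disclosure manifest for a competitive Wi-Fi test.
    # All values are illustrative placeholders, not real test data.
    manifest = {
        "test_equipment": {
            "traffic_generator": "VeriWave WT90",  # hypothetical model
            "software_version": "4.2.1",
            "config_file": "configs/wt90_baseline.cfg",
        },
        "test_scripts": [  # the exact scripts used, verbatim
            "scripts/throughput_udp_downlink.py",
            "scripts/roaming_latency.py",
        ],
        "vendors": [
            {
                "name": "Vendor A",
                "ap_hardware": "AP-1234",
                "firmware": "6.5.0.3 (public GA release)",  # publicly available
                "config_file": "configs/vendor_a_best_practice.cfg",
            },
        ],
        "clients": [
            {
                "adapter": "Intel WiFi Link 5300",
                "driver": "13.1.1.1",
                "utility": "native OS supplicant",
            },
        ],
    }

    # Publishing the manifest with the results lets anyone reproduce,
    # or dispute, the exact test setup.
    print(json.dumps(manifest, indent=2))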

3. Test validity must be demonstrable. I studied psychology (with an emphasis in psychometrics) in college, so we spent a lot of time criticizing test methods. I found a nice site that explains some testing basics: http://bit.ly/ac2xRQ. As a quick summary (a small sketch of how repeatability can be demonstrated follows the list):

a) The test must be repeatable (reliability)

b) The test must measure what it purports to measure (validity)

c) The test must be relevant
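
Repeatability (3a) in particular is easy to demonstrate with numbers. Here is a minimal sketch, assuming throughput is sampled over several independent runs of the same published script: report the mean with a confidence interval instead of a single best run. The sample values and the normal-approximation interval are my own illustrative assumptions, not a prescribed method.

    import statistics

    # Hypothetical throughput samples (Mbps) from eight independent
    # runs of the same test script; real data would come from the
    # published scripts and configurations.
    runs = [147.2, 151.8, 149.5, 146.9, 150.3, 148.7, 152.1, 149.0]

    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)  # sample standard deviation
    n = len(runs)

    # Rough 95% confidence interval via the normal approximation
    # (a t-interval would be more defensible for small n).
    margin = 1.96 * stdev / n ** 0.5

    print(f"mean throughput: {mean:.1f} Mbps")
    print(f"95% CI: [{mean - margin:.1f}, {mean + margin:.1f}] Mbps")

    # If this interval is wide relative to the gap between vendors,
    # the 'winner' may not be distinguishable from noise, and the
    # test cannot reliably support the marketing claim.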

4. Public availability. A public forum should be provided to uphold the test’s veracity. Vendors will surely come around after the results are published claiming there was a problem with the test. If they can do so in a public forum where good debate can occur, the claim can either be upheld and the test redone, or dismissed, provided an intelligent discussion has taken place.

5. Audit. A neutral group (not beholden to a specific vendor, financially or otherwise) should perform the test, and a trusted auditor should validate the test.

I’ll be the first to admit that these five criteria are stiff and difficult to deliver. Nonetheless, until these things are provided along with a test, I’m not particularly comfortable believing what I see. Call me a cynic, a deconstructionist, or whatever else is fitting, but my perfectionism makes me think that a high-quality, objective comparison can and should be done if the data is to be honored. I may be way over the top here…after all, competitive testing is a marketing effort. Nonetheless, I’ll stick to my guns and remain staunchly critical in hopes that a higher standard for competitive testing is embraced. Until then, subjective tests will be believed by simpletons lacking scrutiny, and I, though perhaps still a simpleton, will not be among them.



