Something +/- Whatever
By CWNP On 12/15/2008 - 15 Comments
Obviously the name gives you no context of the content, so I'll need to explain. I was a calibrator in the Army - that's right, a bona fide 35H who knew the Mouse Song cold. Don't worry, I won't sing it for you. If you're thinking, 'What's a calibrator?' this blog post is for you. A calibration specialist (the official title) is someone who verifies and documents the accuracy of test equipment. Occasionally (more often than most calibrators would like) equipment won't calibrate properly and has to be repaired. There are multiple specialty areas within the calibration profession, such as 1) DC & Low Frequency, 2) Physical Dimensional, and 3) Radio Frequency. I think you can guess what my specialty was. It was forced upon me because our 'RF' guy was reassigned. Holy crap, the books doubled as chairs in a pinch. Scared me to death.
Spectrum analyzers, signal generators, frequency counters, power meters - you name it, I calibrated it. Of course, back then I had little to no idea what any of this stuff was used for or why those 10's and 3's were so important. I got into data networking for several years, and when I tripped over my first WLAN (did you catch that subtle play on words?), the puzzle pieces from the math-rich RF courses and the multiple years of being chained to a bench in RF calibration labs all snapped together at once. I found my calling: a marriage of RF and data networking. Obviously the RF background has been tremendously useful to me, but I've never thought of the calibration part of it as being so useful...until now.
I was recently asked by a vendor to explain how we can know the accuracy of a Wi-Fi card's signal strength (RSSI) readings. If, for example, a Cisco 802.11a/g card's client utility says that it is reading -67 dBm, how do you know that it's accurate? Well, that's what calibration is all about. I was excited, first, to see that this vendor cared enough to engage such a significant issue, but even more that they wanted to get down into the details to a point where it became very useful to their customer base. Kudos. However, the question still remained: 'How do we prove the accuracy of radios?'
It all starts with a known-accurate reference. In the industry, it's called a 'standard.' You're likely familiar with that term, since the IEEE builds documents, called standards, around which equipment is built. Calibration standards are test equipment known to be at least 4 times more accurate than the unit under test (UUT). The standard is known to be accurate because it is also calibrated against something 4+ times more accurate than itself - also called a standard. The chain continues unbroken upward (we call this being 'traceable') until you reach THE reference standard (whatever that might be for the equipment type that you're currently calibrating).
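The 4:1 rule lends itself to a quick sanity check. Here's a minimal sketch of that traceability chain - all of the tolerance figures are made up for illustration, and a real calibration lab tracks far more than a single tolerance number per instrument:

```python
# Hypothetical traceability chain, most accurate at the top.
# Each entry is (name, tolerance in dB); a tighter tolerance means
# a more accurate instrument.
chain = [
    ("national reference standard", 0.01),
    ("primary lab standard",        0.05),
    ("bench spectrum analyzer",     0.25),
    ("Wi-Fi card under test",       1.00),
]

def chain_is_traceable(chain, ratio=4.0):
    """Return True if every standard is at least `ratio` times
    more accurate (smaller tolerance) than the unit below it."""
    return all(lower_tol >= ratio * upper_tol
               for (_, upper_tol), (_, lower_tol) in zip(chain, chain[1:]))

print(chain_is_traceable(chain))  # True for the example numbers above
```

Every link in the chain holds the 4:1 ratio, so the Wi-Fi card's reading can be traced all the way back to the reference standard.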
To test the accuracy of a Wi-Fi card for example, you 'could' do it like this:
1. Send your benchtop spectrum analyzer to a calibration lab to receive a manufacturer's specification calibration, which will ensure that the spectrum analyzer is as accurate as the manufacturer says it is. Let's suppose that it has a +/- 1 dB tolerance for all measurements that it makes. It will come with a calibration report showing all of the frequency/amplitude data points that were tested and how far each deviated from the standard against which it was tested. Let's suppose that it was -0.5 dB off at 2.437 GHz.
2. You put a minimal gain omni antenna on your spectrum analyzer, and tune it for 2.437 GHz. You have a fixed-amplitude signal source generate a 2.437 GHz signal at a given distance. You measure it with the spectrum analyzer and get a dBm value. That's now your reference value at that location.
3. You place your Wi-Fi card in exactly the same spot as the spectrum analyzer antenna and take a dBm reading. In this way, you know the difference between a known-good standard and your UUT. Suppose that the difference was significant. The next step would be to test another hopefully-identical UUT, but if the readings were approximately the same, you would know that, on average, that UUT type is off by a given number of dBm from a known-good reference. More or less, you're looking for a standard deviation among card types.
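The three steps above boil down to a short calculation. Here's a sketch with hypothetical numbers - the analyzer reading, its calibration-report correction, and the card readings are all invented for illustration:

```python
import statistics

# Step 1: the analyzer's calibration report says it reads 0.5 dB low
# at 2.437 GHz, so we add the correction back to its reading.
analyzer_reading_dbm = -40.0
analyzer_correction_db = +0.5
reference_dbm = analyzer_reading_dbm + analyzer_correction_db

# Step 3: RSSI readings from several supposedly identical Wi-Fi cards,
# each placed in exactly the same spot as the analyzer's antenna.
card_readings_dbm = [-44.0, -43.5, -44.5, -44.0]

# Offset of each card from the known-good reference.
offsets = [r - reference_dbm for r in card_readings_dbm]
mean_offset = statistics.mean(offsets)
spread = statistics.stdev(offsets)

print(f"reference: {reference_dbm} dBm")
print(f"mean card offset: {mean_offset:+.2f} dB, stdev {spread:.2f} dB")
```

The mean offset tells you how far that card type sits, on average, from the traceable reference; the standard deviation tells you how consistent the cards are with one another.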
If this difference between Standard and UUT is significant, think how it can affect site surveying. If we need -65 dBm cell boundaries and our measurement software says it's -67 dBm, then we suppose that we're finished. What if the actual number is -80 dBm? That's 13 dB, which is 1/20th (5%) of the power that we're supposed to have. Just the other day we had a problem here in the lab with an AP not putting out nearly enough power for what its software said it was doing. It was horrible, but who's to say that it wasn't accurate? No one thus far. This shows the need for calibration.
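The dB arithmetic here is worth making concrete. A dB difference converts to a linear power ratio as 10^(dB/10), so a 13 dB shortfall means roughly a 20:1 deficit:

```python
def db_to_power_ratio(db):
    """Convert a dB difference into a linear power ratio."""
    return 10 ** (db / 10)

# Reading -67 dBm when the actual level is -80 dBm is a 13 dB error,
# i.e. the actual power is about 1/20th (5%) of what you think you have.
ratio = db_to_power_ratio(-13)
print(f"{ratio:.3f}")  # about 0.050
```

The same one-liner explains the '10's and 3's' from my calibration days: 10 dB is a factor of 10, and 3 dB is very nearly a factor of 2.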
Some relevant questions are:
1. How accurate are Wi-Fi radios? Considering the number of Wi-Fi radio manufacturers, obviously this can vary. 'How much' is anyone's guess at this point in time. I hope the chipset vendors give these specs to their customers.
2. Should infrastructure equipment vendors, client utility vendors, and special software vendors (like analyzers) take care of all of this 'accuracy stuff' for us? Yes. I don't know about you, but I don't want to go through a test procedure to verify the accuracy of my software, client radio, or AP before I use it each time. I just want the software/firmware to give me the correct values. I want the vendor to make whatever adjustment is necessary, and then give me the final dBm number.
3. Do you think that inaccuracies have caused poor implementations in the past and even still today? YES! Accuracy isn't relative - it's an absolute science. Obviously there are tremendous variables in play. Antenna types, multipath, free space path loss, object penetration, etc. That's why all of this has remained relative and the accuracy unknown.
What if you, as an administrator, had a laptop indicating that the power at a given location is -67 dBm while a VoWiFi phone indicates that it is -80 dBm? How would you determine which is correct? I would ask the manufacturers for documentation explaining how they know their hardware and software is accurate.
The reason for the title is that in talking to this vendor, they noted differences between their 'standard' and their UUT of up to 20 dB. If someone asked you, 'How many apples do you have in your basket?' would you respond with something like 'A few, plus or minus a lot' or 'One, plus or minus 99'? Those are absurd answers, of course, and what the vendor observed is a huge differential. Do we need to do something about this? I certainly think so. We need accurate equipment and accurate test hardware/software. I bet most of us take this accuracy for granted every day.
The standard defines RSSI as follows: 'RSSI is intended to be used in a relative manner. Absolute accuracy of the RSSI reading is not specified.'
This is because this is a wireless environment. If we could cable everything and have known accuracies and losses, then it would be absolute instead of relative. Still, there are ways to make meaningful measurements (as described above) that eliminate most of the inaccuracy, with results far better than +/- 20 dB. I would hope that we would be dealing with tolerances no worse than +/- 1-2 dB from a traceable standard.
Anyone want to comment?