In one environment I've noticed a lot of outdoor wifi, outdoor mesh, and municipal wifi bleeding into the building. It's a steady trickle of traffic, mostly around -80 to -100 dBm. Many of these frames have extremely high duration values, ranging from 15 ms to 60 ms. The vast majority of the frames have CRC errors, so I'm not sure the duration value can even be trusted. However, if the duration is correct, will clients defer access when they see these corrupt frames? If so, it would seem this could cause large spikes in latency.
Sidenote: I am using WildPackets OmniPeek and their adapters for the packet captures. The adapters seem to be more sensitive than my laptop, so it's possible that my laptop isn't even seeing some of these packets due to lower receive sensitivity.
That's an interesting question.
I would bet that because the CRC is invalid, the frame is not fully processed (it's ignored). However, energy-detect mechanisms could delay transmission regardless of the CRC's validity. That said, it doesn't sound like most of the signals you are seeing are strong enough to do that indoors.
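The behavior described above can be sketched roughly as follows. This is a simplified, hypothetical model of a station's receive-path decision, not vendor firmware: the threshold constants are the commonly cited 802.11 OFDM CCA figures for a 20 MHz channel, and `handle_received_frame` is an illustrative helper, not a real API.

```python
# Hypothetical sketch of an 802.11 station's defer decision.
# Thresholds are the commonly cited OFDM values for a 20 MHz channel;
# real CCA behavior is PHY-, band-, and vendor-specific.

SIGNAL_DETECT_THRESHOLD = -82   # dBm: decodable preamble detected
ENERGY_DETECT_THRESHOLD = -62   # dBm: raw RF energy, no preamble needed

def handle_received_frame(rssi_dbm, preamble_decoded, fcs_valid, duration_us):
    """Return a description of how the station reacts to an incoming frame."""
    # Energy detect: strong enough RF energy defers the station even if
    # nothing can be demodulated, regardless of CRC.
    if rssi_dbm >= ENERGY_DETECT_THRESHOLD:
        return "defer: energy detect (physical carrier sense)"
    # Signal detect: a decodable preamble at or above -82 dBm holds the
    # medium busy for the length indicated in the PHY header.
    if preamble_decoded and rssi_dbm >= SIGNAL_DETECT_THRESHOLD:
        if fcs_valid:
            # Only a frame that passes the FCS check updates the NAV
            # (virtual carrier sense) from the MAC Duration field.
            return f"defer: NAV set to {duration_us} us (virtual carrier sense)"
        # A corrupt frame is discarded by the MAC; its Duration field
        # is never trusted, so the NAV is not updated.
        return "frame discarded: FCS error, NAV not updated"
    return "medium idle: frame below detection thresholds"
```

Under this model, your corrupt -80 to -100 dBm frames fall into the last two branches: too weak for energy detect, and their Duration fields never reach the NAV because the FCS check fails first.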
But I could also imagine that, based only on the duration value, a WLAN card could run off to do something else. Some type of performance-analyzer card/firmware might use this technique to its advantage, where the only "data" it cared about was the duration field. It could have already measured the power level from the preamble.
I am always surprised by the sensitivity variation in client cards. Even dongles from supposedly reputable companies are often poor. I would guess that many of the difficulties actually result from the antennas, rather than the chips inside the dongles. Making dongles smaller and smaller is not conducive to making effective antennas.
I have not used the WildPackets-branded dongle, but our AirPcap adapter seems to run slow. I find the Linksys AE1000 and AE6000 adapters to work fine. In the office I stick them in an older Zyxel USB cradle to get them up in the air a little.
I was fairly certain I had read something in the CWNA book that led me to believe the STA would effectively ignore the frame. I'm struggling to find that passage now after searching my book. Your response is a bit encouraging, and leads me to believe I'm not imagining things. I did consider the energy-detect mechanism, but from the captures I've done, we're well below that threshold.
I'm continually frustrated by the differences in signal strength between devices. During surveys I always do spot checks with the laptop's built-in adapter to see what the difference is between it and the survey adapter. Otherwise, my survey could be off quite a bit. Argh!
Remember that EVERY manufacturer's RSSI values will be different.
This is still the case for devices using the same chip. Just because two devices have the same radio does not mean they have the same RF amplifier circuit, firmware, or antenna.
The only manufacturer I trust, unless I test it myself, is Cisco, on their APs. Even then, they have to make adjustments at both ends of the scale.
We have tested MetaGeek's numbers against calibrated gear and have gotten good, and perhaps more importantly, consistent results.
This is a good question. One which I plan on thinking about and researching more.
My thought on this is that it's very possible your stations and APs are contending with the transmitting station's signal.
All of your stations and APs must perform a clear channel assessment (CCA, covering both physical and virtual carrier sense) before they get a transmit opportunity. If they hear another station's transmission and can demodulate the frame, they should update their NAV timer.
The whole purpose is to protect all stations and give every station the opportunity to transmit and receive at some point.
Simply put: when one station speaks, all the others have to be quiet, IF they can hear the speaking station to begin with.
To be precise, the -82 dBm figure is the physical carrier sense (preamble/signal detect) cutoff for a 20 MHz channel; energy detect, which doesn't require a decodable preamble, kicks in much higher, around -62 dBm. Virtual carrier sense has no RSSI cutoff as such; it just needs a successfully decoded frame to read the Duration field from.
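This also shows why the original question matters: if stations did honor those 15-60 ms duration values, the NAV rule (a station only extends its NAV, never shortens it for a smaller incoming Duration) would stall the medium for the full claimed airtime. A minimal sketch, with a hypothetical helper name:

```python
def nav_expiry_us(frames):
    """frames: list of (arrival_time_us, duration_us) tuples.
    Returns when the NAV finally expires, assuming every Duration field
    were honored (802.11 only does this for FCS-valid frames)."""
    expiry = 0
    for arrival, duration in frames:
        # NAV update rule: extend only, never shorten.
        expiry = max(expiry, arrival + duration)
    return expiry

# A single frame at t=0 claiming 60 ms would keep the medium "virtually
# busy" until t = 60,000 us, even if the air is actually idle:
print(nav_expiry_us([(0, 60000), (10000, 300)]))  # 60000
```

So a trickle of such frames, if trusted, could easily produce the latency spikes the original poster worried about; discarding FCS-failed frames before the NAV update is what prevents it.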
I hope this helps.