Hi Ben and GT:
Please note that I have only asked questions and have not asserted either what the IEEE 802.11 standard says or what vendors have done with automatic channel selection.
However, this topic is a worthy one for further discussion under a new thread. I have posted this message in both the earlier and new threads. For cross-reference purposes, here are links to both.
"Why do we only use channels 1, 6, and 11?"
Ben posted in the earlier thread: "I think GTHill wrote what I was going to write exactly. APs on the same channel can arbitrate. APs on adjacent, overlapping channels (less than 5 apart) will interfere with each other. "
[I take it that "AP" is not critical to this statement, but rather the observation applies generally to any two stations not in the same BSS. Is that correct?]
What characteristic(s) of an IEEE 802.11 signal causes a receiver to treat it as a frame (subject to arbitration) or as interference?
1. channel center of the transmitter
2. signal strength at the receiver
3. CRC check at the receiver
4. distance the signal traveled
5. power setting of the transmitter
6. value in the "Channel" field of the physical header
7. value in the "Address 3" field of the data link header
8. values in the "FromDS" and "ToDS" fields
9. value in the "Address 1" field
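To make the question concrete, here is a rough sketch of the decision I imagine a receiver making. This is only my mental model, not text from the standard; the threshold values are illustrative placeholders, and the real clear-channel-assessment rules vary by PHY:

```python
# Hypothetical sketch of a receiver's clear-channel-assessment (CCA)
# decision. Thresholds are illustrative placeholders, not values taken
# from the IEEE 802.11 standard.

SIGNAL_DETECT_DBM = -82   # assumed preamble-detect (carrier sense) threshold
ENERGY_DETECT_DBM = -62   # assumed energy-detect threshold

def classify(rssi_dbm: float, preamble_decodable: bool) -> str:
    """Classify what a receiver tuned to its own channel observes.

    preamble_decodable: True if the PHY preamble/header could be decoded,
    which normally requires the transmitter to be centered on (or very
    near) the receiver's channel.
    """
    if preamble_decodable and rssi_dbm >= SIGNAL_DETECT_DBM:
        # A decodable 802.11 frame: the receiver can read its duration
        # field and defer (arbitrate) via the NAV / carrier-sense rules.
        return "frame"
    if rssi_dbm >= ENERGY_DETECT_DBM:
        # Energy the receiver cannot decode (e.g. a transmitter centered
        # a few channels over): the medium just looks busy -- interference.
        return "interference"
    return "idle"
```

In this model, items 1–3 in my list above (channel center, received strength, and whether the frame decodes cleanly) are what matter, and the addressing fields never enter into it. Corrections welcome.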
I hope this helps. Thanks. /criss
Those lab tests you describe are hard to argue against. But I will try.
I think the lab tests that show inferior throughput when two IEEE 802.11 BSSs overlap slightly in frequency (e.g., channels 1 and 2) and superior throughput when two BSSs share the same channel are misleading. My hunch is that they all involve transmitters and receivers operating in the same room rather than at varying and substantial distances.
I think a receiver can sense and react to signal strength but makes no distinction between being tuned to the middle of the transmitter's channel or somewhere out at the edge. A signal coming from a faraway transmitter centered on the same channel tastes just like a signal coming from a nearby transmitter centered, say, four channels over. Well, maybe the signal edges are a little fuzzier the farther the signal travels, but I don't see how this matters much to the receiver.
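As a sanity check on the "less than 5 apart" figure Ben quoted, the 2.4 GHz channel arithmetic is simple enough to write down (assuming the nominal 5 MHz channel spacing and a roughly 22 MHz DSSS signal width):

```python
# 2.4 GHz channel plan: channels are spaced 5 MHz apart, with channel 1
# centered at 2412 MHz; a DSSS signal occupies roughly 22 MHz.
CHANNEL_SPACING_MHZ = 5
SIGNAL_WIDTH_MHZ = 22  # approximate occupied bandwidth

def center_mhz(ch: int) -> int:
    """Center frequency of a 2.4 GHz channel (channels 1-13)."""
    return 2412 + CHANNEL_SPACING_MHZ * (ch - 1)

def overlaps(a: int, b: int) -> bool:
    # Two signals overlap when their centers are closer than one
    # signal width (each extends about +/- 11 MHz from its center).
    return abs(center_mhz(a) - center_mhz(b)) < SIGNAL_WIDTH_MHZ

print(overlaps(1, 2))   # 5 MHz apart -> True (overlap)
print(overlaps(1, 5))   # 20 MHz apart -> True (still overlap)
print(overlaps(1, 6))   # 25 MHz apart -> False (hence 1, 6, 11)
```

So channels four or fewer apart overlap, and five apart (25 MHz between centers) do not, which is exactly where the 1/6/11 convention comes from.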
I think BSS initialization algorithms that test channel conditions and choose channels other than 1, 6, and 11 are doing the right thing. Their primary limitation is the short time to sample compared with the long time of operation thereafter on the chosen channel. New IEEE 802.11 standards for changing channels of an operational BSS as often as the managing device chooses will improve overall channel utilization for overlapping BSSs, and the BSSs will continue to choose wisely among all 11 2.4 GHz channels.
But then this is just what I think.
I hope this helps. Thanks. /criss
I was going to speculate as to how distance may or may not have an effect on this, but I think I will just try it in a somewhat live environment. What I would like to know is, what criteria would you like changed in order to provide a better test?
My plan was to run the test as I normally do, except with the APs farther apart, say a few rooms of separation, while making sure they can still hear each other, since that is the basis for co-channel interference. I'll have a wired FTP server on a switch connected to both APs. I'll have a client in one room pull from its AP, and a client in the other room pull from the other AP. Then I'll change the channels and record the results.
Is there anything you would like to add? Also, do you think this is a fair test?
Sounds like fun!