Hi all, good friends.
A little about me: I'm not an RF guy, I didn't have the chance to attend any CWNA training (nobody offers it in my area), and I'm just starting out in wireless. I already have 7 years of experience on the wired side, so I'm not new to networking, but wireless is not my specialty.
I need your help to understand airtime usage in wireless. Let's keep the airtime fairness features offered by different vendors completely aside and talk only about pure wireless communication (and pre-802.11ac, just to avoid any confusion). I want to understand it at the core.
At its core, wireless communication is half duplex, because only one device can send a packet at a time. My first question is very, very simple, but I need a purely technical, authoritative answer for my own curiosity:
- Assuming we have 2 or more clients, what determines how much transmit time each client gets? That is, how much time does client 1 get to transmit?
- Based on the answer above, I may need to ask: is there any way client 1 can starve clients 2 and 3 by never giving them any chance to transmit at all? Does that happen?
Guys, by an authoritative answer I mean something that can be traced back to facts stated in the wireless standards.
This is a humble request from a guy who is dying of curiosity, since I am not able to find any explanation of the questions above. Please help me understand.
If you don't mind, let's start with your question 2.
Jammers, DoS hacking tools, and incorrect CCA algorithms could cause the others to stop transmitting.
Do you want to ignore these, for now, and get into the more basic issues first?
Are you contemplating a "jabber-like" condition, as on Ethernet?
Thanks for answering question 2. Please help me with question 1.
Have you read the CWNP 802.11 Arbitration white paper?
That would be a good place to start if you haven't.
Thanks a lot, Howard, for sharing a great and superb resource. I am really grateful to you.
I am currently reading it and trying to make sense of it, but I think I've been able to narrow down my question. Please let me know if you can help me with this understanding:
1) STAs contend for the wireless medium; there are backoff timers involved, the NAV and duration fields, etc. But what PREVENTS one client from starving the wireless medium? I mean, if one client is sending a large file and doesn't "want to stop" for the whole duration of the transfer, is that possible and allowed in 802.11? Or is there any mechanism, in 802.11 itself or taken by access points, to limit the maximum frame size or amount of time a single STA can transmit?
That is exactly my confusion: what if one client doesn't want to stop? Are there any mechanisms in 802.11 that make sure one STA doesn't take up all the time?
If a product follows the 802.11 standard, and especially if it is Wi-Fi Certified, the various CCA checks and backoff timers will ensure that doesn't happen. A Video QoS class may use up the most time, but nothing is going to use up all the time, as long as it's a "legitimate" STA.
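To make the backoff idea concrete, here is a toy Python sketch. It is my own illustration, not the real 802.11 state machine (no DIFS/SIFS timing, no collisions, no contention-window doubling): each STA draws a random backoff slot every round, and the lowest draw wins the medium. Because every STA re-randomizes after each round, no single well-behaved STA can monopolize the channel.

```python
import random

# Toy sketch of DCF-style contention (NOT the real protocol, just the idea):
# each STA picks a random backoff slot from [0, CW_MIN]; the lowest slot
# "wins" the medium for that round.
CW_MIN = 15  # minimum contention window (aCWmin = 15 for OFDM PHYs)

def contend(num_stas, rounds=100_000, rng=None):
    rng = rng or random.Random(42)
    wins = [0] * num_stas
    for _ in range(rounds):
        slots = [rng.randint(0, CW_MIN) for _ in range(num_stas)]
        lowest = min(slots)
        winners = [i for i, s in enumerate(slots) if s == lowest]
        # A real collision would trigger retransmission with a doubled CW;
        # here we simply award the round to one of the tied STAs at random.
        wins[rng.choice(winners)] += 1
    return wins

print(contend(3))  # each STA wins roughly a third of the rounds
```

The key point is that the random draw, repeated every round, is what statistically shares the medium; no central scheduler hands out turns.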
There is an older wireless DoS attack, whose name I don't remember, that essentially just flooded packets, and there are various jammers (er, "antenna testers") that could also be used, but in most cases you don't have to worry.
Like I said earlier, there are hacker tools that can do these things, but I don't have to worry about them at my job, so I don't pay much attention to that kind of thing anymore.
Most of the problems I see are from internally generated interference, often harmonic frequencies, that gets back into the input of a radio one way or another and ruins the radio's dynamic range.
No, it's not about something in real life; it's just for my curiosity.
I really appreciate your support. Actually, the thing is, the backoff timer comes into play after a frame is transmitted, but the STA may have already signaled that it's going to send more frames, so other STAs won't send until the first STA has finished. So is there any limit on the size or number of frames that a STA can send in one second, or something like that?
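For what it's worth, 802.11e/WMM does bound how long one station can hold the medium once it wins contention: each access category has a TXOP limit. The sketch below uses the default EDCA TXOP limits for OFDM PHYs as I remember them, so treat the exact microsecond values as assumptions and check the standard before relying on them.

```python
# Hedged sketch: EDCA bounds a station's burst with a per-access-category
# TXOP limit. Values below are the default OFDM-PHY TXOP limits as I
# recall them; verify against the current 802.11 standard.
TXOP_LIMIT_US = {
    "AC_VO": 1504,   # voice
    "AC_VI": 3008,   # video
    "AC_BE": 0,      # best effort: 0 means one MSDU per channel access
    "AC_BK": 0,      # background
}

def frames_allowed(ac: str, frame_airtime_us: int) -> int:
    """How many frames of a given airtime fit in one TXOP for this AC?
    A limit of 0 means exactly one frame per channel access."""
    limit = TXOP_LIMIT_US[ac]
    if limit == 0:
        return 1
    return max(1, limit // frame_airtime_us)

# e.g. frames that each take 500 microseconds of airtime:
print(frames_allowed("AC_VI", 500))  # → 6
print(frames_allowed("AC_BE", 500))  # → 1
```

So even a STA that "signals more frames" must release the medium at the end of its TXOP and contend again, which is what stops one station from holding the air indefinitely.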
AP firmware shares its buffers, etc., among multiple STA addresses, so the amount of time before it gets around to the first STA will be greater on busy APs; there will always be some delay. The more work going to the AP, the bigger the variations, assuming unicast frames.
Add to that, some APs using an airtime fairness or proprietary QoS algorithm might be throttling frames too.
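Just to illustrate the idea (this is my own toy sketch, not any vendor's actual algorithm): an airtime-fairness scheduler can track accumulated airtime per STA and always serve the one that has consumed the least so far, so a slow STA whose frames cost a lot of airtime gets fewer transmit opportunities instead of crowding out the fast ones.

```python
# Toy airtime-fairness sketch (hypothetical, not a real vendor algorithm):
# serve whichever STA has accumulated the least airtime so far.
def schedule(airtime_used: dict) -> str:
    """Pick the STA with the smallest accumulated airtime (ties: dict order)."""
    return min(airtime_used, key=airtime_used.get)

used = {"sta1": 0.0, "sta2": 0.0, "sta3": 0.0}
# sta1 is slow: each of its frames costs 3 ms of airtime; the others cost 1 ms
cost = {"sta1": 3.0, "sta2": 1.0, "sta3": 1.0}
served = []
for _ in range(12):
    sta = schedule(used)
    used[sta] += cost[sta]
    served.append(sta)

print(served.count("sta1"), served.count("sta2"), served.count("sta3"))  # → 2 5 5
```

The slow STA ends up with far fewer transmit opportunities, yet roughly the same share of airtime, which is the whole point of airtime (rather than frame) fairness.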
I have seen some communications software in the past, not Wi-Fi, that unknowingly used LIFO scheduling, and that totally messed things up. But no programmer worth their salt would do that.
Sometimes it helps for programmers to understand queueing theory or teletraffic engineering, but I don't know whether either is taught in school to today's programmers.
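As a one-formula taste of that queueing theory (the standard M/M/1 result, with made-up example rates): the mean time a job spends in the system is W = 1/(μ − λ), which blows up as the arrival rate approaches the service rate. That is one reason delay through a busy AP varies so wildly.

```python
# M/M/1 queue: Poisson arrivals at rate lam, exponential service at rate mu,
# one server. Mean time in system: W = 1 / (mu - lam).
def mm1_mean_time_in_system(arrival_rate: float, service_rate: float) -> float:
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)

# Example rates (invented): AP serves 1000 frames/sec; watch delay grow
# as offered load climbs toward capacity.
for lam in (500, 900, 990):
    print(lam, mm1_mean_time_in_system(lam, 1000))
```

Going from 50% to 99% utilization multiplies the mean delay fifty-fold, even though the server never actually "runs out" of capacity.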
Thanks Howard :)