Forum

  • By Howard - edited: January 11, 2015

    I am having trouble understanding the Listen Interval. I understand the idea of an AP purging frames if it starts running out of buffer space, but ...

If a client were in power save mode and had a low Listen Interval, couldn't its connections be dropped, or severely disturbed? Especially if it were ignoring the AP's DTIM count.

    This is not just a potential throughput problem, but might also cause difficulties with roaming.

From my reading of the 802.11 standard, it seems that if a client's Listen Interval were less than an AP's DTIM count, the AP might delete frames that it had been saving.

The Listen Interval is expressed in units of Beacon Intervals, so if we had a Listen Interval of, say, 7, but the AP had a DTIM count of 10, then this might happen.
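As a rough illustration of the arithmetic (my own sketch in Python, assuming the typical 100 ms beacon interval and assuming the AP really does use the Listen Interval as the buffered-frame lifetime, which the standard only says it "may" do):

[code]
# Toy sketch of the purge scenario above; the 100 ms beacon interval
# and the AP's purge policy are assumptions, not from the standard.
BEACON_INTERVAL_MS = 100
listen_interval = 7   # client's advertised LI, in beacon intervals
dtim_period = 10      # AP's DTIM period, in beacon intervals

frame_lifetime_ms = listen_interval * BEACON_INTERVAL_MS  # 700 ms
dtim_gap_ms = dtim_period * BEACON_INTERVAL_MS            # 1000 ms

if frame_lifetime_ms < dtim_gap_ms:
    print(f"Buffered frames could be purged after {frame_lifetime_ms} ms, "
          f"before the next DTIM at {dtim_gap_ms} ms.")
[/code]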

If nothing else, it almost seems that the AP would be better off ignoring a client's Listen Interval, and just implementing its own algorithm.

    If anyone has experienced problems with either Listen Interval settings, or AP design in this area, I would appreciate hearing about it.

    Thanks.

Very interesting point here. On initial association with an AP, a client tells the AP what its Listen Interval is, in the Association Request frame. The spec says:

"An AP may use the Listen Interval information in determining the lifetime of frames that it buffers for a STA"

This could be interpreted as "it's up to the manufacturer, re the implementation".

It would be interesting to see some experiments with different manufacturers' gear to see what happens.
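If anyone wants to run that experiment, here is a minimal sketch (Python with scapy; the capture filename is hypothetical) that pulls the advertised Listen Interval out of a monitor-mode capture:

[code]
# Minimal sketch: print the Listen Interval each client advertises in
# its Association Request. Requires scapy; "capture.pcap" is a
# hypothetical monitor-mode capture file.
from scapy.all import rdpcap
from scapy.layers.dot11 import Dot11, Dot11AssoReq

for pkt in rdpcap("capture.pcap"):
    if pkt.haslayer(Dot11AssoReq):
        sta = pkt[Dot11].addr2  # transmitter address = the client STA
        li = pkt[Dot11AssoReq].listen_interval
        print(f"{sta} advertises Listen Interval = {li} beacon intervals")
[/code]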

    Dave

  • Found this:

    http://www.wifi-insider.com/wlan/psd.htm

    "The AP shall not drop any queued frames until the STA's Listen Interval elapses."

    Don't know where that comes from though.

    Also:

    http://linuxwireless.org/en/developers/Documentation/ieee80211/power-savings

    Dave

Thanks Dave. Still not much closer to an explanation on this, however.

    I should have mentioned that the CWAP book is messed up on page 151 re. Listen Intervals.

    Just added that to the CWAP errata.

Listen Interval (minus one) is the number of consecutive beacons during which a "dozing" STA is not listening for beacon frames.
I suppose that this also applies to DTIM beacons carrying BC/MC, so the Listen Interval affects both unicasts and multicasts. [i][b]Please correct me if I am wrong[/b][/i], because I have also read (somewhere) that LI only applies to UCs, while DTIM is only for BC/MCs; in other words, [i]regardless of the LI setting[/i], STAs [b]always[/b] wake up before a DTIM so they can get the BC/MCs.
The Listen Interval is set on STAs and "told" by them to APs, while the DTIM period is set on APs and STAs learn it from the beacons.
So common sense tells us that LI should be less than or equal to DTIM; otherwise BC/MC frames would be pretty hard to receive :) ! -unless I was wrong from the start...
On the other hand, to maximise power savings, STAs should be sleeping most of the time.

It follows that, in order to comply with [b]both[/b] points, [b]LI = DTIM[/b].

However, this would imply [b]pretty bad performance and a huge amount of required memory on the APs[/b]. I remember (old Symbol Technologies times) that DTIM was always set to 10 to save power on terminals (MUs, we called -and still call- them). With the typical 100 ms beacon interval, that is a whole second of time. Think of all the data that might be buffered during this [i]eternity[/i]!
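To make the alignment problem concrete, here is a toy sketch (my own illustration, not from any spec) of which DTIM beacons a dozing STA would miss if it woke only every Listen Interval, using the LI = 7 / DTIM = 10 numbers from the first post:

[code]
# Toy model: the STA wakes only every LI-th beacon, while every DTIM-th
# beacon carries the buffered BC/MC traffic. Prints the DTIMs slept through.
def missed_dtims(listen_interval, dtim_period, n_beacons=30):
    for b in range(n_beacons):
        awake = (b % listen_interval == 0)
        dtim = (b % dtim_period == 0)
        if dtim and not awake:
            print(f"beacon {b}: DTIM missed (STA asleep)")

missed_dtims(listen_interval=7, dtim_period=10)  # misses beacons 10 and 20
[/code]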

  • Your analysis agrees with mine.

We have seen many Symbol APs running with their DTIMs set to 10, and it is the cause of many roaming issues.

The reason I posted this in the first place is that IMHO it makes no sense at all for LI to be > DTIM. But I don't see anywhere that this is specified, or even recommended.

    I also agree that LI = DTIM makes perfect sense, but then why have two parameters?

    I am of the opinion that:

1) either manufacturers worked this out long ago and don't give it a second thought these days, or
2) there is a glitch that pops up every once in a while, which causes some problem, and users just (incorrectly) attribute it to "all wireless" networks.

I've usually found that when some small detail like this is undefined, sooner or later it at least causes a "glitch" - which is why my bet is 60% in favor of case #1 above.

I still don't know if LI applies to both UC and BC/MC or only to UC. And this is very important!
    I need gurus' knowledge.

If LI is the [b]only[/b] criterion/timer for STAs to wake (i.e., it applies to both UC & BC/MC), then its fine tuning is critical for their correct behaviour. They just can't miss too many beacons with DTIMs and/or TIMs with their AID set.

In the CWNA book (the one I will be tested on next Friday), pg. 305, it is said that "[i]All stations will wake up in time to receive the beacon with the DTIM[/i]", so one can suppose that [b]any STA must also wake up every DTIM beacon, regardless of its LI setting[/b]. But it's not quite clear, at least to me. This would require two independent timers (sketched below).
    Does anyone out there know the truth?
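For what it's worth, here is what that two-timer hypothesis would look like (purely my own assumption, not taken from the standard or the book):

[code]
# Toy model of a STA running two independent wake timers: one for its
# own Listen Interval (to check TIMs for its AID) and one for the AP's
# DTIM period (to receive BC/MC). Pure assumption, for illustration only.
def beacons_awake(listen_interval, dtim_period, n_beacons=30):
    awake = set()
    for b in range(n_beacons):
        if b % listen_interval == 0:  # LI timer fires
            awake.add(b)
        if b % dtim_period == 0:      # DTIM timer fires
            awake.add(b)
    return sorted(awake)

# With LI = 7 and DTIM = 10: wakes at beacons 0, 7, 10, 14, 20, 21, 28.
print(beacons_awake(listen_interval=7, dtim_period=10))
[/code]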

  • Jamrb,

Because this is getting into pretty esoteric territory, I would doubt that anyone's exam would get into such convoluted issues.

But from my point of view it is a legitimate question, and an answer might explain some client behavior.

I was not asked about it in my exam this morning. I had, however, to identify an image of a connector (for God's sake!).
    Anyway, I was lucky and I passed it. With a non-impressive 83%...
    Thanks for your support, gentlemen!

BTW 83% is a really good score. Be proud of it!!
