Forum

  • By Tony-T - edited: November 25, 2019

    Hello There,

    It is generally accepted that doubling the channel bandwidth from 20 MHz to 40 MHz will degrade the AP <-> client link SNR by 3 dB, because thermal noise power scales with bandwidth (the formula is available here, and in many other places - https://www.electronics-notes.com/articles/basic_concepts/electronic-rf-noise/thermal-noise-calculations-calculator-formulas.php). A quick worked example is at the end of this post.

    What if I were to say that this is certainly true for the receive end, but may not be the whole story?

    What would others think?
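
    For anyone who wants to run the numbers themselves, here is a minimal Python sketch of the kTB calculation behind that link, assuming a 290 K reference temperature and an ideal receiver (no noise figure):

        import math

        k = 1.380649e-23   # Boltzmann constant, J/K
        T = 290.0          # reference temperature, K (assumption)

        for bw_mhz in (20, 40, 80, 160):
            b_hz = bw_mhz * 1e6
            noise_w = k * T * b_hz                       # thermal noise power, W
            noise_dbm = 10 * math.log10(noise_w * 1e3)   # W -> dBm
            print(f"{bw_mhz:>3} MHz: noise floor ~ {noise_dbm:.1f} dBm")

    Each doubling of bandwidth adds 3 dB of thermal noise (roughly -101, -98, -95, -92 dBm), hence the 3 dB hit to SNR at the receiver.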

  • I would ask: what else makes it the full picture? Rather than being vague, tell us what you think contributes to it.

  • By Tony-T - edited: November 25, 2019

    As channel bandwidth is increased:

        On the receive side, the aggregate noise collected will rise and SNR will decline.

        On the transmit side, the same amount of RF energy will be spread across more spectrum.
        That, I suspect, would result in an overall lowering of the transmit signal level relative to the noise floor,
        while the total power transmitted remains constant, whether it is spread across 20/40/80/160 MHz of spectrum.
        The signal level trace, as shown on a spectrum analyzer, would drop as bandwidth is increased.
        (It would be nice to be able to add an image of a spectrum analyzer screen in this forum.)

    What I see as a consequence is that as bandwidth increases, SNR reduces, with contributors at both ends (Tx and Rx).
    The Rx end is well known and accepted.
    At the Tx end, the RF signal level would decrease as bandwidth increases (with Tx power etc. held constant).

    The Tx RF energy is finite, so there could be a trade-off between bandwidth and the absolute Tx signal level as seen on a spectrum analyzer (the signal's height above the noise floor). A rough worked example is at the end of this post.

    Like a torch that can be focused:

    • The bulb light output is limited and fixed
    • When focused to a small dot, it is brighter
    • When focused to cover a larger area, it is not as bright / is duller

    Your thoughts?

    Regards

    Tony

    Or, to use different wording -

    The RF transmission has finite energy to radiate in a given portion of spectrum - would not a 20 MHz channel be more energy dense than a 40 MHz channel, providing a higher signal strength (at the point of origin)?
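
    To put rough numbers to the two ends described above, here is a small Python sketch. The figures are purely illustrative assumptions: 20 dBm total transmit power, a -174 dBm/Hz thermal noise density, and no path loss or receiver noise figure.

        import math

        TX_POWER_DBM = 20.0            # total radiated power, held constant (assumed figure)
        NOISE_DENSITY_DBM_HZ = -174.0  # thermal noise density at ~290 K

        for bw_mhz in (20, 40, 80, 160):
            bw_db = 10 * math.log10(bw_mhz * 1e6)
            tx_psd = TX_POWER_DBM - bw_db                  # signal density on the analyzer, dBm/Hz
            in_band_noise = NOISE_DENSITY_DBM_HZ + bw_db   # noise integrated across the channel, dBm
            snr = TX_POWER_DBM - in_band_noise             # total signal vs total in-band noise, dB
            print(f"{bw_mhz:>3} MHz: Tx PSD {tx_psd:6.1f} dBm/Hz, "
                  f"in-band noise {in_band_noise:7.1f} dBm, SNR {snr:5.1f} dB")

    The per-Hz signal density (what the spectrum analyzer trace shows) drops by 3 dB per doubling, while the noise integrated across the channel rises by 3 dB per doubling.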

  • By Howard - edited: November 24, 2019

    A few observations:

    The ability to add diagrams etc. was tried for a while, but was removed shortly thereafter - I'm guessing due to disk storage requirements. The inter-member private message facility was probably removed for the same reason. Sad, but a fact of life I suppose.

    There also used to be a general Forum notification feature, which now only works for contributors to individual posts.   I'm sure they got many complaints about this feature, once they stopped limiting spam.

    As a limited response to your original post(s), I think the situation is more complicated.  It is true that the amount of noise potentially increases as the bandwidth increases - by definition.   However, the actual SNR may NOT decrease, depending on local channel conditions.

    The minimum receiver sensitivity specified in the standards increases by 3 dB for each doubling of channel bandwidth (20, 40, 80, etc.) at each particular rate.   But that does not mean the actual SNR is decreasing. (A small worked example is at the end of this post.)

    It is well known, and fairly easy to demonstrate, that throughput can DECREASE when 40 MHz wide channels are used instead of 20 MHz channels - due to channel overlap, and sometimes to devices ignoring the "40 MHz intolerant" setting.

    Many applications work as well with 20 MHz as they do with 40 MHz.   Many small hand-held devices, and other less expensive devices, can use 40 MHz channels but cannot operate at multi-stream rates because they lack MIMO radios and antennas.

    Note that as O/S bloat has increased, by 20x in some cases, wireless firmware download durations have gotten totally ridiculous, which has drastically curtailed some companies' device management and Cloud access offerings.
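
    To put numbers to the 3 dB-per-doubling point, here is a small Python sketch of how the sensitivity thresholds track the noise floor. The -82 dBm baseline is the commonly published 802.11ac MCS0 minimum input sensitivity at 20 MHz; treat it as an illustrative assumption.

        import math

        BASE_SENS_DBM = -82.0   # MCS0 minimum sensitivity at 20 MHz (assumed baseline)
        BASE_BW_MHZ = 20

        for bw_mhz in (20, 40, 80, 160):
            sens = BASE_SENS_DBM + 10 * math.log10(bw_mhz / BASE_BW_MHZ)
            floor = -174 + 10 * math.log10(bw_mhz * 1e6)   # ideal thermal floor, dBm
            print(f"{bw_mhz:>3} MHz: min sensitivity {sens:6.1f} dBm, "
                  f"margin over thermal floor {sens - floor:.1f} dB")

    The threshold rises 3 dB per doubling, but its gap to the thermal floor (the per-rate SNR allowance, including noise-figure margin) stays constant.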
