Under RF Communication Principles, on pages 284-286, the new CWDP book discusses Amplifier Saturation, Backoff, and Peak-to-Average Ratio (PAR).
I really enjoyed this section, but the amplifiers it discusses are putting out lots of power - say 30 dBm. The discussion notes that 9 dB of backoff is a good rule of thumb for OFDM (4 dB for DSSS).
I'm interested in lower-power devices - say 17 dBm output tops. A device I was looking at recently had a 1 dB compression point near 15 dBm.
In this case, what would be a good rule of thumb for the backoff value?
The amount of backoff is the same regardless of the output power. If you have a 1 dB compression point of 15 dBm, for OFDM you want to be backed off at least 9 dB from that power, or an average power around 6 dBm (15 - 9 = 6). That way, when the power peaks 9 dB above the average, you are just at the 1 dB compression point. You will get a little spectral regrowth, but it should be minimal.
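That arithmetic is simple enough to sanity-check in a few lines (a trivial sketch using the numbers from this thread):

```python
# Backoff arithmetic from the discussion above: the average power
# target is the 1 dB compression point minus the recommended backoff.
p1db_dbm = 15.0      # 1 dB compression point of the amplifier
backoff_db = 9.0     # rule-of-thumb backoff for OFDM (4 for DSSS)

avg_power_dbm = p1db_dbm - backoff_db
print(avg_power_dbm)  # peaks 9 dB above this just reach the 1 dB point
```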
Thanks for your response Tom.
Your explanation would be my takeaway too, if the 9 dB figure is correct. But I see this signal behavior all the time, and I just can't resolve the conflict between practice and theory.
I have tested several makes of radios that run at least 3 to 5 dB hotter than 9 dB of backoff would allow, and they still seem to work OK.
It seems like many manufacturers don't really care about (or understand?) "good quality" signals. When they do care, it's about not exceeding the FCC limits, which for the most part don't take signal quality into consideration.
I test using PN7 and PN9 randomized bit streams, as is often recommended. Perhaps if I tested with "uglier" frames I would get different results.
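For anyone curious what those streams are, a PN9 sequence is just the output of a maximal-length 9-bit LFSR. A minimal sketch follows; the function name is my own, and the exact tap positions (polynomial) vary between test specs, so treat this as illustrative rather than a drop-in for any particular standard:

```python
def pn9(seed=0x1FF, nbits=511):
    """Pseudo-random bit stream from a 9-bit Fibonacci LFSR.
    Taps chosen to give a maximal-length (period-511) sequence;
    standards differ on the exact polynomial, so check your spec."""
    state = seed & 0x1FF
    bits = []
    for _ in range(nbits):
        bits.append(state & 1)           # output the LSB each clock
        fb = (state ^ (state >> 4)) & 1  # feedback from two taps
        state = (state >> 1) | (fb << 8) # shift right, insert feedback
    return bits
```

A maximal-length sequence repeats every 2^9 - 1 = 511 bits and contains exactly 256 ones and 255 zeros, which is easy to verify on the generated stream.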
A bigger peeve of mine is poor EVM. Luckily for me, bad EVM numbers usually produce a reduction in range. So the two indicators support each other, and the engineers have to address the problem - whether they want to admit to it or not.
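For reference, RMS EVM is just the error-vector power relative to the power of the ideal constellation. A minimal sketch (the helper name is mine, not any particular analyzer's API):

```python
import math

def evm_db(measured, reference):
    """RMS EVM in dB: error-vector power relative to the power
    of the ideal (reference) constellation points."""
    err_pwr = sum(abs(m - r) ** 2 for m, r in zip(measured, reference))
    ref_pwr = sum(abs(r) ** 2 for r in reference)
    return 10 * math.log10(err_pwr / ref_pwr)

# Example: QPSK points measured with a uniform 1% magnitude error.
ideal = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
measured = [p * 1.01 for p in ideal]
print(round(evm_db(measured, ideal), 1))  # -40.0 dB
```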
Generally I would agree that you can push the average power higher than 9 dB of backoff would allow. The adjacent channels will suffer some noise degradation. If you don't have nearby radios operating in the adjacent channel, this may not bother you much, but it may bother your neighbors. The distortion of the in-band signal will affect the higher modulation rates more than the lower ones. Often manufacturers of client devices will push the average power higher than they should. There is a cost benefit for doing so, and their margins are thinner for clients than for the more expensive APs.
It is possible to reduce the PAR of 802.11 OFDM signals to 4 dB. I was reading a recent paper, "Peak-to-Average Power Ratio of IEEE 802.11a PHY Layer Signals" by A. D. S. Jayalath and C. Tellambura, where they discuss a technique for doing so. It is possible that more modern chipsets are using these kinds of techniques to lower PAR and costs. It might be interesting to measure this and see.
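On the measurement point: the PAR of a multicarrier symbol is easy to estimate numerically. Here is a toy sketch (pure Python, 52 subcarriers with random QPSK phases to loosely resemble an 802.11a symbol; it is not a faithful PHY model, just an illustration of where the 8-10 dB figures come from):

```python
import cmath
import math
import random

def par_db(nsub=52, nsamp=256, seed=1):
    """Peak-to-average power ratio (dB) of one multicarrier symbol:
    nsub subcarriers with random QPSK phases, evaluated at nsamp
    time-domain samples across the symbol."""
    rng = random.Random(seed)
    phases = [rng.randrange(4) * math.pi / 2 for _ in range(nsub)]
    power = []
    for n in range(nsamp):
        s = sum(cmath.exp(1j * (2 * math.pi * k * n / nsamp + phases[k]))
                for k in range(nsub))
        power.append(abs(s) ** 2)  # instantaneous envelope power
    avg = sum(power) / len(power)
    return 10 * math.log10(max(power) / avg)

print(round(par_db(), 1))  # one random symbol's PAR in dB
```

Running it over many random symbols (different seeds) gives a feel for how often the peaks approach the worst case, which is what the backoff rule of thumb is really hedging against.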
Thanks for all your input.
I had forgotten about this article. You might also be interested in the article "Peak-to-Average Power Ratio Reduction in OFDM Systems Using Huffman Coding" by Eltholth, Mikhail, et al.
Unfortunately, my company only has control over the channel selection, individual channel power, and channel width - within regulated ranges, of course. We can't affect the clipping levels. Interestingly, if you compare some of my I/Q plots with Figures 6 and 7 of the article you reference, it does appear that those radios use clipping on the OFDM signals. The "corrected" data plots look interesting too.
I have been thinking about performing Adjacent Channel Rejection tests (again), but this time using our devices as the intentional interferer. We already have FCC test modes that I could use to generate signals.
I really need a golden radio to test with. In the past I had another signal generator, but it was only for 802.11b - and it was not so "golden".