The Cisco power reduction makes sense, as 40 MHz is approximately double the 20 MHz bandwidth. (I say approximately because there is a guard band between any two adjacent channels, and a 40 MHz signal covers that guard band plus the two channels on either side of it. RF channels are not "square boxes" that can be added together to produce a box of double the size; an RF channel seen on a spectrum analyzer is a complex shape. When two RF signals are present concurrently within the same physical bandwidth, the FCC is more concerned with power spectral density than anything else.)
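The power-spectral-density point is easy to see with a little dB arithmetic: the same total power spread over double the bandwidth drops the PSD by about 3 dB. A minimal sketch (the 17 dBm figure is just an illustrative value, not a claim about any particular AP):

```python
import math

def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    return 10 * math.log10(mw)

def psd_dbm_per_mhz(total_dbm, bw_mhz):
    """Average power spectral density, treating the channel as flat."""
    return mw_to_dbm(dbm_to_mw(total_dbm) / bw_mhz)

psd20 = psd_dbm_per_mhz(17, 20)  # 17 dBm total in a 20 MHz channel
psd40 = psd_dbm_per_mhz(17, 40)  # same 17 dBm total in a 40 MHz channel
print(round(psd20 - psd40, 2))   # the PSD difference is 10*log10(2) ~ 3.01 dB
```

So holding PSD constant while doubling bandwidth is exactly a 3 dB move, which is why the limits track that way.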
Cancel that last one. Just re-read your posting, Bruce (I was busy scoffing a chicken sandwich instead of paying attention...). That's a bit unusual. In many other (non-WiFi) types of radios, when you change from "single bandwidth" to "double bandwidth", the physical transmit power often goes up by 3 dB. I don't know why they have dropped it by 3 dB. At the very least, you would imagine it would stay the same.
This is still bugging me, and I haven't got a satisfactory answer from the vendor. They've told me the power limit is a regulatory compliance issue.
Trapeze's 3x3 11n AP (MP-432) can only transmit at 13 dBm on 2.4 GHz in the UK.
Could those of you who work with other equipment post what their max transmit power is, preferably with the UK country code?
I originally looked at this as a design issue, but I've found it's really more of a compliance issue/topic - except how these disciplines inter-relate of course.
Here is another ATCB document and two FCC documents on the subject. At least in the US it appears related to the antenna system in use, whether it's PtP or PtMP, "broadcast" or not, and whether you are using beamforming. The techniques and formulas used in the UK are probably very similar to these, so based on the 802.11g limits there, you can probably see how they are calculating power limits.
I'm not normally looking for compliance documents, so these were not easy finds. I went through 10+ other documents that had nothing to do with this question, even though their titles looked promising.
I haven't had enough time to completely absorb all the material yet. So for the moment, I'm not making any pronouncements about measuring MIMO power levels.
Fred Niehaus of Cisco (smart guy) explained this to me a while back. Does EVM relate to Spectral Density?
I'm looking at the power levels on the 1140 radios and am amazed at the variation in power by data rate. These are in addition to the UNII-band EIRP rules, with some additional antenna-gain assumptions on Cisco's part.
Are these really FCC-regulated levels? Does MIMO/MRC/ClientLink overcome these limitations to deliver higher sustained legacy rates at range?
Active power levels by rate:
6.0 to 18.0 , 14 dBm, changed due to regulatory maximum
24.0 to 36.0 , 13 dBm, changed due to regulatory maximum
48.0 to 48.0 , 12 dBm, changed due to regulatory maximum
54.0 to 54.0 , 11 dBm, changed due to regulatory maximum
6.0-b to 18.0-b, 14 dBm, changed due to regulatory maximum
24.0-b to 36.0-b, 13 dBm, changed due to regulatory maximum
48.0-b to 48.0-b, 12 dBm, changed due to regulatory maximum
54.0-b to m6. , 11 dBm, changed due to regulatory maximum
m7. to m7. , 10 dBm, changed due to regulatory maximum
m8. to m14. , 11 dBm, changed due to regulatory maximum
m15. to m15. , 10 dBm, changed due to regulatory maximum
m0.-4 to m3.-4 , 14 dBm, changed due to regulatory maximum
m4.-4 to m4.-4 , 13 dBm, changed due to regulatory maximum
m5.-4 to m5.-4 , 12 dBm, changed due to regulatory maximum
m6.-4 to m6.-4 , 11 dBm, changed due to regulatory maximum
m7.-4 to m7.-4 , 10 dBm, changed due to regulatory maximum
m8.-4 to m11.-4, 14 dBm, changed due to regulatory maximum
m12.-4 to m12.-4, 13 dBm, changed due to regulatory maximum
m13.-4 to m13.-4, 12 dBm, changed due to regulatory maximum
m14.-4 to m14.-4, 11 dBm, changed due to regulatory maximum
m15.-4 to m15.-4, 10 dBm, changed due to regulatory maximum
Re: ASK THE EXPERT - 802.11n RATIFICATION
Yes, these power levels are real (don't be amazed); it's pretty much the same across the board with our competitors as well. What you are seeing here is not an FCC-regulated limitation but rather one of PoE. When we design products such as the 1140, we design to a power budget of approx 12.5 watts (yes, 802.3af is 15.4 watts, but the device is designed for less because there is loss in the Ethernet cable, etc.). As the data rates go lower, the transmitter power goes up, since the transmitter EVM limit is relaxed.
EVM is the linearity, or distortion, metric: the higher the data rate, the less distortion is tolerated. Similarly, receiver sensitivity gets better as the data rates go down, since the receiver can decode correctly through more distortion.
If you have a need for higher transmitter power, take a look at the AP-1250 product which can accept a higher PoE rating (beyond that of 802.3af) using our power injector.
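Fred's ~12.5 W design figure lines up with the 802.3af worst-case arithmetic. A rough sketch, using the standard's own worst-case numbers (350 mA max current, ~20 ohm loop resistance over 100 m) rather than anything 1140-specific:

```python
# 802.3af worst-case power budget, computed from values in the standard.
PSE_POWER_W = 15.4          # minimum guaranteed at the switch port (PSE)
MAX_CURRENT_A = 0.350       # 802.3af maximum continuous current
LOOP_RESISTANCE_OHM = 20.0  # worst-case channel loop resistance, 100 m

cable_loss_w = MAX_CURRENT_A ** 2 * LOOP_RESISTANCE_OHM  # I^2 * R heating loss
power_at_device_w = PSE_POWER_W - cable_loss_w
print(f"{cable_loss_w:.2f} W lost in cable, {power_at_device_w:.2f} W at the AP")
# -> 2.45 W lost in cable, 12.95 W at the AP
```

That 12.95 W is the standard's guaranteed minimum at the powered device, and designing a bit below it (12.5 W) leaves margin.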
We definitely need to add PoE requirements in there as a consideration/complication.
I was just speaking to our compliance person here, and not only are there throughput issues (using multiple Ethernet lines), but as he points out there are also safety issues. There are NEC codes covering all this, but then there are state and municipal codes that are more stringent, which complicate it even more.
It sounds like mfgs are sometimes setting their radios to the lowest common denominator, when in actuality it would be nice if they could leave it up to the installer/administrator to set, based on local regulations -
[i]until the first time something burns down and the device mfg. gets sued because of some overzealous salesman, customer, or reseller.[/i]
What a pain - you could have two identical buildings in different counties with the same equipment, in the same environment(s), but with different electrical/safety rules, and one works fine and the other doesn't.
Personally, I don't have to worry about POE requirements, because all of our products are either battery [b]or[/b] mains operated - nothing is POE.
But I still have to worry about emissions, so I'm still looking into all of this.
EVM stands for "Error Vector Magnitude". In systems that use phase and/or amplitude modulation (PSK, QAM, etc.), there is an ideal (or reference) value that should be present for a particular symbol. Due to various factors (phase noise on oscillators, etc.), the final, actual value may differ from the ideal at any point in time. Sophisticated test instrumentation can measure this "error value". You can think of it (very, very roughly) as a form of "jitter" on the signal.
The effects of increasing power on digital signals can become quite complex, with issues such as "non-linearities" coming into play.
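To make the definition concrete: RMS EVM is just the power of the error vectors relative to the power of the reference constellation. A toy sketch with made-up QPSK samples (the received values below are invented for illustration):

```python
import math

def evm_percent(reference, measured):
    """RMS EVM: error-vector power relative to reference power, in percent."""
    err_power = sum(abs(m - r) ** 2 for r, m in zip(reference, measured))
    ref_power = sum(abs(r) ** 2 for r in reference)
    return 100 * math.sqrt(err_power / ref_power)

# Ideal QPSK constellation points vs. slightly noisy "measured" symbols.
ideal = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
rx = [1.02 + 0.97j, 0.95 - 1.04j, -1.03 + 1.01j, -0.98 - 0.99j]
print(f"{evm_percent(ideal, rx):.2f} %")  # a few percent for these samples
```

The higher the modulation order (e.g. 64-QAM for MCS7), the tighter the EVM the spec allows, which is why the transmitter has to back off power at the top rates.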
Here is a link to a PDF with the power limits for Cisco's new 3500e and 1260 APs:
Very useful chart. We can see that across all the domains the Tx max does not go above 20 dBm, and we can see the total-power values given by adding 3 dB when the two transmitters run at equal power.
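The "add 3 dB for two equal transmitters" rule is just power addition done in milliwatts rather than dB. A quick sketch (17 dBm per chain is only an example value):

```python
import math

def combined_dbm(*chains_dbm):
    """Total conducted power from multiple Tx chains: sum in mW, back to dBm."""
    total_mw = sum(10 ** (p / 10) for p in chains_dbm)
    return 10 * math.log10(total_mw)

print(round(combined_dbm(17, 17), 2))  # two 17 dBm chains -> ~20.01 dBm total
```

For unequal chains the same function works; the 3 dB shortcut only holds when the chains are equal.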
Take a look at some of the entries at the band edge for reg domain A. Some are less than the values given in the rest of the column (e.g. channels 1 and 11). This may be due to strict FCC regs regarding "spillover" from spectral regrowth at high powers. That may be why they have dropped those values compared with the max.
Wow, what complication.
Absolutely, that's why they're lowered. Isn't power output under software control a wonderful thing?! Otherwise they might have had to do something dumb, like cut down every channel's power, not just 1 and 11.
Let's remember, though, that these are Cisco's numbers for [b]these[/b] products - not necessarily the exact maximum in any part of the spectrum. It all depends on the specific device and [b]its[/b] behavior. Different hardware configurations (case, antenna leads, power source, ground, etc.) may allow a mfg. to trim power outputs differently. And of course, don't forget PoE considerations.
Tables 20-21 and 20-22 in Clause 20 of 802.11 (2009) (i.e., the 802.11n spec) have different allowances for constellation error and sensitivity, which are pretty revealing too.
My WLAN tester shows the error figures are pretty easily met, and the actual sensitivity values from the different chip mfgs are getting better every year.