Last Post: March 10, 2011
If I have an AP and one wireless station, both using 300 Mbps 802.11n products, and assuming no interference and optimal conditions, what is the maximum real throughput?
Is it better to go wireless in this case or wired at 100 Mbps?
Real throughput will vary, but the real-world maximum is usually no more than 150 Mbps for a single client. Perfect lab scenarios can get near 200 Mbps, but you'll never see that in the wild.

When you're comparing wireless with wired, the big issue is use case. Every situation is unique, so you can't really say one is better or worse without knowing more. If you're talking about pure one-way throughput, you may do better with wireless than with Fast Ethernet. However, Fast Ethernet is full duplex, so if you have application data flowing in both directions, your total wired throughput may come out ahead. Don't forget that every client you add to your wireless network roughly halves the per-client throughput, since all clients share the same airtime, but you don't have to run any more cables or provide any more switch ports. :)
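To make that rule of thumb concrete, here's a quick back-of-the-envelope sketch. The 0.5 efficiency factor and the even airtime split are simplifying assumptions for illustration, not measured values:

```python
# Rough sketch of the rule of thumb above: real-world throughput is
# roughly half the advertised PHY rate (assumed 0.5 efficiency), and
# airtime -- hence per-client throughput -- is shared among clients.

def per_client_throughput_mbps(phy_rate_mbps, num_clients, efficiency=0.5):
    """Estimate per-client throughput on a shared wireless medium."""
    if num_clients < 1:
        raise ValueError("need at least one client")
    return phy_rate_mbps * efficiency / num_clients

print(per_client_throughput_mbps(300, 1))  # 150.0 Mbps for a single client
print(per_client_throughput_mbps(300, 2))  # 75.0 Mbps each with two clients
```

Real networks won't split airtime this evenly (slow clients drag everyone down), but it shows why adding clients cuts per-client throughput so quickly.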
And to achieve the "aggregate" 300 Mbps data rate, you would be using 40 MHz channel width in the 5 GHz spectrum? We are currently using 20 MHz channels and are thinking we need to move to 40 MHz. Are there any other factors to consider when determining which channel width to use in an enterprise environment?
thanks in advance,
Yes, 300 Mbps requires 40 MHz channels, which are not recommended in 2.4 GHz. Assuming you're planning a network for 5 GHz and asking about 20 vs. 40 MHz channels, the big questions to answer are which channels are available in your location, your regulatory domain, and your products. Also ask what percentage of your clients are 802.11a vs. 802.11n, since legacy 802.11a clients can't use the bonded channel but still consume airtime. If your client devices and applications can take advantage of the extra throughput and moving to 40 MHz channels doesn't constrain your channel reuse plan, there's no significant reason not to move to 40 MHz channels.
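The channel-reuse trade-off is easy to see with a little arithmetic. The sketch below assumes the US (FCC) non-DFS 5 GHz channel list; your regulatory domain and hardware may expose a different set:

```python
# Illustrative only: doubling the channel width shrinks the reuse plan.
# Channel list below assumes US (FCC) non-DFS 5 GHz channels (UNII-1/UNII-3);
# other regulatory domains differ.

NON_DFS_20MHZ = [36, 40, 44, 48, 149, 153, 157, 161, 165]

def bonded_40mhz_pairs(channels):
    """Greedily pair adjacent 20 MHz channels into 40 MHz bonded channels."""
    pairs = []
    chans = set(channels)
    for c in sorted(chans):
        already_used = any(c in p or c + 4 in p for p in pairs)
        if c + 4 in chans and not already_used:
            pairs.append((c, c + 4))
    return pairs

print(len(NON_DFS_20MHZ))                # 9 usable 20 MHz channels
print(bonded_40mhz_pairs(NON_DFS_20MHZ))  # only 4 bonded 40 MHz channels
```

Nine channels at 20 MHz versus four at 40 MHz (without touching DFS channels) is exactly the kind of reuse-plan constraint to check before switching widths.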
Well, we currently have a mixed 2.4/5 GHz environment, with the 5 GHz side using 20 MHz channels. I see your point and see no reason why we should stick with 20 MHz channels. Thanks for the info, much appreciated.
Is this a correct conclusion: in most normal environments, 40 MHz in 2.4 GHz will not be allowed by the AP due to other 2.4 GHz networks in the vicinity, so 300 Mbps also will not happen?
What is the max throughput one can get in 2.4 GHz with 802.11n products, then?
I think it boils down to channel use in 2.4 GHz. If you're using 20 MHz channel width, you're working with the non-overlapping channels 1, 6, and 11. If you use 40 MHz channel width, you only really have 3 and 9 to work with. I believe 144 Mbps would be the max data rate in 2.4 GHz at 20 MHz. Someone please correct me if I'm wrong :-)
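For the curious, both the 300 Mbps and ~144 Mbps numbers fall out of the 802.11n PHY arithmetic for the top MCS (64-QAM, rate 5/6, two spatial streams, short guard interval). This is a sketch of that calculation, not a full rate table:

```python
# Sketch of the 802.11n PHY data-rate arithmetic for the top 2-stream MCS:
# rate = data_subcarriers * bits_per_subcarrier * coding_rate * streams
#        / OFDM symbol duration
# 20 MHz HT channels carry 52 data subcarriers, 40 MHz carry 108;
# the symbol is 3.2 us plus a 0.4 us (short) or 0.8 us (long) guard interval.

def ht_phy_rate_mbps(width_mhz, streams, bits_per_subcarrier=6,
                     coding_rate=5 / 6, short_gi=True):
    """802.11n PHY rate in Mbps at 64-QAM rate-5/6 (the top MCS)."""
    data_subcarriers = {20: 52, 40: 108}[width_mhz]
    symbol_us = 3.2 + (0.4 if short_gi else 0.8)
    return data_subcarriers * bits_per_subcarrier * coding_rate * streams / symbol_us

print(round(ht_phy_rate_mbps(40, 2), 1))  # 300.0 -> the "300 Mbps" marketing number
print(round(ht_phy_rate_mbps(20, 2), 1))  # 144.4 -> max 2-stream rate at 20 MHz
```

So 144.4 Mbps is the max PHY data rate at 20 MHz with two streams; as noted earlier in the thread, actual throughput will be well below that.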