Hello to all,
We have a scenario where we are using Cisco 1242 APs with dual 802.11b/g and 802.11a radios. We have the same WLAN/SSID configured for each radio. We are a campus environment, so we have a wide variety of clients connecting. When monitoring connections, I notice a large percentage of clients connecting to the 802.11a radios rather than the 802.11b/g radios. Does anyone have info on how clients determine which radio/frequency they prefer to connect to? If clients have their supplicants configured with a/b/g all enabled, is it simply a preferred list, or is there more intelligence behind it (e.g. congestion, noise, client load)?
We are using this AP configuration to add capacity; however, if a large percentage of clients are jumping on the 802.11a radio, then it somewhat defeats the purpose...
I'd appreciate any input.
Pretty funny that you would ask right after Devin posted an article called "Stations are Stupid."
His sentiment is correct though. The choice varies greatly depending on the client device, its driver, and its client software. I have seen some clients that actually have configurable settings for preferring one band over another, but not very often.
I wish I had more help, but this issue is exactly why many vendors are implementing features like band steering to help us utilize each band like we want to.
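For what it's worth, the usual band-steering trick is conceptually simple: if the AP has ever seen a client probe on 5 GHz, it knows the client is dual-band capable, so it withholds a few 2.4 GHz probe responses to nudge the client toward 5 GHz. Here's a rough sketch of that logic; the class, method names, and threshold are made up for illustration, not any vendor's actual API:

```python
# Illustrative sketch of common band-steering behavior (hypothetical names).
class BandSteeringAP:
    def __init__(self, suppress_limit=3):
        self.seen_on_5ghz = set()   # MACs that have probed on 5 GHz
        self.suppressed = {}        # MAC -> count of withheld 2.4 GHz responses
        self.suppress_limit = suppress_limit  # fall back after N tries

    def on_probe_request(self, client_mac, band):
        if band == "5GHz":
            self.seen_on_5ghz.add(client_mac)
            return "respond"        # always answer on the preferred band
        # 2.4 GHz probe: steer known dual-band clients, up to the limit
        if client_mac in self.seen_on_5ghz:
            tries = self.suppressed.get(client_mac, 0)
            if tries < self.suppress_limit:
                self.suppressed[client_mac] = tries + 1
                return "suppress"
        return "respond"            # single-band or persistent client

ap = BandSteeringAP()
ap.on_probe_request("aa:bb:cc:dd:ee:ff", "5GHz")            # client shows 5 GHz capability
print(ap.on_probe_request("aa:bb:cc:dd:ee:ff", "2.4GHz"))   # suppress
```

The fallback counter matters: some clients will stubbornly retry 2.4 GHz, and without a limit you'd effectively deny them service.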
I have noted the same thing. This may or may not make a difference in the decision your clients make, but I think it can help in certain instances.
With clients, especially Intel-based ones, take a look at the drivers and update them.
Intel has a site that helps this take place automatically if you so desire:
Go with the AP vendor's recommendation for drivers, not necessarily what's newest (unless you are down for debugging/traces), if you are seeing issues or increased issues. I think the clients would most likely want to use the 802.11a band based on the likelihood that it's a less congested frequency (fewer devices interfere on the three UNII bands) and the increased channel options (non co-channel), which lends itself to more bandwidth/usability. These are theories regarding why a client would choose a over b(g) (pun intended).
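The "more channel options" point is easy to quantify. In the 2.4 GHz band, channels are numbered 5 MHz apart but each transmission is roughly 22 MHz wide, so co-located APs need about 5 channel numbers of separation; in 5 GHz, the UNII channels sit on a 20 MHz grid and don't overlap at all. A quick back-of-the-envelope check (US channel plan assumed):

```python
# 2.4 GHz: channels 1-11 (US), 5 MHz apart, ~22 MHz wide,
# so only every 5th channel is non-overlapping.
channels_24 = list(range(1, 12))
non_overlapping_24 = channels_24[::5]
print(non_overlapping_24)              # [1, 6, 11] -> just 3 usable channels

# 5 GHz: UNII channels are on a 20 MHz grid, all non-overlapping.
unii_1 = [36, 40, 44, 48]
unii_2 = [52, 56, 60, 64]              # DFS required
unii_3 = [149, 153, 157, 161]
print(len(unii_1 + unii_2 + unii_3))   # 12 non-overlapping channels
```

Three usable channels versus a dozen goes a long way toward explaining why a campus-dense deployment feels cleaner on 802.11a.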
Unfortunately, I must agree with Matt. Go with what the Application supplier recommends.
Using the latest driver for generic things like web browsing may get you the BEST performance, but given the vagaries of things like roaming, changing the driver may put you in a state of NO performance.
We've all seen the problems that can occur when we try to re-install an older driver.
Now to be fair, the new wireless driver is usually (probably/hopefully) better than its older cousin, and it's the application software that's behind the times.
Always try at least a small pilot run with new hardware or software before you make system wide changes.
I happened to notice one day in an experiment that almost all of the clients I was testing were connecting to the b/g band. You are probably aware that the 2.4 GHz band propagates better. Well, I had the 2.4 GHz radio at 18 dBm and the 5 GHz radio at 12 dBm, and the clients preferred 2.4 over 5 even though 5 GHz had a much lower noise floor. These were Dell laptops with Intel 5100 and 5300 series adapters. I like what Keith recommends on this: put the 5 GHz and 2.4 GHz radios on different SSIDs. I've found that putting the two radios on the same SSID provides broader compatibility but nothing in the way of load balancing, unless your vendor is cooking with some secret sauce.
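The numbers above stack up quickly against 5 GHz. Free-space path loss alone costs 5 GHz about 6.5 dB versus 2.4 GHz at equal distance, and the 18 dBm vs. 12 dBm TX-power gap adds another 6 dB. A quick worked example (the 20 m distance is just an assumed indoor cell size, and real indoor loss will differ from free space):

```python
import math

# Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
def fspl_db(d_km, f_mhz):
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

d = 0.020  # 20 m, an assumed indoor cell radius
rssi_24 = 18 - fspl_db(d, 2437)  # 2.4 GHz channel 6, TX at 18 dBm
rssi_5  = 12 - fspl_db(d, 5180)  # 5 GHz channel 36, TX at 12 dBm

# 2.4 GHz arrives roughly 12.5 dB stronger: ~6.5 dB from path loss,
# 6 dB from the TX-power difference.
print(round(rssi_24 - rssi_5, 1))
```

So even though the client *could* see a cleaner noise floor on 5 GHz, a simple strongest-signal selection (which is what most drivers do) will pick 2.4 GHz every time with that power configuration. Balancing the effective cell sizes, or splitting the SSIDs as Keith suggests, avoids fighting that behavior.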
You could try to control which band the users in your network connect to centrally, by setting up a server.