
Wireless Overkill Can Cause Poor Performance (by Chris Greer)



When wireless networks were first implemented, one of the primary goals was to provide adequate coverage of the environment. Users wanted the ability to browse the web, check email, and access applications without the tether of a wired connection. These apps, along with most other tasks an average user performs, don’t generate much bandwidth on the network. Unless files are being backed up or downloaded, the average user will only generate around 300 Kbps when driving basic applications. Since bandwidth demand was low when wireless networks emerged, they did not need to provide maximum performance in order to keep the average user happy.

Things have changed.

Today, mobile, high-bandwidth applications such as voice and video have emerged on the network. These apps have driven the need to improve wireless efficiency and performance. Don’t be fooled by the sticker on the AP that boasts speeds of up to 54 Mbps. At best, because of the CSMA/CA access method used by wireless, along with all the protocol overhead, we can expect only about half of the advertised speed in actual throughput. If there is significant interference, or if the wireless environment is not optimized for maximum throughput, bandwidth-greedy applications will perform poorly or, sometimes, not work at all.
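To put that rule of thumb into numbers, here is a minimal sketch (function names and the 50% efficiency figure are illustrative assumptions, not a formal model) that converts an AP's advertised rate into a realistic throughput budget and checks an application's requirement against it:

```python
# Rough throughput sanity check. Assumption: CSMA/CA plus protocol
# overhead leave roughly half of the advertised data rate as usable
# throughput, per the rule of thumb discussed above.

def realistic_throughput_mbps(advertised_mbps, efficiency=0.5):
    """Estimate best-case usable throughput from an AP's advertised rate."""
    return advertised_mbps * efficiency

def can_support(app_requirement_mbps, advertised_mbps):
    """Check whether an application's bandwidth need fits the realistic budget."""
    return app_requirement_mbps <= realistic_throughput_mbps(advertised_mbps)

print(realistic_throughput_mbps(54))  # 27.0 -- best case from a "54 Mbps" AP
print(can_support(3, 54))             # True -- a 3 Mbps video app fits
```

Remember this is a best-case ceiling; interference and co-channel contention, as described below, push real-world numbers lower still.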

Recently, in one environment, a wireless video application required a minimum of 3 Mbps of throughput. Wireless coverage was not a problem, as there were several access points adequately covering the floor. The access points were configured for channels 1, 6, and 11 in the 802.11b/g band, but from any one spot, a client could detect 10 or more access points.

There was so much RF in the area that when a station connected to the network, several access points were providing connectivity on the same channel for a given SSID. When testing throughput in these areas, we were not able to achieve much more than 1 Mbps, even though there were few clients around. Moving to a location with fewer access points on a given channel, we saw throughput climb to 7 Mbps or better.

This environment is an example of wireless overkill. Too many access points are deployed, providing good coverage but poor performance. It would be better to have fewer access points, placed in more strategic locations for good coverage, without multiple APs accessible on the same channel from one spot.
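One way to spot this kind of overkill in site-survey data is to count how many APs are audible on each channel from a single test location. A minimal sketch (the scan data format, SSID, and signal values are hypothetical):

```python
from collections import Counter

# Hypothetical survey sample from one spot on the floor:
# (SSID, channel, signal in dBm) tuples from a wireless scanner.
scan = [
    ("Corp", 1, -45), ("Corp", 1, -52), ("Corp", 1, -60),
    ("Corp", 6, -48), ("Corp", 6, -55),
    ("Corp", 11, -50),
]

# Count audible APs per channel; more than one on the same channel
# from the same spot means they must share airtime via CSMA/CA.
per_channel = Counter(channel for _, channel, _ in scan)
for channel, count in sorted(per_channel.items()):
    flag = "  <-- co-channel contention risk" if count > 1 else ""
    print(f"channel {channel}: {count} AP(s){flag}")
```

In this sample, channel 1 has three audible APs from one spot, which is exactly the overkill pattern described above.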

To illustrate this, imagine several radio stations in a city broadcasting simultaneously on 94.5 FM. The listener would hear several songs at once on one frequency, making clear listening difficult. This is similar to what a wireless client experiences when there are simply too many APs providing same-channel coverage.

So when deploying new applications on the wireless network, make sure to perform a good site survey before installing new APs, and run throughput tests from several spots to verify what clients will actually experience. In your building, less may be more when it comes to APs.
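For the throughput tests themselves, whatever tool you use, the arithmetic boils down to bits moved over time. A small helper (the sample figures are illustrative, not from the site described above) for converting a timed transfer into Mbps and comparing it to an application's requirement:

```python
# Convert a timed transfer into throughput in Mbps.
def throughput_mbps(bytes_transferred, seconds):
    return (bytes_transferred * 8) / (seconds * 1_000_000)

# Example: a 10-second test that moved 12.5 MB from a wired server
# to the wireless client at one survey spot.
rate = throughput_mbps(12_500_000, 10)
print(f"{rate:.1f} Mbps")  # 10.0 Mbps

# Compare against the 3 Mbps minimum the video application needed.
print(rate >= 3)  # True -- this spot meets the requirement
```

Run the same test from several spots on the floor; a spot that measures well below the application's minimum is a candidate for rethinking AP placement rather than adding another AP.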


Continue reading other LoveMyTool posts by Chris Greer »


Author Profile - Chris Greer is a Network Analyst for Packet Pioneer. Chris has many years of experience in analyzing and troubleshooting networks. He regularly assists companies in tracking down the source of network and application performance problems using a variety of protocol analysis and monitoring tools including Wireshark. When he isn’t hunting down problems at the packet level, he can be found teaching various analysis workshops at Interop and other industry trade shows. Chris also delivers training and develops technical content for several analysis vendors.

Chris can be contacted at chris (at) packetpioneer (dot) com.