
Why your Wi-Fi is slow

SOLVED
Platinum Contributor II
Posts: 7,261
Registered: ‎12-15-2010
Message 1 of 34
(159,404 Views)

Heya Verizon, I'm posting this in the Verizon FiOS Internet forums since this seems to be the place that attracts the most posts about poor wireless performance or range. In this thread, which I'll write in parts, I plan to describe the many different issues that need to be taken into consideration when dealing with wireless networking.

 

To start, let's go over the two service bands and the technology available for use today. Wi-Fi runs in the 2.4GHz and 5GHz bands. The 2.4GHz band is the most commonly used and has exceptional range-to-performance characteristics, which makes it the band of choice. The 5GHz band has more spectrum allocated to it, offers exceptional data speeds, and also allows for high-density wireless networking thanks to its significantly shorter range compared to its big brother.

 

For wireless technologies, to date we have Wireless A (5GHz, old), B (2.4GHz, very old), G (2.4GHz, old), N (2.4GHz and 5GHz, new), and Wireless AC (5GHz, draft). Ordered roughly by age they run B and A, then G, N, and AC, and each follows a set of specifications based on the previous generation's standard with some additions. With this comes backwards compatibility: Wireless N is compatible with Wireless G devices, which are compatible with Wireless B, and Wireless AC is compatible with Wireless N, which is compatible with Wireless A. With backwards compatibility come some downfalls.
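The compatibility chain above can be sketched as a small lookup. This is just an illustrative model of the bands and fallback relationships described in this post; the names `COMPAT` and `can_serve` are my own invention, not anything from the 802.11 standard.

```python
# Hypothetical sketch: which legacy clients each Wi-Fi generation can serve,
# following the band/fallback chain described above.

COMPAT = {
    "B":  {"bands": {"2.4GHz"}, "falls_back_to": []},
    "G":  {"bands": {"2.4GHz"}, "falls_back_to": ["B"]},
    "A":  {"bands": {"5GHz"},   "falls_back_to": []},
    "N":  {"bands": {"2.4GHz", "5GHz"}, "falls_back_to": ["G", "B", "A"]},
    "AC": {"bands": {"5GHz"},   "falls_back_to": ["N", "A"]},
}

def can_serve(ap_standard, client_standard):
    """True if an AP of ap_standard can (via fallback) serve the client."""
    if client_standard == ap_standard:
        return True
    return client_standard in COMPAT[ap_standard]["falls_back_to"]

print(can_serve("N", "G"))   # True: N keeps G devices working
print(can_serve("AC", "B"))  # False: AC is 5GHz-only, B is 2.4GHz-only
```

Note how AC never reaches back to B or G: they live on a band AC doesn't use, which is exactly the downfall/benefit trade the post goes on to describe.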

 

Let's start by discussing the key factor in wireless reception and performance: the quality of your signal. Signal quality is affected by many factors, such as nearby signals, the quality and grade of the antennas being used, the quality of the radios, the configuration of the gear, and, above all, how well wireless signals can propagate. In the case of Wi-Fi, all of this matters a great deal due to the high frequencies involved and the low transmit power used to comply with FCC regulations and safety standards.

33 REPLIES
Message 2 of 34
(159,387 Views)

The quality of your signal is measured by multiple factors. The first is the available spectrum and the signal-to-noise ratio in the service band you are using. Today, the 2.4GHz band is heavily used by many appliances, including microwaves, baby monitors, and cordless phones. The band is also occupied by Wi-Fi devices growing in number at an increasingly fast pace, by Bluetooth devices such as wireless mice, keyboards, and game controllers, and by security cameras.

 

5GHz is a lesser-used band with only a few applications. Military radar, for example, can be seen in this band, but an abundance of wireless networks and devices is starting to appear here as well. It's reasonably clean, partly because 5GHz signals don't propagate as far, and partly because common, everyday appliances don't use this band.

 

With the rise of high-speed Internet service and increasing wireless dependence, the 2.4GHz band is certainly getting crowded. While not a perfect example, here's an idea of what my Wi-Fi network has to put up with:

 

[Screenshot: wifi2.PNG — InSSIDer scan showing nearby 2.4GHz networks]

 

This is a screenshot from a program called InSSIDer, available for Mac and PC, which uses your computer's Wi-Fi adapter to pick up nearby wireless networks. I took a quick walk around my home with a laptop equipped with a high-quality dual-band Intel WLAN adapter (meaning it can use both 2.4GHz and 5GHz networks). Notice that the networks on this list are spread across the 2.4GHz band, which has channels 1-11 available for use in the US (in Japan, channel 14 is also available); that amounts to only three non-overlapping channels, assuming everyone is being a good neighbor. At the top of the list you see my wireless network, broadcast from a common ActionTec MI424WR Rev. D router seen on the FiOS service.

 

On a deeper look, notice how many networks are using the 2.4GHz band and have to share 11 possible channels, or roughly "60MHz" of bandwidth in total. As you may have seen around the Internet, a situation like this is often found in apartments. In my case I'm in a typical suburban neighborhood rather than an apartment, and the 2.4GHz band is still a mess. Besides home appliances, cordless phones, and tools, along with Bluetooth devices, each of these networks has to cope with every other network trying to share the same spectrum. In addition, each network has to distribute what it can dole out across all the devices connected to it (more interferers for everyone!).
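The "good neighbor" rule behind channels 1/6/11 is simple enough to sketch: with ~20MHz-wide channels whose centers sit 5MHz apart, two channels interfere unless their numbers differ by at least 5. The scan list below is a made-up stand-in for the InSSIDer screenshot, not my actual neighbors.

```python
# Rough overlap rule for 2.4GHz Wi-Fi: channel centers are 5MHz apart and
# channels are ~20MHz wide, so channels closer than 5 apart bleed into
# each other.

def overlaps(ch_a, ch_b):
    return abs(ch_a - ch_b) < 5

# Hypothetical scan results, in the spirit of the screenshot above:
neighbors = [1, 1, 3, 6, 6, 6, 9, 11]
my_channel = 1

interferers = [ch for ch in neighbors if overlaps(my_channel, ch)]
print(len(interferers))  # prints 3: both co-channel networks plus channel 3
```

Note that the network on channel 3 is the worst kind of neighbor: it can't be cleanly decoded by anyone on 1 or 6, so it registers as raw noise on both.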

 

During the weekends my Wi-Fi service starts to get poor as everyone is home running appliances and devices that operate on the 2.4GHz band. While I normally obtain 28Mbps during the daytime on a Wireless G link in a single direction, toward the night hours speeds drop to as low as 500kbps in a single direction, and packet loss and latency become a huge issue. This is simply due to interference and some protections built into the 802.11 standard, which I will talk about later. As you'll also notice, it doesn't matter what channel I pick. There are a ton of networks on channel 6, which I've tried and found extremely poor even during the day, and channel 11 has its own fair share of interference. So I'm stuck with channel 1 and whatever I can muster out of it.

 

To add to the equation, some networks, such as the ones running at 300Mbps or 450Mbps, use what is known as "wide channel" mode. This was first introduced as a proprietary extension in the Wireless G days, then made part of the Wireless N standard. These wide channels can span 40MHz of the roughly 60MHz of spectrum available in the 2.4GHz band, and while they do deliver more performance, they also cause severe performance degradation for everyone else. While walking around my home, I also saw some of these networks using a double-wide channel, even though the InSSIDer screenshot doesn't show it in action.

 

In my case, what is the solution to poor wireless performance? If wireless were my only option, I would ultimately have to settle on the 5GHz spectrum. The laptop, while roaming the house, did not pick up anything on the 5GHz band, and a spectrum analyzer shows almost nothing there either. Smooth sailing for me once the switch is made. Should I be unable to use 5GHz, my only other option short of hoping for a power outage would be to further adjust my router, which is loaded with DD-WRT and includes some VERY powerful wireless settings for coping with noise and high-density situations. Even then this isn't ideal, as it only helps on the router's end, not on the other end, which could be anything from a phone to a laptop. I could also try pairing up different radios and antennas. Antennas of lower gain pick up less noise, but also give less range. Some radios, by design, have better tolerance for noisy situations and can cope with conditions like these.

 

Now in other cases, things may be far more forgiving. You may be able to simply change your router's channel and get your performance back, even if another network is using it. If this works, then great!

Message 3 of 34
(159,373 Views)

In many cases, as mentioned above, poor Wi-Fi performance is caused by interference, but here's where a few more factors come into play. After the Wireless B days, the 802.11 standard gained some built-in features meant to cut down on collisions. Now, you may wonder, "How do collisions happen on wireless? I thought that was only relevant to Ethernet hubs!" In fact, wireless behaves very much like a hub, and we'll dive into why from here.

 

By design, most wireless technologies are similar to a two-way radio system. Take two radios, give them to two individuals, and tell them to communicate. In order to transmit a signal effectively, there must be silence: one person presses the "talk" button while the other person, on the same channel, listens for traffic. Let's call one person Access Point and the other Laptop. Access Point says "Hello!" and Laptop, who is sitting there awaiting a message, receives "Hello."

Now say Access Point and Laptop both want to speak at once. They both say "Hello!" but neither receives the message. Why? Similar to a hub, only one device can use the spectrum at a time in many cases. If two or more devices try to use it simultaneously, a "collision" occurs and data is lost. Let's add a third person and call them Xbox. Access Point is waiting patiently for a message, and Xbox and Laptop, each with a two-way radio capable of talking to Access Point, speak at the same time on the same channel. This time Access Point receives a message, but can't understand it, since both signals came in at once. Access Point tries to piece apart what each said, but after taking too long to figure it out, "drops" the memory of what arrived and returns to listening. Once again, a collision.

Now let's say these two-way radios can use three unique channels that do not interfere with each other, and let's add 20 parties of individuals who each have to pick a channel. Clearly there are not enough channels to go around, and as a result a high number of collisions occur. What you essentially have at this point is noise coming in for everyone listening, which slows the listeners down as they try to determine what is meant for them.
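A toy simulation makes the "too many talkers, not enough channels" effect concrete. This is a deliberately simplified slotted model of my own invention, not real 802.11 CSMA/CA: each station blindly picks a random time slot, and any slot with two or more talkers is a collision that destroys every frame in it.

```python
import random

# Toy slotted-transmission model (an assumption, not real 802.11 behavior):
# each station picks one of `slots` time slots at random; a slot with two
# or more talkers is a collision, and everyone in that slot loses a frame.

def collision_rate(stations, slots, trials=5000, seed=1):
    rng = random.Random(seed)
    lost = 0
    for _ in range(trials):
        picks = [rng.randrange(slots) for _ in range(stations)]
        lost += sum(1 for p in picks if picks.count(p) > 1)
    return lost / (stations * trials)

print(round(collision_rate(2, 16), 3))   # few talkers: collisions are rare
print(round(collision_rate(20, 16), 3))  # crowded channel: most frames collide
```

Two stations collide only occasionally; twenty stations fighting over sixteen slots lose the majority of their frames, which is roughly the weekend-evening picture described above.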

 

This simple example illustrates how 802.11 wireless operates. Collisions cause data loss, which reduces speeds. Noise can cause further collisions if it is abundant enough, or be loud enough to drown messages out entirely. Fortunately, 802.11 Wi-Fi was also designed with reliability in mind: the standard incorporates signaling that indicates when it is safe to send and who may speak, and it leaves the job of coordinating that to a master, in this case Access Point from our example. With this, collisions are vastly reduced, and speed and delivery reliability improve. Now let's throw some noise in. While collisions are reduced, noise is not. Each party in the mix above speaks a different language, so while Access Point can control his own party, he cannot control the other parties performing the same routine in another language. As a result, Access Point has to try his best to get data sent and received between everyone, but eventually noise will cause him to fail, much like a collision.

Message 4 of 34
(159,364 Views)

Remember in my last post how I talked about duplexing, where only a single person can talk while another listens? This also affects Wi-Fi performance. Let's use Wireless G as an example, since it's the simplest standard to follow here (Wireless B is an animal for this sort of example).

 

Wireless G operates at a theoretical rate of 54Mbps under optimal conditions. Each radio negotiates a send and a receive link rate as part of maintaining a wireless connection between two points. For example, the wireless card in your laptop could have a 54Mbps send rate, and the connection from your access point to your laptop could also be 54Mbps. Why don't you see that speed in practice, though?

 

First off, wireless has a lot of overhead, which includes error checking and the protocol machinery used to drive the connection in the first place. Error checking increases the reliability of the connection; it allows damaged data to be rebuilt mathematically so that every bit is delivered despite the occasional error in what you could call the "main stream." Add in noise and collision checking, and this overhead amounts to about half of your full rate. So 54Mbps / 2 is about 27Mbps at best, though Wireless G has been seen running at 31Mbps in cleaner environments. This assumes you have a quality signal, not one ruined by propagation, noise, low-quality equipment, software settings, or distance.
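The arithmetic above fits in a one-liner. The 50% overhead figure is the rule of thumb from this post, not a number from the standard; the function name is my own.

```python
# Back-of-the-envelope throughput from the figures above: protocol overhead
# (error checking, ACKs, contention) eats roughly half of the link rate.

def usable_throughput(link_rate_mbps, overhead=0.5):
    return link_rate_mbps * (1 - overhead)

print(usable_throughput(54))  # prints 27.0 -- the typical Wireless G best case
```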

 

Now, why is it that if a radio has two different link speeds, one for send and one for receive, the speed still drops when using both directions at once? Here's where duplexing comes into play. Unless you are using a very fancy and extremely proprietary 802.11 setup that lets both ends send and receive at the same time, each point is either a sender or a receiver at any moment. When you download and upload at full speed simultaneously, your speed is essentially cut in half. Going back to the example: to do both at once, a station has to listen, synchronize, send, synchronize, and repeat until the task is done, at which point it can "sleep" and return to strictly sending or strictly listening. Granted, it doesn't work quite this simply, but that's the concept. It's similar to what happens with half-duplex Ethernet.
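Continuing the sketch from the last post, the half-duplex penalty is just airtime divided among active directions. This is the idealized version of the rule, under the assumption of a perfectly fair split.

```python
# Half duplex: when a station both sends and receives at full tilt, airtime
# is split between the two directions (idealized, perfectly fair split).

def duplex_split(usable_mbps, sending=True, receiving=True):
    directions = int(sending) + int(receiving)
    return usable_mbps / directions if directions else 0.0

print(duplex_split(27))                   # prints 13.5 -- both directions at once
print(duplex_split(27, receiving=False))  # prints 27.0 -- one direction only
```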

 

Granted, there are also methods that can work around duplexing overhead, such as the "No Ack" mode in the 802.11 standard (and similar ideas in higher-level protocols such as TCP/IP), but keep in mind that ACKs are a handy part of reliability. They signal whether data was received correctly or was damaged or never arrived. If no acknowledgements (ACKs) come back from the other end, which the sender must listen for after talking for a set amount of time, then any damaged data is wasted and at some point the sender must re-send the same message. Think about talking to someone with poor hearing. You can say a whole sentence without having each word acknowledged, but all bets are off as to whether they received and understood it. If they say "What?" you have to repeat the message again and again until they understand. If instead they acknowledge each word, the message is delivered more slowly, but more reliably.

 

Wi-Fi is very similar to this. Play with this sometime!

Message 5 of 34
(159,348 Views)

Now here's another factor that needs to be keyed in: channel width. As mentioned earlier, this concept was introduced as a proprietary method in the Wireless G days and is now part of the Wireless N and AC standards. You may ask, what does channel width have to do with this? Well, earlier I mentioned that in the US, regulation allows roughly 60MHz of usable spectrum at 2.4GHz, with three non-overlapping channels: 1, 6, and 11. Channel centers are spaced 5MHz apart, so the overlapping channels 2-5 and 7-10 sit between them. By default, each Wi-Fi channel is 20MHz wide, meaning a channel centered on 6, for example, extends 10MHz to each side. Here's a picture describing this concept further:

[Image: diagram of 2.4GHz channel overlap]

An extra ~2MHz of guard space is deemed "padding," hence references to 22MHz-wide channels. Nonetheless, the working figure is 20MHz per channel.
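The channel layout can be computed directly: channel 1 is centered at 2412MHz and each channel steps 5MHz up from there. This little sketch shows why 1/6/11 just barely fit side by side, and how much of the band a 40MHz channel swallows.

```python
# Center frequency and band edges of a 2.4GHz Wi-Fi channel.
# Channel 1 is centered at 2412MHz; each channel number adds 5MHz.

def channel_edges(channel, width_mhz=20):
    center = 2407 + 5 * channel  # MHz
    return (center - width_mhz / 2, center + width_mhz / 2)

print(channel_edges(1))      # (2402.0, 2422.0)
print(channel_edges(6))      # (2427.0, 2447.0) -- clears channel 1's top edge
print(channel_edges(3, 40))  # (2402.0, 2442.0) -- a 40MHz channel covering 1..6
```

A 40MHz channel centered near channel 3 occupies everything from channel 1's bottom edge well past channel 6's center, leaving only the space around channel 11 clean.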

 

Now, without forgetting what has been said about noise and other devices, let's throw in Wireless N. As I mentioned when describing how some networks show a very high rate, those networks are using a double-wide channel, otherwise known as a 40MHz channel. Such a channel occupies roughly 40MHz of space and can stretch from where channel 1 sits all the way to where channel 6 ends, heading toward channel 11. This can give you great performance, but in high-density or high-noise locations such as mine, larger channels are not always better. With a larger channel, the effects of collisions and noise are greatly magnified by the number of devices already using the spectrum. What does this result in? Unreliability, and in another twist, reduced range compared to a 20MHz channel! In some cases, especially back in the days of Draft N, wider channels often wound up jamming nearby Wi-Fi networks along with your own, simply due to interference and a "crunch for spectrum." And even if a 40MHz channel seems like it should give you more speed, in noisier environments it can actually DECREASE your speed relative to a 20MHz channel due to reliability issues (see the last post).

 

Per the Wireless N standard, any router operating on the 2.4GHz band with a 40MHz-capable radio must fall back to 20MHz if interference is detected. This is meant to cut down on situations like mine, where Wi-Fi performance is degraded (range, too, has gone down over the years as more devices and networks are set up). Unfortunately, many folks, in an effort to get more speed, force a 40MHz channel on 2.4GHz, and this causes the issues mentioned above. Most routers certified for Wireless N should use 20MHz automatically if left on the auto setting, but this also depends on firmware.
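The coexistence rule reduces to a couple of lines of logic. This is a simplified paraphrase of the behavior described above (the real standard's scan-and-fallback procedure is far more involved), and `pick_width` is a name of my own.

```python
# Simplified sketch of the 40MHz coexistence rule described above:
# on 2.4GHz, only use a wide channel when no overlapping networks are seen;
# on 5GHz there is room to stay wide.

def pick_width(band, overlapping_networks_seen):
    if band == "5GHz":
        return 40  # plenty of spectrum; wide channels are safe here
    return 20 if overlapping_networks_seen else 40

print(pick_width("2.4GHz", True))   # prints 20 -- fall back, be a good neighbor
print(pick_width("2.4GHz", False))  # prints 40
print(pick_width("5GHz", True))     # prints 40
```

Forcing 40MHz on a crowded 2.4GHz band is, in effect, overriding the first branch of this logic, which is exactly what causes the jamming described above.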

 

40MHz channels are, however, safe to use in the 5GHz band, which lacks the interference seen at 2.4GHz. While you get less range at 5GHz, the abundance of spectrum, combined with the fact that 5GHz doesn't penetrate objects and air as well, is very handy. Most routers still abide by the auto-fallback rule at 5GHz, but in most cases it isn't needed: a very high-density setup with 40MHz channels can be accomplished with ease, and you won't be given the cone of shame for using wider channels :-)

 

The Wireless AC standard, in an effort to push 1+Gbps of data wirelessly, will start using 80MHz channels. This is a huge amount of spectrum that cannot fit in the 2.4GHz band, and, similar to Wireless N, AC currently has the same limitations in regard to spectrum.

Platinum Contributor I
Posts: 5,881
Registered: ‎07-22-2009
What a great post. Certainly sticky worthy. Kudos sir!
Platinum Contributor II

With the introduction of Wireless N, you have new ways of making better use of your spectrum, such as spatial streams and beamforming. Think of spatial streams as a flat, straight highway with some number of lanes, whether one lane or four. Each direction, send or receive, can have its own number of lanes at any given time. In the wireless world, each "lane" is an antenna that can send or receive a signal (keep in mind we're still dealing with duplexing here!). Now take the same highway and divide it into even pieces of, say, 2 miles or 4 miles (20MHz or 40MHz). The size of each equally sized piece is our channel width. Into each piece we can fit a certain number of identical cars, back to back, that cannot change lanes. These cars represent data. With more lanes you have more capacity for a given distance, but you also have higher costs.

 

This is essentially the concept of spatial streams. Wireless N uses them to get more out of the available spectrum by combining multiple sending antennas and multiple receiving antennas to increase capacity. With this, radios that support higher speeds, and therefore need additional antennas, also cost much more than your standard-issue radio, similar to how a standard-issue Big Store computer for Microsoft Word costs less than one custom-built for games and intensive applications, or why a Corvette costs more than a Civic.
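In the idealized lanes-on-a-highway picture, aggregate link rate simply scales with the number of streams. Real radios rarely achieve this perfectly, so treat this as the analogy in code form rather than a performance model.

```python
# Streams-as-lanes (idealized): aggregate link rate scales linearly with
# the number of spatial streams. Real-world scaling is less than linear.

def n_link_rate(per_stream_mbps, streams):
    return per_stream_mbps * streams

print(n_link_rate(150, 1))  # prints 150 -- a single-stream N link
print(n_link_rate(150, 2))  # prints 300 -- the familiar "300Mbps N" figure
print(n_link_rate(150, 3))  # prints 450 -- three-stream gear
```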

 

With Wireless N, this greatly increases the capacity you can deliver using established data-coding methods (Wireless G and N use similar methods; I'm leaving Wireless B out here). However, with greater speed come reliability issues. When noise or distance is introduced, data rates fall substantially, though additional streams or "lanes" help maintain them more effectively. Think of a highway that is bumpy or hilly: as the vehicles, all moving at the same speed in the same section and not slowing down whatsoever, reach a bump or a hill, they are all "disturbed" at the same time and have to slow down. In the process, some cars are damaged by the sudden stop, meaning data is lost, but traffic still keeps moving.

 

While an imperfect analogy, the point stands. Spatial streams are a huge benefit for speed, and higher-end radios achieve much better performance in many circumstances, but speeds can also drop at a substantial rate. You will notice this especially in a Wireless N environment, where a 300Mbps rate sometimes cannot be held solidly, or the speed fluctuates far more than it did under a standard such as Wireless G.

Message 8 of 34
(159,320 Views)

One factor that heavily influences your wireless signal quality is the quality of your antennas. You may relate to this if you've ever wondered why a smartphone, or especially a portable game console such as Sony's PlayStation Portable, can't get a signal where a laptop can. That comes down to the antennas and radios at play.

 

Let's start with antennas. Antennas come in all sorts of shapes, sizes, and designs and are tuned for the job they're meant for. In many modern routers, for example, external antennas are not very common; instead the antenna is a small chip or PCB trace acting as an antenna. Older routers, such as the common Linksys WRT54G or the ActionTec routers supplied by Verizon, may include internal PCB-based antennas, but they also have external antennas that in most cases the user can swap out. External antennas often outperform internally housed ones, especially for 5GHz range and performance, though this doesn't mean internal antennas are all bad. Above all, the quality of the antenna affects how well the signal is delivered or received.

 

Size is a factor in a good antenna. Smaller devices such as USB wireless dongles or smartphones often include just one or two small antennas. While these are perfectly fine antennas, they often have a harder time picking up or sending a signal and as a result are unreliable in more situations. Larger antennas tend to do much better at sending and receiving, up to a point, of course. The connection to the antenna is also quite important. Between the wireless card and the antenna you typically have either a PCB trace or a copper wire, and in some cases the antenna is soldered directly to the wireless radio. The connection ideally should be sealed so that additional noise or interference is not introduced by or to the antenna. If the antenna isn't directly soldered, the quality of the wire can also make or break your signal if the connection introduces attenuation. Gain is another factor that can make or break your signal: too much gain can give a very strong but noisy signal, while too little gain gives the exact opposite.

 

In some applications you'll find people discussing omnidirectional vs. directional antennas; each serves its own purpose. Omnidirectional antennas are "one to everywhere" antennas, whereas directional antennas focus from one point toward a specific range or area. In many cases, directional antennas can send and receive signals more reliably to or from a distant location than an omnidirectional antenna can.

 

The key with antennas is not to worry too much, since what's included usually works well in most applications. But if you're a tinkerer, or are genuinely having issues, it may not hurt to try swapping antennas.

 

On to radios: if you look around the market, you'll find plenty of radios from many manufacturers, Intel, Broadcom, Atheros, Ralink, and Realtek to name a few. Each manufacturer caters to specific users and designs its radios with specific standards in mind. While most radios can be considered nearly identical in performance, some, much as with computers (think Intel vs. AMD), work better in certain situations than others: noisier or quieter wireless environments, longer or shorter distances. While some of this ties into the quality of your antenna, the quality of the radio does determine how well your wireless signal performs.

 

In my own experience, NICs from Intel and Broadcom's higher end tend to be quite powerful and perform well. Atheros works well too once you get into their higher-end chips. Realtek and Ralink tend to be lower-end chips, but that doesn't mean they are bad! Again, stock equipment typically works well in many applications; it's a matter of pairing things up and seeing what works and what doesn't. Any change you make can and will affect how your wireless network operates. One final note for this post: the connection the radio uses to talk to the rest of the device also makes a difference in power and performance. In desktop and laptop computers, an internal PCI/PCIe wireless card will usually give much better signal quality and performance than a USB adapter. This comes down to form factor and power, among other constraints on the radio design.

Message 9 of 34
(159,317 Views)

As mentioned in my first post, much of the 802.11 standard was designed around compatibility for everyone's sake, so that devices are not rendered obsolete and useless after a few years (computer folks care about the environment, too!). With that in mind, Wireless N on 2.4GHz supports Wireless G devices and legacy Wireless B devices, while Wireless AC supports Wireless N and also Wireless A, all of which operate at 5GHz. There are, however, trade-offs. With wireless, your performance is only as fast as the weakest link, and as a result many protections have had to be built into the standard to reduce the problems that arise when older standards are used alongside radios supporting newer ones.

 

For background: back when Wireless G was born, a new modulation scheme called OFDM was introduced, which allows for higher performance and better collision reduction within the same amount of spectrum; Wireless N and A also use it for these benefits. Wireless B devices use a modulation scheme known as DSSS, an older, legacy method of avoiding collisions on a wireless network. While Wireless G was in its draft stage, many observed that Wireless B devices and networks were "jammed" in the presence of Wireless G devices. This was because the two technologies were competing for the same spectrum, and also because they had two completely different methods of avoiding collisions. As a result, you name it, collisions happened.

 

To resolve this, Wireless G and newer standards have built-in protections that let older, legacy networks persist. These protections add some overhead to wireless communication and slightly reduce speeds overall, but the reduction is minor. The problem lies not with that overhead, but with what happens when you add older devices, such as a Wireless B or G device, to a Wireless N network. This is where backwards compatibility has its downfall.

 

When you connect a Wireless G device to a Wireless N access point, its maximum data rate will be 54Mbps in each direction, subject to the duplexing and protocol overhead mentioned earlier. For as long as that device remains idle, newer Wireless N devices can transmit at whatever speed they can negotiate with the radio on "the other end." But the moment the G device begins to talk, Wireless N devices see a small decrease in performance, due to the backwards-compatibility measures Wireless N uses to let Wireless G devices work on that access point. Here is where things get more challenging: Wireless B devices, because they use a completely different modulation scheme (DSSS instead of OFDM), cause backwards compatibility to kick the network into some rather extreme settings for as long as that device is active on the Wi-Fi network.

 

Before I go further, let's revisit how wireless is a shared medium, quite identical to a wired hub (don't confuse hubs with switches! They are different). On a hub, not only can just one device talk at a time, but everyone must share a fixed amount of bandwidth, say 10Mbps. If you have 10 computers all talking at once, each gets about 1Mbps no matter where the data is headed. Add in duplexing, since most hubs are half duplex, and you get half of your potential maximum speed. On a switch, you might still have 10Mbps ports, but each device gets its own dedicated 10Mbps unless many devices are all trying to reach a single port at once. Duplexing can also be set to full, allowing simultaneous two-way communication.

 

So what does this have to do with wireless? Wireless is, as I said, identical to a hub. All devices attached to an access point must share that access point's maximum capacity and obey the duplexing and "wait your turn" rules of the standard. If an access point is Wireless N and operates at 300Mbps, overhead and duplexing take away roughly 30% of that right off the bat. What's left over is what ALL devices must share. So if you have 15 devices all talking at once, in an ideal situation where your usable bandwidth is 200Mbps, each device gets only about 13.3Mbps, and in practice the split is rarely even; one device can monopolize bandwidth.

 

So, back to backwards compatibility. On a Wireless N access point at 300Mbps, connecting a Wireless G device will not drop link speeds for other clients, but it will cause transfer rates to fall as the radio adjusts its under-the-hood transmission rules so the Wireless G device can communicate. When you add a Wireless B device, the radio is forced not only to adjust for a very old collision-avoidance protocol, but to reduce link speed to the point where a Wireless B device can be understood. This means your 300Mbps minus 30% is now reduced to 11Mbps minus 30% at a minimum, and that rate has to be shared across all devices. Using 15 devices again: 11Mbps minus 30% is about 7.7Mbps, which, all things equal, gives each of 15 simultaneously talking devices about 0.51Mbps!
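The shared-medium arithmetic from the last two posts can be checked in a few lines. The 30% overhead figure and the perfectly even split are this thread's simplifying assumptions, not guarantees.

```python
# The arithmetic from the posts above: usable bandwidth after ~30% overhead,
# split evenly across all devices talking at once.

def per_device_mbps(link_rate, devices, overhead=0.30):
    usable = link_rate * (1 - overhead)
    return usable / devices

print(round(per_device_mbps(300, 15), 1))  # prints 14.0 -- N-only network
print(round(per_device_mbps(11, 15), 2))   # prints 0.51 -- one B device joined
```

One legacy B client drags the per-device share from double digits down to dial-up territory, which is why the next paragraph recommends blocking B devices outright on busy networks.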

 

This is a key decision when building a network: do you want performance or compatibility? In many cases, older B devices can safely be blocked from connecting, since they are becoming uncommon as they are retired from use or break down. Blocking them prevents your network from suddenly bombing out if you have a very high-bandwidth home that relies extensively on Wireless.

Message 10 of 34
(159,300 Views)

Encryption is a constant debate in many forums: which standard to use, why you should use a specific one, whether it's compatible with your devices, and why one type runs quicker than another. It is no joke: encryption can determine how well your Wireless network operates.

 

Today, as a home user you have the choice between many encryption options: no encryption, WEP 64-bit, WEP 128-bit, WPA TKIP, WPA AES, WPA2 TKIP, and WPA2 AES. In an enterprise, enthusiast, or otherwise "paranoid" environment you can pair the above with 802.1X/RADIUS authentication and stronger cipher suites. Each level of encryption was introduced alongside a Wireless standard. No encryption, WEP, and WPA date from the days of Wireless B, with WEP being common due to wide support. WPA2 arrived around the time Wireless A and G were being introduced; WEP remained king for a while, and adoption of WPA/WPA2 was helped along by Wi-Fi Protected Setup (WPS), which turned out to be a massive security hole and is now widely exploited.

 

The strongest level of encryption available to a home user today is WPA2 AES, which is currently deemed uncrackable except by brute force, and brute force requires either supercomputer-class resources or a massive farm of GPUs running a cracking program just to break a single strong key. WPA has known exploits as mentioned above, and WEP can be cracked in seconds due to how weak it is.
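Some rough arithmetic shows why brute force against a strong WPA2 passphrase is impractical. The guess rate below is a deliberately generous assumption for a GPU farm grinding through WPA2's slow PBKDF2 key derivation, not a benchmark:

```python
# Exponential search space for a random printable-ASCII passphrase.
# All figures are illustrative assumptions.
charset_size = 95          # printable ASCII characters
passphrase_length = 12     # a reasonably strong passphrase
guesses_per_second = 1_000_000

keyspace = charset_size ** passphrase_length
years = keyspace / guesses_per_second / (3600 * 24 * 365)
print(f"{keyspace:.2e} candidates -> ~{years:.1e} years to exhaust")
```

Even at a million guesses per second, exhausting the space takes billions of years, which is why weak, short, or dictionary passphrases are the only practical way in.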

 

Now, what does encryption have to do with speed? It comes down to what the encryption entails and how it is performed. WEP and WPA encryption, for example, are CPU-heavy. In a high-performance situation, WEP will outpace WPA because WPA is a much more complex encryption standard, though at a big cost in security. WPA buys more security for less performance, but it is also less compatible with B devices (G devices should have no problem; they are all required to support it). On low-end or very old access points, WEP or WPA encryption in a high-bandwidth environment can slow your Wireless speed, especially as the number of devices increases and processor load climbs, and this can eventually lead to the device crashing from heat or excessive CPU usage.

 

WPA2, which was introduced during the Wireless G days, is a much more complex form of encryption that uses AES, compared to the TKIP that WPA uses. Yet WPA2 tends to operate much faster despite being more complex. Why is this? In modern chipsets, WPA2 encryption and decryption don't rely on the main processor of your router or access point; the work is performed on the same chip running your Wireless radio, so it's essentially a process done in dedicated hardware. For you, that means very fast and very strong encryption. On older chips that had WPA2 support bolted on in software, as seen on the ActionTec MI424WR Rev. A that Verizon handed out in the early days of FiOS, WPA2 ran on the router's processor instead of in dedicated hardware, and as a result the access point or router (or a very slow computer) often ground to a halt trying to encrypt and decrypt wireless traffic.

 

Let's add in another wrinkle: Wireless N and Wireless AC. By definition of the standard, Wireless N will not enable any of its capabilities, such as improved speeds or spatial streams, unless encryption is set to WPA2. Meaning: if any device connects to the network using WPA or WEP encryption, you are basically rendering Wireless N useless and turning your network back into a Wireless G network, or maybe even a B network, so instead of connecting at 300Mbps, for example, clients will be forced to connect at no more than 54Mbps. Put more broadly, your Wireless will run slower. Draft Wireless N gear may not obey this, but keep it in mind: Wireless N requires WPA2 AES security and cannot use any other encryption. Wireless N will, however, function absolutely fine with no encryption if you wish to run like that, which I don't recommend :)
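The fallback rule above can be summarized as a small lookup table. This is a toy illustration, not a real driver API, and the table deliberately simplifies (real gear negotiates rates per-client):

```python
# Best 802.11n link rate permitted per security mode: high-throughput
# (N) rates only with WPA2-AES or open; WEP and WPA/TKIP force legacy
# G rates. Hypothetical helper for illustration only.
N_MAX_LINK_MBPS = {
    "open": 300,        # N rates allowed (running open is not recommended!)
    "wpa2-aes": 300,    # N rates allowed
    "wpa-tkip": 54,     # legacy G rates only
    "wep": 54,          # legacy G rates only (11 if a B client joins)
}

def max_link_rate(encryption_mode: str) -> int:
    """Look up the best 802.11n link rate for a given security mode."""
    return N_MAX_LINK_MBPS[encryption_mode.lower()]

print(max_link_rate("WPA2-AES"))  # 300
print(max_link_rate("WEP"))       # 54
```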

 

Some side notes: if you own an ActionTec MI424WR Rev. F, G, or I from Verizon, absolutely set the router to WPA2 AES mode only. Otherwise you're throwing away some good capabilities you were given, and it saves everyone time and headaches trying to figure out why the Wi-Fi is running slower when the cause is as simple as the encryption. If you have any other router from Verizon, besides a D-Link DI-524 or older or an ActionTec MI424WR Rev. A (both of which should ideally be retired due to age), set up WPA2 anyway; consider it a gift :)
