What Is the 802.11a Standard?

Learn how this standard affected wireless networking

802.11a was one of the first Wi-Fi communication standards created in the IEEE 802.11 standards family. It is often mentioned in relation to other standards that came later, such as 802.11b/g/n and 802.11ac. Knowing how these standards differ is useful when buying a new router or connecting new devices to an old network that might not support newer technology.

802.11a wireless technology should not be confused with 802.11ac, a much newer and more advanced standard.


Relationship Between 802.11a and 802.11b

The original IEEE designations have been renamed to avoid confusion among consumers. Although the retroactive designations are unofficial, 802.11b is referred to as Wi-Fi 1, while 802.11a is called Wi-Fi 2. This naming structure, introduced in 2018, currently extends to Wi-Fi 6, the official designation for 802.11ax, the fastest and most recent of these technologies.

802.11a and 802.11b were developed at about the same time. 802.11b enjoyed faster acceptance because its implementation was more affordable. The two use different frequency bands (5 GHz for 802.11a, 2.4 GHz for 802.11b), so they are incompatible with each other. 802.11a found a niche in businesses, while the less expensive 802.11b became standard in homes.

802.11a History

The 802.11a specification was ratified in 1999. At that time, the only other Wi-Fi technology being readied for the market was 802.11b. The original 802.11 standard, with a top speed of just 2 Mbps, never gained widespread deployment because it was excessively slow.

802.11a and the other standards were incompatible: 802.11a devices couldn't communicate with the other kinds, and vice versa.

An 802.11a Wi-Fi network supports a maximum theoretical bandwidth of 54 Mbps, substantially better than the 11 Mbps of 802.11b and on par with what 802.11g would offer a few years later. This performance made 802.11a an attractive technology, but achieving it required relatively expensive hardware.
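To put those peak rates in perspective, here is a back-of-the-envelope sketch in Python. The 100 MB file size is purely illustrative, and real-world throughput falls well below these theoretical maximums once protocol overhead and interference are counted:

    # Best-case time to move a 100 MB file at each standard's theoretical peak rate
    FILE_MB = 100  # illustrative file size

    for name, mbps in [("802.11b", 11), ("802.11a/g", 54)]:
        seconds = FILE_MB * 8 / mbps  # megabytes -> megabits, then divide by Mbps
        print(f"{name}: about {seconds:.0f} seconds")  # ~73 s vs. ~15 s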

802.11a gained some adoption in corporate network environments where cost was less of an issue. Meanwhile, 802.11b and early home networking exploded in popularity.

802.11b and then 802.11g (802.11b/g) networks dominated the industry within a few years. Some manufacturers built devices with both A and G radios integrated so that they could support either standard on so-called a/b/g networks, although these were less common because relatively few A client devices existed.

Eventually, 802.11a Wi-Fi was phased out of the market in favor of newer wireless standards.

802.11a and Wireless Signaling

U.S. government regulators in the 1980s opened three specific wireless frequency bands for public use: 900 MHz (0.9 GHz), 2.4 GHz, and 5.8 GHz (sometimes called 5 GHz). 900 MHz proved too low a frequency to be useful for data networking, although cordless phones used it widely.

802.11a transmits radio signals using orthogonal frequency-division multiplexing (OFDM) in the 5.8 GHz frequency range. This band was regulated in the U.S. and many other countries for a long time, meaning that 802.11a Wi-Fi networks did not have to contend with signal interference from many other kinds of transmitting devices.

802.11b networks used frequencies in the often unregulated 2.4 GHz range and were much more susceptible to radio interference from other devices, such as cordless phones and microwave ovens.
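For a sense of just how far apart the two kinds of radios operate, this minimal Python sketch applies the channel-to-center-frequency formulas defined in the 802.11 standards (channel 14 in the 2.4 GHz band is a special case and is omitted here):

    def channel_center_mhz(channel: int, band: str) -> int:
        """Return the center frequency (MHz) of a Wi-Fi channel in a given band."""
        if band == "2.4GHz":
            return 2407 + 5 * channel  # channels 1-13
        if band == "5GHz":
            return 5000 + 5 * channel  # e.g., the classic 802.11a channels 36-64
        raise ValueError(f"unknown band: {band}")

    # An 802.11b radio on channel 6 and an 802.11a radio on channel 36 operate
    # roughly 2.7 GHz apart, far outside each other's receive range.
    print(channel_center_mhz(6, "2.4GHz"))  # 2437
    print(channel_center_mhz(36, "5GHz"))   # 5180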

Issues With 802.11a Wi-Fi Networks

Although the 5 GHz band helps improve network performance and reduce interference, it limits the signal range of 802.11a. An 802.11a access point transmitter covers less than one-fourth the area of a comparable 802.11b/g unit.
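The physics behind that coverage gap can be sketched with the standard free-space path-loss formula. This is a deliberately simplified model that ignores transmit power, antennas, and obstructions, and the 30 m distance is just an illustrative value:

    import math

    def fspl_db(distance_m: float, freq_hz: float) -> float:
        """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55."""
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    # Higher frequencies lose more energy over the same distance...
    print(f"2.4 GHz over 30 m: {fspl_db(30, 2.4e9):.1f} dB")  # ~69.6 dB
    print(f"5.8 GHz over 30 m: {fspl_db(30, 5.8e9):.1f} dB")  # ~77.3 dB

    # ...so for the same allowable loss, range scales inversely with frequency,
    # and covered area (proportional to range squared) shrinks even faster.
    range_ratio = 2.4e9 / 5.8e9  # ~0.41x the range
    print(f"Relative coverage area: {range_ratio ** 2:.2f}")  # ~0.17, under one-fourth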

Brick walls and other obstructions also affect 802.11a wireless networks to a greater degree than they do comparable 802.11b/g networks, because higher-frequency signals penetrate solid objects less effectively.
