Evaluating Wideband 802.11 WLAN Radio Performance
Although the original intent of the IEEE 802.11 specification was to yield low-cost products with a unified frequency plan, capable of being marketed on a worldwide basis, the popularity of Wi-Fi has recently caused the creation of an array of unique country-specific channels in the 5 GHz band. These channel additions have spawned a new class of wide-bandwidth dual-band 802.11 network interface cards (NICs) that cover a frequency range from 4.9 to 5.9 GHz in the high band (i.e. a 20 percent bandwidth) in addition to the standard 802.11b/g band of 2.412 to 2.484 GHz. The devices also support a variety of modulation modes, ranging from legacy BPSK/QPSK (1- and 2-Mbit/s data rates) to complex state-of-the-art 64QAM OFDM (54-Mbit/s data rate). As such, these products will communicate with any 802.11 access point, even the original devices produced in the late 1990s. They therefore have strong appeal to IT managers: compatibility issues are not a concern, and the NIC will always automatically communicate at the highest possible data rate.

Needless to say, wideband WLAN products present some unique engineering challenges in their design. Some of the bigger challenges include antenna design, quality control testing, and throughput testing. Let's look at these three issues in more detail, starting with antenna design.

Designing Antennas

There are actually two different scenarios. One class of products, for example CardBus and USB NICs, usually contains built-in antennas. Conversely, embedded NICs such as mini-PCI devices are usually mounted inside a laptop, and the computer manufacturer provides the physical antenna(s). In this latter case, the computer manufacturer must ensure compatibility between the devices.

A typical symptom of sub-standard antenna performance is reduced range. In an extreme case, throughput in the 54-Mbit/s mode (64QAM modulation) may also be degraded due to distortion. This translates to increased error vector magnitude (EVM), caused by non-optimal loading of the power amplifier (PA) in the NIC by the antenna.

To ensure good radiation efficiency, the VSWR should be less than 2:1. This allows the real part of the antenna impedance to fall anywhere between 25 and 100 ohms, these values being the intersections of the 2:1 VSWR circle on the Smith chart with the real axis. However, to ensure low distortion with complex 64QAM modulation, we must typically also be concerned with the bounds on the actual antenna impedance reflected into the plane of the PA. For example, if the antenna impedance is substantially lower than 50 ohms, distortion may increase.

Measurement of the antenna's VSWR is most easily accomplished with the aid of a vector network analyzer (VNA). If the antenna is built into the NIC, it will be necessary to open the unit, interrupt the stripline feeding the antenna, and solder a short length of semi-rigid 50-ohm coax to this point (a "pipe" in the jargon of RF engineers). It is also important to replace the entire housing surrounding the antenna and then plug the NIC back into the computer, since these factors all affect the antenna impedance. This is most easily accomplished by filing a small notch in the housing to permit routing of the pipe outside the NIC. The computer should then be placed on a typical tabletop, since this also has a secondary effect on the impedance.
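To make the 2:1 VSWR criterion concrete, the short Python sketch below converts a measured complex antenna impedance into reflection coefficient, VSWR, and return loss, assuming a 50-ohm reference impedance. It also confirms that the 2:1 VSWR circle crosses the real axis at Z0/2 = 25 ohms and 2*Z0 = 100 ohms. The specific impedance values used are purely illustrative, not measurements from any actual NIC.

```python
# Minimal sketch: relate a measured antenna impedance to VSWR and return loss.
# Assumes a 50-ohm system; the example impedances are purely illustrative.
import math

Z0 = 50.0  # reference (system) impedance in ohms

def vswr_from_impedance(z):
    """Return (|gamma|, VSWR, return loss in dB) for a complex impedance z."""
    gamma = (z - Z0) / (z + Z0)           # complex reflection coefficient
    mag = abs(gamma)
    vswr = (1 + mag) / (1 - mag)
    rl_db = -20 * math.log10(mag) if mag > 0 else float("inf")
    return mag, vswr, rl_db

# Real-axis crossings of the 2:1 VSWR circle: Z0/VSWR and Z0*VSWR
for z in (25.0, 100.0):
    print(z, vswr_from_impedance(z))      # both evaluate to VSWR = 2.0

# A hypothetical measured antenna impedance at one 5 GHz channel
print(vswr_from_impedance(complex(38.0, 12.0)))
```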
In the case of antennas mounted external to the NIC (e.g. mini-PCI cards), a suitable 50-ohm between-series coaxial adapter should be used to mate the antenna connector employed in the laptop to an SMA-type connector. The VSWR should be measured with the lid of the laptop in its normal operating position, typically at an angle of 135 degrees from the horizontal. In this case, surrounding objects usually do not influence the antenna VSWR as much as in the built-in case, since the antennas are usually mounted in the lid of the laptop, well removed from the tabletop.

Quality Control vs. Test Time

Since we would like to run the maximum possible RF power output from a radio, each modulation mode requires a different degree of backoff from the PA's 1-dB compression point. For example, in the OFDM BPSK, CCK, or legacy modes, the modulation constellation is relatively simple and, using the 1-dB compression point as a reference, the PA is typically run backed off only about 1 dB from this point. In the most complex 64QAM modulation mode, however, considerably more linearity is necessary to preserve the fidelity of the signal, and the PA is typically backed off 4 to 5 dB from its 1-dB compression point. Stated another way, we are really keeping the peak power output of the PA constant. The peak-to-average power ratio (PAPR) of the signal, however, is much higher in 64QAM mode than in CCK (13 dB vs. 2.7 dB). This is the price we pay for more complex modulation formats and higher data rates.

A critical balance must therefore be achieved between having a high probability of disclosing production problems in a radio and providing a reasonable test time. The cost of a typical test equipment setup capable of automatically testing a dual-band Wi-Fi radio is substantial, so minimizing test time is critical. On the other hand, a dual-band NIC covers a 20-percent bandwidth at 5 GHz, and performance parameters may therefore change with frequency. An obvious question to address is the number of channels to test. The answer may depend on both the inherent broadband nature of the chipset employed and the quality of the radio design itself. A good production test suite should therefore allow the manufacturer of the NIC to customize this choice and then employ a relatively sophisticated best-fit curve to fill in missing data on the in-between channels. From experience, a good conservative starting point would be four channels in the low band, equally spaced over the frequency range (e.g. channels 1, 6, 11 and 14). In the high band, due to the substantially wider frequency range, a good starting point would be eight channels evenly distributed throughout the band. The suite should also have a certainty in the high 90-percent range of disclosing a faulty radio. Using these parameters, it should be possible to achieve a test time in the neighborhood of two minutes for a full suite of tests.

Another subject for discussion is whether or not production EVM measurements are necessary. A vector signal analyzer (VSA) capable of performing accurate and repeatable EVM measurements on a modulation constellation is fairly pricey. As an alternative to EVM measurement, for a particular PA there is a strong statistical correlation between its modulation mask and EVM. It is therefore fairly simple to test a large sample of radios, manually adjusting the drive to produce a desired EVM and then noting the modulation mask corresponding to that point.
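As a rough illustration of the correlation exercise just described, the sketch below sweeps a PA drive level, records EVM and spectral-mask margin at each step, and interpolates to find the mask margin that corresponds to a target EVM. All of the numbers are hypothetical placeholders; in practice the EVM values would come from a VSA and the mask margins from a spectrum analyzer, and the resulting mask margin would become the production pass/fail limit.

```python
# Rough sketch: derive a spectral-mask pass/fail limit from a one-time EVM study.
# The measurement arrays below are hypothetical placeholders; in practice they
# would be recorded from a VSA (EVM) and a spectrum analyzer (mask margin)
# while the PA drive level is stepped.
import numpy as np

# Drive level relative to nominal, in dB (hypothetical sweep)
drive_db    = np.array([-3.0, -2.0, -1.0,  0.0,  1.0,  2.0])
# Measured EVM in dB at each drive level (hypothetical)
evm_db      = np.array([-31.0, -29.5, -27.8, -25.6, -23.0, -20.1])
# Worst-case margin to the transmit spectral mask in dB (hypothetical)
mask_margin = np.array([  9.0,   7.5,   5.8,   3.9,   2.0,   0.2])

EVM_LIMIT_DB = -25.0   # e.g. the 64QAM / 54-Mbit/s requirement

# Interpolate mask margin as a function of EVM, then evaluate at the EVM limit.
# np.interp requires increasing x values, so sort by EVM first.
order = np.argsort(evm_db)
limit_margin = np.interp(EVM_LIMIT_DB, evm_db[order], mask_margin[order])

print(f"Mask margin corresponding to {EVM_LIMIT_DB} dB EVM: {limit_margin:.1f} dB")
# Radios whose mask margin exceeds this value are taken to meet the EVM target,
# allowing the production line to skip the costly VSA measurement.
```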
We've now replaced a costly VSA measurement with one that can be performed on a standard spectrum analyzer, an instrument that is already needed for other measurements such as transmitter harmonics. In addition to production testing, where test time is paramount, a small sample of radios should be subjected to a more comprehensive battery of quality control tests, including a correlation of EVM vs. modulation mask. In this manner, we can minimize the cost of manufacturing these products while maintaining reasonable quality standards.

Throughput Testing

There are two aspects to throughput testing: close-in throughput and range. Close-in throughput quantifies performance when the client NIC is in relatively close proximity to an access point (AP). Under these conditions, the signal-to-noise ratio (SNR) is large and throughput is maximized, since most packets should execute at the 54-Mbit/s data rate. As the distance between the AP and the client increases, the SNR falls off, and a point is reached where an elevated packet error rate (PER) forces significant packet retries. At this point, the client NIC and access point execute a complex rate-switching algorithm in which the system tries a combination of lower data rates and packet fragmentation in order to optimize throughput. The result is a gradual reduction in throughput as the distance increases. At some far distance, the data transfer rate becomes very low, since the client and access point have transitioned to their lowest possible data rate. A little beyond this point, the client may no longer remain associated with the access point. This maximum usable distance is defined as the range. In a practical network, at some point before the maximum range is reached, the client will search for and, hopefully, find a more favorable AP; it will then be seamlessly handed off without any interruption of data flow.

Testing in the actual hallways of a typical crowded office building is usually used to quantify the throughput performance of a WLAN. For convenience, however, it is often desirable to have a compact lab set-up to simulate this performance. Figure 1 shows a test set-up for measuring the radiated throughput of a client NIC. A radiated test set-up, although a bit more elaborate to construct than a conducted version, has the advantage of including the performance of the client's antenna. In this radiated set-up, it is important that both the AP and the client be housed in well-shielded boxes. All control and power cables entering or exiting the boxes should be well filtered to prevent them from conducting interfering signals into the set-up. In addition, the box housing the client device under test (DUT) should be lined with a suitable radio-absorbing material (RAM) to minimize reflections inside the box. The antenna may be any convenient device made for the frequency range of interest (e.g. a rubber-coated dipole). The range of the RF attenuator should be adequate to ensure that the signal can be attenuated below the sensitivity threshold of the DUT. If the DUT has built-in antennas, reasonably accurate calibration can be attained by temporarily disconnecting the DUT's antenna from the rest of the WLAN circuitry at a point after any impedance-matching networks and then connecting a short length of "pipe" to this point. A spectrum analyzer may then be used to log signal strength vs. attenuator setting. This calibration will, however, typically vary significantly with frequency.
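To relate an attenuator setting in this kind of bench set-up to an equivalent over-the-air distance, one simple, idealized approach is the free-space path-loss formula. The sketch below assumes a hypothetical calibrated fixed loss for the cabling, antennas, and shielded boxes, and it ignores multipath and building losses, so it is only a first-order sanity check rather than a substitute for hallway testing.

```python
# First-order sketch: map (fixed set-up loss + attenuator setting) to an
# equivalent free-space distance using the Friis free-space path-loss model.
# FIXED_LOSS_DB is a hypothetical calibration value; in a real set-up it would
# come from the spectrum-analyzer log of signal strength vs. attenuator setting.
import math

C = 299_792_458.0          # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB at a given distance and frequency."""
    return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

def equivalent_distance_m(total_loss_db, freq_hz):
    """Distance whose free-space path loss equals the total bench loss."""
    return (C / (4 * math.pi * freq_hz)) * 10 ** (total_loss_db / 20)

FIXED_LOSS_DB = 40.0       # hypothetical fixed loss of the radiated set-up
FREQ_HZ = 5.5e9            # a mid high-band channel

for atten_db in (0, 10, 20, 30, 40, 50):
    total = FIXED_LOSS_DB + atten_db
    d = equivalent_distance_m(total, FREQ_HZ)
    assert abs(fspl_db(d, FREQ_HZ) - total) < 1e-6   # round-trip check
    print(f"attenuator {atten_db:2d} dB -> ~{d:6.1f} m free-space equivalent")
```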
Several different software suites for throughput testing are readily available. They are typically installed at each end of the link. In operation, they send long file transfers over the link while metering the time needed to complete the job. One caveat, however: the various packages differ somewhat in the type of packets sent (e.g. UDP or TCP/IP) and in whether or not all layers of Windows are exercised. Absolute results are therefore not directly comparable between the various test suites.
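For a feel for what such suites do under the hood, the following toy sketch times a bulk TCP transfer and reports the resulting throughput in Mbit/s. It runs entirely over the loopback interface for illustration, and the port number and transfer size are arbitrary; it is a stand-in for the commercial packages described above, not a replacement for them.

```python
# Toy throughput probe: time a bulk TCP transfer and report Mbit/s.
# Runs over loopback for illustration; in a real set-up the sender and receiver
# would run on the client-NIC machine and the AP-side host respectively.
import socket
import threading
import time

PORT = 50007                      # arbitrary test port
PAYLOAD_BYTES = 50 * 1024 * 1024  # 50 MB transfer (arbitrary size)
CHUNK = 64 * 1024

def receiver(ready):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", PORT))
    srv.listen(1)
    ready.set()                   # signal that the listener is up
    conn, _ = srv.accept()
    while conn.recv(CHUNK):       # drain until the sender closes
        pass
    conn.close()
    srv.close()

ready = threading.Event()
t = threading.Thread(target=receiver, args=(ready,), daemon=True)
t.start()
ready.wait()

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", PORT))
start = time.perf_counter()
sent = 0
buf = b"\x00" * CHUNK
while sent < PAYLOAD_BYTES:
    cli.sendall(buf)
    sent += len(buf)
cli.close()
t.join()                          # include receiver drain time in the total
elapsed = time.perf_counter() - start

print(f"{sent * 8 / elapsed / 1e6:.1f} Mbit/s over {elapsed:.2f} s")
```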