Phil Medd

Now that the first LTE networks have gone live, the race is on to provide mobile broadband services on a large scale. However, have all the technical issues been addressed, or are there problems still to be overcome before subscribers gain the full benefit of next-generation cellular technology? Beyond the technical issues, there are also some broader aspects of the LTE service offering that still require resolution. This article takes a look at some of these issues in the context of the challenges they present to the test engineer.

Before looking at some of the technical issues related to the design of user equipment (UE), a wider view reveals other potential pitfalls: 

• At the time of writing, formal certification of LTE devices has not yet begun. The major certification bodies (GCF [Ref.1], PTCRB [Ref.2]) are working to introduce conformance testing schemes for protocol, RF and radio resource management, with a target of December 2010. However, with devices already on sale in some markets, this leads to the question: will these devices be capable of passing the conformance tests once they are introduced? 

• With LTE Category 3 devices capable of supporting high data rates (100 Mbps in the downlink, 50 Mbps in the uplink), will the backhaul capacity be sufficient to cope? In the longer term, as the number of LTE users rises, sharing of bandwidth on the radio network between all users in the cell will become a significant factor, with crowded cells not performing so well. Also, as the number of active users rises, cell-edge performance will suffer as interference grows and the SNR falls.

• With a potentially global mobile data network, the expectation of global roaming for data users needs to be addressed. Although roaming is technically possible, its cost to consumers remains a barrier. On the other hand, flat-rate data plans are already an issue for network operators, with effectively fixed revenues being earned for carrying variable, and almost certainly increasing, data volumes.

• With LTE being the next-generation technology choice for CDMA2000 network operators (e.g., Verizon Wireless), interworking with 3GPP2 CDMA2000 high-rate packet data services is a definite requirement. Merging the 3GPP and 3GPP2 network topologies at the LTE radio network interface is an interesting development that will require careful testing to ensure it operates as expected.

• Maintaining voice services using an IP network in parallel with the circuit-switched legacy networks will be a challenge for the network operators. The agreement announced at Mobile World Congress 2010 by the major network operators to standardize on VoLTE (Voice over LTE) will largely address this issue. However, this technology, which uses 3GPP's IMS (IP Multimedia Subsystem), still needs to be deployed on a large scale.
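The bandwidth-sharing concern above can be put in perspective with back-of-envelope arithmetic. The sketch below assumes, unrealistically, that the Category 3 peak downlink rate is split equally among active users; real schedulers, channel quality, and interference all change the picture, so this is an illustration, not a radio-network model.

```python
# Idealized fair-share arithmetic for a single LTE cell (a rough sketch,
# not a radio-network model).

PEAK_DL_MBPS = 100.0  # LTE Category 3 peak downlink rate

def per_user_rate_mbps(active_users: int, peak_mbps: float = PEAK_DL_MBPS) -> float:
    """Equal-share downlink rate when active_users share one cell."""
    if active_users < 1:
        raise ValueError("need at least one active user")
    return peak_mbps / active_users

if __name__ == "__main__":
    for n in (1, 10, 50):
        print(f"{n:3d} users -> {per_user_rate_mbps(n):6.1f} Mbps each")
```

Even this optimistic model shows the per-user rate collapsing as the cell fills, before any cell-edge degradation is taken into account.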

Meeting the Challenge
To match the demanding requirements of LTE terminal devices, it is essential to break the design down into subsystems and to build a test plan that allows each part of the design to be characterized thoroughly before testing the complete device. Without this modular approach, the diagnosis of problems can occur so late in the program that it becomes difficult to manage the final release stages, including field trials and compliance testing.

Measurement Needs
Regardless of whether the device design is begun from scratch, evolved from an earlier design, or uses third-party component integration, several key performance measurements need to be made. Some of these, such as maximum output power, power control, and receiver sensitivity, will be familiar from earlier technologies, but due to the transmission schemes used (OFDMA in the downlink, SC-FDMA in the uplink), new measurement equipment will be needed to support these tests.

Other measurements are specific to LTE. With its OFDMA transmission scheme, for example, error vector magnitude (EVM) per sub-carrier becomes an essential test of modulator performance. With the availability of the 700-MHz spectrum freed by analog TV, LTE will be deployed at lower frequencies than GSM or WCDMA, resulting in much larger fractional bandwidths: 20 MHz/700 MHz = 2.9%, compared with 5 MHz/2100 MHz = 0.24% for typical WCDMA devices. This can pose a challenge with some modulator architectures as it results in a higher EVM at the band edges, so it needs special attention at the design stage.
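A per-subcarrier EVM calculation can be sketched in a few lines. The sketch assumes we already have, for each subcarrier, the ideal (reference) constellation points and the measured points; real test equipment extracts these from the demodulated OFDMA signal. The "band-edge" phase error here is an invented illustration, not a measured value.

```python
# Per-subcarrier EVM sketch (pure Python).
import cmath

def evm_percent(measured, reference):
    """RMS EVM (%) of one subcarrier, normalized to the RMS reference power."""
    err = sum(abs(m - r) ** 2 for m, r in zip(measured, reference))
    ref = sum(abs(r) ** 2 for r in reference)
    return 100.0 * (err / ref) ** 0.5

# Toy example: QPSK reference symbols on two subcarriers; the second
# (imagined near the band edge) has a larger modulator error.
qpsk = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
measured_centre = [s * 1.01 for s in qpsk]            # 1% gain error
measured_edge = [s * cmath.exp(0.05j) for s in qpsk]  # 0.05 rad phase error

evm_by_subcarrier = {
    "centre": evm_percent(measured_centre, qpsk),
    "edge": evm_percent(measured_edge, qpsk),
}
```

Plotting this quantity across all subcarriers is what exposes the band-edge EVM rise mentioned above.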

Due to the dynamic nature of some of the tests, such as power control, the measurement conditions need to be established using the signaling protocol. This makes it essential for the test equipment to include the protocol stack, simulating the evolved Node B (eNB) base station. Since these measurements are usually performed by RF engineers rather than protocol specialists, the test equipment used must be simple to configure, allowing engineers to focus on the measurement being made.

Protocol Testing
One of the main challenges for the protocol stack developer will be to ensure that the state change response requirements are met. Although the LTE specifications have reduced the number of states that a terminal can be in to RRC_IDLE and RRC_CONNECTED, the time it takes to change from one to the other will be a major part of the latency budget when data needs to be sent.
In RRC_IDLE mode, as much as possible of the device electronics will be in a low-power consumption state to ensure good battery life, with the receiver activated periodically to check for paging messages. When data transmission is scheduled, the device must wake up and rapidly synchronize its uplink.
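The two-state model described above can be sketched as a minimal state machine. The state names come from the specifications; the trigger methods and their names are hypothetical illustrations, and no timing values are modeled.

```python
# Minimal sketch of the two LTE RRC states and the transitions between
# them. The trigger methods are illustrative, not any stack's real API.
from enum import Enum

class RrcState(Enum):
    IDLE = "RRC_IDLE"
    CONNECTED = "RRC_CONNECTED"

class Ue:
    def __init__(self):
        self.state = RrcState.IDLE
        self.transitions = []

    def page_received(self):
        """Paging message while idle: wake up and connect.
        This transition is the one on the latency budget."""
        if self.state is RrcState.IDLE:
            self._go(RrcState.CONNECTED)

    def inactivity_timeout(self):
        """Connection released: return to the low-power idle state."""
        if self.state is RrcState.CONNECTED:
            self._go(RrcState.IDLE)

    def _go(self, new_state):
        self.transitions.append((self.state, new_state))
        self.state = new_state

ue = Ue()
ue.page_received()       # IDLE -> CONNECTED
ue.inactivity_timeout()  # CONNECTED -> IDLE
```

Timing the IDLE-to-CONNECTED transition under test equipment control is precisely the measurement the latency budget demands.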

Protocol testing can often involve expending as much effort on generating test cases as on creating the protocol stack, so access to comprehensive and efficient test facilities is vital. In order to be able to break down the testing, it is important to be able to test each sub-layer in both the User Plane and Control Plane. Protocol test diagnostic features are essential when tracing faults. Typically, this would include time-stamped message logging and decoding. But it is important that this is available for each sub-layer, providing the ability to trace through signaling message flows in detail, from MAC PDUs up to RRC messages, thus ensuring that timing requirements are met.
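The time-stamped, per-sub-layer logging described above can be sketched as follows. The layer names (MAC, RRC) are the LTE sub-layers from the text; the logger class and the message strings are hypothetical illustrations, not any vendor's API.

```python
# Sketch of time-stamped, per-sub-layer protocol message logging.
import time

class ProtocolLog:
    def __init__(self):
        self._entries = []

    def record(self, layer: str, direction: str, message: str):
        """Store one decoded message with a capture timestamp."""
        self._entries.append((time.monotonic(), layer, direction, message))

    def trace(self, layer: str):
        """Filter the log to one sub-layer, e.g. to follow MAC PDUs only."""
        return [e for e in self._entries if e[1] == layer]

log = ProtocolLog()
log.record("MAC", "UL", "PDU: buffer status report")
log.record("RRC", "DL", "RRCConnectionSetup")
log.record("MAC", "DL", "PDU: random access response")
mac_only = log.trace("MAC")
```

Filtering by sub-layer while keeping a common timebase is what lets an engineer trace a fault from MAC PDUs up to the RRC messages they carry.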

The ability to create test scenarios for each layer requires detailed control of the test equipment, but this needs to be kept as easy to use as possible to avoid a painful learning curve. Graphical test description, as provided by the Aeroflex 7100 Scenario Wizard, offers the clearest method of defining new tests (Fig. 1).

[Fig. 1: Aeroflex 7100 Scenario Wizard graphical test description]

Performance Testing
Once the RF, baseband, protocol stack, and application layer have been integrated, overall device performance needs to be fully characterized. During this stage, it will be necessary to trace and eliminate bottlenecks to maximize data throughput, both under normal and extreme conditions of temperature and power supply voltage. Power consumption, thermal characteristics, electromagnetic compatibility (EMC), emissions, and susceptibility also all need to be measured under full load conditions. Generally, this will involve using 2x2 downlink multiple-input multiple-output (MIMO).

The ability to seamlessly hand over between cells while minimizing the interruption to data throughput needs to be assessed, as does the ability to hand over between different radio access technologies while maintaining the data connection. Compact, flexible, and modular instruments are already available from multiple vendors. For example, Aeroflex’s LTE test products support all the features necessary to characterize the performance of LTE devices (Fig. 2).

[Fig. 2: Aeroflex LTE test products]

Although the LTE physical layer uses a cyclic prefix to add resistance to multipath effects, it needs to be tested to ensure correct operation. Leaving this testing until the field trial stage adds risk to the development. Fortunately, test equipment suppliers provide facilities for simulating real-world signal conditions in the lab, with built-in fading simulators and noise generators.
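The cyclic prefix mechanism itself is simple enough to demonstrate: as long as the channel delay does not exceed the prefix length, a delayed path turns into a circular shift of the OFDM symbol, which the receiver can undo per subcarrier as a phase rotation. The toy below uses a single delayed path and integer samples purely for illustration.

```python
# Cyclic-prefix sketch: with CP length >= channel delay, a delayed echo
# becomes a circular shift of the symbol inside the FFT window.

def add_cp(symbol, cp_len):
    """Prepend the last cp_len samples of the symbol (the cyclic prefix)."""
    return symbol[-cp_len:] + symbol

def delay(samples, d):
    """Single-path channel that delays the stream by d samples."""
    return [0] * d + samples[:len(samples) - d]

def remove_cp(rx, cp_len, n):
    """Drop the prefix and keep one n-sample FFT window."""
    return rx[cp_len:cp_len + n]

N, CP, D = 8, 3, 2            # symbol length, CP length, channel delay (D <= CP)
symbol = list(range(1, N + 1))
rx = remove_cp(delay(add_cp(symbol, CP), D), CP, N)
# Because D <= CP, the received window is a circular shift of the symbol:
circular_shift = symbol[-D:] + symbol[:-D]
```

If the delay exceeded the CP length, samples from the previous symbol would leak into the window, which is exactly the multipath failure mode the fading simulators are there to provoke.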

An important performance parameter of an LTE device will be its ability to achieve and maintain synchronization with the downlink signal. The LTE OFDMA scheme uses subcarriers spaced at 15 kHz intervals. The receiver must stay precisely tuned to the subcarriers even under the effect of Doppler shift. Lack of synchronization results in inter-subcarrier interference, reducing SNR. To characterize the behavior of the device, the ability to simulate Doppler shift in the lab is again essential.
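The scale of the Doppler problem follows from simple arithmetic: the maximum Doppler shift is f_d = (v/c) * f_c, to be compared against the 15-kHz subcarrier spacing. The 2.6-GHz carrier and 300 km/h speed below are example values chosen for illustration, not figures from the article.

```python
# Doppler-shift arithmetic for the synchronization discussion above.

C = 299_792_458.0            # speed of light, m/s
SUBCARRIER_SPACING_HZ = 15_000.0

def doppler_shift_hz(speed_kmh: float, carrier_hz: float) -> float:
    """Maximum Doppler shift f_d = (v / c) * f_c."""
    return (speed_kmh / 3.6) / C * carrier_hz

fd = doppler_shift_hz(300.0, 2.6e9)    # roughly 720 Hz
fraction = fd / SUBCARRIER_SPACING_HZ  # a few percent of the spacing
```

Even at high-speed-train velocities the shift is only a few percent of the subcarrier spacing, but left uncorrected it is enough to cause the inter-subcarrier interference described above, which is why Doppler simulation belongs in the lab test plan.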

The next generation of mobile devices will need to provide a mobile broadband experience that matches the hopes and expectations of the network operators. It will be necessary to test new LTE devices using a layer-by-layer approach, building up to an end-to-end test scenario that uses real-world signal conditions. Ensuring that performance is maintained throughout the cell will be the most difficult challenge, especially as the number of users in the cell grows and the interference level rises with it.

The thorough and efficient testing of LTE devices requires comprehensive test coverage at the RF, protocol, and system levels. Test equipment vendors are providing this capability with both new and upgraded instruments, test sets, and systems already available.

Achieving high data throughput and low latency in the most efficient manner possible (in terms of both power consumption and RF spectrum usage) is the main aim of introducing LTE technology. Only through careful testing in the development and deployment stages will this goal be achieved.

1. Global Certification Forum
2. PTCRB