
LTE Test Plan

ABSTRACT

This document provides a detailed test plan for KPI Optimization of the LTE field technology deployment, as detailed in the High Level Test Plan. The tests will be executed in Boston, MA and surrounding areas. The primary objective of the Optimization is to validate basic KPI functionality and evaluate the performance of LTE Air-Interface functionalities. The scope of test cases included in this KPI Test Plan spans several areas, including access, latency, coverage and capacity.

Revision History

Table 1: Revision History

Glossary of Terms

DLLS    Down-Link Load Simulator
EAT     Enhanced Analysis Tool
ePC     Enhanced Packet Core
GbE     Gigabit Ethernet
LLDM    LGE Logging and Diagnostic Module
LMT     Local Maintenance Tool
MME     Mobility Management Entity
MPLS    Multi Protocol Label Switching
MS      Management Server
NPO     Network Performance Optimization
OAM     Operation, Administration and Management
OLSM    Open Loop Spatial Multiplexing
RAMSES  Role-based Access Management Security System
RRH     Remote Radio Head
SAE     System Architecture Evolution
TLS     Transparent LAN Service
VLAN    Virtual LAN

Table of Contents

ABSTRACT
Revision History
1 Introduction
2 KPI OPTIMIZATION TARGET
3 DEPLOYMENT SYSTEM
3.1 Trial System Architecture
3.2 Air Interface Overview
3.3 Access terminals
3.4 LTE eNodeB functions
3.5 ALU 7750 Service Router Functions
3.6 ALU 7705 Service Aggregation Router Functions
3.7 LTE MME Functions
3.8 LTE SAE Functions
3.9 Data Laptop Configuration
3.10 Application Servers
3.11 Test tools
3.11.1 PDM Tool
3.11.2 Enhanced Analysis Tool (EAT)
3.11.3 Agilent tool
3.11.4 Wireshark
3.11.5 SyncroTest
3.11.6 Data Analysis Tool (eDAT)
4 Process Overview
4.1 Deployment Site Locations
4.2 Site Readiness
4.2.1 Spectrum Clearance Verification
4.2.2 Antenna Audit
4.2.3 Sector Verification
4.2.4 Baseline Existing System
4.3 RF Optimization Planning
4.3.1 Perform RF Parameter Audit
4.3.2 Validate Initial Neighbor Lists
4.3.3 Tool Readiness
4.3.4 Define Clusters
4.3.5 Drive Route Planning
4.4 RF Optimization Execution
4.4.1 Cluster Optimization
4.4.2 System Verification
4.5 Test Cases
4.6 Single User Throughput Test Peak
4.6.1 Single User Downlink Airlink Throughput Peak Test
4.6.2 Single User Uplink Airlink Throughput Peak Test
4.6.3 Single User Downlink RLC Throughput Peak Test
4.6.4 Single User Uplink RLC Throughput Peak Test
4.6.5 Single User Downlink Physical Layer Throughput Mean
4.7 Best Effort Sector Throughput Tests
4.7.1 Downlink Best Effort Sector Throughput
4.7.2 Uplink Best Effort Sector Throughput
4.7.3 Uplink MU-MIMO Sector Throughput
4.8 Downlink Scheduler
4.9 Uplink Scheduler
4.10 Latency C-plane
4.11 Latency U-plane
4.12 Quality of Service
4.13 Coverage Testing
4.14 Handover
4.15 V-Pol vs. Cross-Pol

List of Tables

Table 1: Revision History

List of Figures

Figure 1: LTE Deployment Network Architecture
Figure 2: EAT configuration in LTE Trial
Figure 3: SyncroTest Architecture
Figure 4: eDAT LTE Trial Configuration

1 Introduction

2 KPI OPTIMIZATION TARGET

Category | Sub-Category | Scope | Target Value
Performance-Accessibility | RRC Setup Failure Rate | C | 0.70%
Performance-Accessibility | Attach Failure Rate | L | 2.50%
Performance-Accessibility | Attach Delay | L | 2 seconds
Performance-Accessibility | Service Request Failure Rate | C | 2.50%
Performance-Accessibility | Service Request Delay | L | 0.5 second
Performance-Accessibility | Dedicated Bearer Activation Failure Rate | N | 1.50%
Performance-Accessibility | Dedicated Bearer Activation Delay | N | 0.5 seconds
Performance-Retainability | Dedicated Bearer Drop Rate | N | 1.20%
Performance-Retainability | Context Drop | N | 1.20%
Performance-Retainability | RRC Drop | C | 1.20%
Performance-Integrity | Access RACH Latency | L | 0.5 seconds
Performance-Integrity | DL/UL Physical Layer Throughput, peak | L | 60/20 Mbps
Performance-Integrity | DL/UL RLC Throughput, peak | L | 55/18 Mbps
Performance-Integrity | DL/UL Physical Layer Throughput, median | C | 7/3 Mbps
Performance-Integrity | DL/UL Physical Layer Throughput, 5th %-ile | R | 1/0.5 Mbps
Performance-Integrity | RLC ARQ/HARQ Retransmission Rate | N | 1%
Performance-Integrity | Packet Latency (round-trip delay) | L | 30 msec
Performance-Mobility | S1/X2 Handover Failure Rate | C | 1.20%
Performance-Mobility | S1/X2 Handover Interruption Time, intra-eNB | L | 100 msec
Performance-Mobility | S1/X2 Handover Interruption Time, inter-eNB | L | 100 msec
Performance-Mobility | Intra/Inter-MME TAU Failure Rate | R | 2%
Performance-Mobility | Paging Performance | R | 95%
Performance-Mobility | IRAT Handover Failure Rate | R | (not specified)
RF-SINR | Percent Included Area > 13 dB SINR | R | 10%
RF-SINR | Percent Included Area > -5 dB SINR | R | 90%
RF-RSRP | Percent Included Area < 143 dB RL OPL (referenced to full-power signal) | R | 90%

3 DEPLOYMENT SYSTEM

3.1 Trial System Architecture

This section provides a high-level description of the LTE system architecture and a description of all involved entities and the interfaces between them. As shown in Figure 1, the LTE network architecture is composed of:

Multiple eNodeBs

An ePC encompassing two functions: MME and SAE Gateway

Metro Ethernet Backhaul

Application servers

As shown in the figure, the cluster of eNodeBs will be connected via the 7705 and 7750 routers to the other network elements such as the ePC and application servers. Each eNodeB will consist of a D2U and three TRDUs. Each D2U will consist of a uCCM (controller with interface to the backhaul) and three eCEMs (modems; one for each sector). Each eCEM will be connected via fiber optic cable to its TRDU.

Figure 1: LTE Deployment Network Architecture

The information below details the hardware description and the associated OAM equipment list for the Trial network:

eNodeB: D2U V2 (1 uCCM + 3 eCEM); TRDU (remote radio heads comprising amplifiers and filters), 40 W Tx power

Backhaul: 7750 SR (service router); 7705 SAR (service aggregation router)

Transport services: Cisco TLS EVC

ePC: PDN/MME/SAEGW - IPD ATCA

Security: Cisco PIX

LAN switching: module from the 7750

Remote support: RAMSES

In order to manage the eNodeBs, a complete OAM system has to be designed to host the following functions:

Configuration management

Fault management

Performance management

Traces

Additional OAM systems include:

LMT to configure and set up the D2U platform and to commission IP addresses, DHCP server, and default gateway (connects locally to the console port)

LMT to configure the ePC complex (MME, S-GW and PDN GW) (connects locally to the console port)

Management Server (MS) to configure the eNB and display eNB status and fault information

Network Performance Optimization (NPO) to collect performance counts and measurements. The MS and the NPO are together referred to as the LTE Management System Server.

MS and NPO clients to interface to the MS and NPO servers

Netscreen firewall to protect the LTE network elements from intrusion

Netscreen Gate firewall to filter access to the RAMSES Mediation system

RAMSES Remote Access and RAMSES Mediation PC to provide access control and authentication on remote access to the LTE network elements

5620 S/W product managing the monitoring aspects on the 7750

3.2 Air Interface Overview

The main inputs to set up the LTE air interface during the Optimization are:

10 MHz spectrum bandwidth in the Upper Band C (700-770 MHz)

Number of frequency carriers: 1

3 sectors per eNB site

3 TRDUs per eNB site

SFBC/MIMO in DL and SIMO in UL

Cross-pol and V-pol antennas

3.3 Access terminals

LGE G Series UEs will be used during the Optimization.

3.4 LTE eNodeB functions

The eNodeB hosts the following functions:

Functions for Radio Resource Management: Radio Bearer Control, Radio Admission Control, Connection Mobility Control, Dynamic allocation of resources to UEs in both Uplink and Downlink (scheduling)

Routing of user plane data towards SAE gateway

Scheduling and transmission of paging messages (originated from the MME)

Scheduling and transmission of broadcast information (originated from the MME or OAM)

Measurement and measurement reporting configuration for mobility and scheduling functions

3.5 ALU 7750 Service Router Functions

The SR7750 is an edge router that will host the following functions:

Link aggregation

DSCP mapping

VLAN and LAN switching

IP routing to reach the different application servers

3.6 ALU 7705 Service Aggregation Router Functions

The ALU 7705 is a Service Aggregation Router (SAR) that offers:

A service-oriented capability to the RAN

IP/MPLS RAN transport solution

3.7 LTE MME Functions

The MME hosts the following functions:

Idle mode mobility

S1 connection establishment

Idle to active mode transition

Active to idle mode transition

Session management

QoS control

S1 handling during HO

3.8 LTE SAE Functions

The SAE Gateway hosts the following functions:

Multiple bearer support (one default and one dedicated)

S1 GTP-U bearer endpoint

Idle mode handling: bearer suspension with paging request

S1 path switch during handover

3.9 Data Laptop Configuration

The access terminal will interface with a data laptop to support Pings, FTP, UDP, and HTTP data transfers. The laptop should be configured per the recommended parameters to optimize performance and provide an appropriate comparison to existing data. These recommendations include:

Windows 2000 Professional or Windows XP edition

IP header compression (VJ compression) turned OFF

PPP software compression OFF except for HTTP data transfer

128Kbytes TCP window size

MTU of 1500 bytes
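The 128 KB window recommendation can be sanity-checked against the bandwidth-delay product implied by this plan's targets (60 Mbps DL peak, 30 ms round-trip latency). A minimal sketch of that arithmetic is shown below; it also explains why the peak-throughput test setups later in this plan raise the client TCP window to 512 KB.

    # Minimal sketch: bandwidth-delay product check for the laptop TCP window size.
    # Uses this plan's targets (60 Mbps DL peak, 30 ms round trip) as assumptions.

    def required_tcp_window_bytes(throughput_mbps: float, rtt_ms: float) -> float:
        """Bytes in flight needed to sustain the given throughput over the given RTT."""
        bits_in_flight = throughput_mbps * 1e6 * (rtt_ms / 1e3)
        return bits_in_flight / 8

    if __name__ == "__main__":
        window = required_tcp_window_bytes(60, 30)          # 225,000 bytes
        print(f"Required window: {window / 1024:.0f} KB")   # ~220 KB
        # A 128 KB window caps a single TCP flow well below the 60 Mbps peak target:
        print(f"128 KB window sustains ~{128 * 1024 * 8 / 0.03 / 1e6:.0f} Mbps at 30 ms RTT")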

3.10 Application Servers

Application servers will be used during the Trial to provide the data content for the various tests. These servers will be provided by Alcatel-Lucent and must be easily accessible and not blocked or restricted by low-bandwidth links.

Windows 2000 Server Edition will be used as a data server, which should reside as close to the PDN gateway as possible. This will eliminate performance uncertainty due to any external network delay. Data applications available through this server will be:

UDP

DOS FTP - TCP/IP

Ping

3.11 Test tools

In order to validate functionality and quantify performance, a variety of test tools will be used.

Traffic Generation Tools:

DOS-FTP, WINDS and Ping to generate TCP/IP and UDP based traffic for measuring data capacity and network latency

Ping scripts for access tests

Logging tools:

Agilent is a Diagnostic Monitor for logging and analyzing over-the-air Network system performance and parameters

Enhanced Analysis Tool (EAT): to collect internal traces generated by the eNB

Analysis tools:

Agilent Protocol analyzer

KPI collector and generator

Packet Data Monitoring (PDM) tool: a distributed data performance, analysis and troubleshooting service

3.11.1 PDM Tool

PDM is a distributed data performance analysis and troubleshooting service for packet data, providing end-to-end and per-link data quality analysis. It can be used for the following functions:

Vendor-independent packet data network monitoring capabilities

Consistent and automated testing and analysis of packet networks

Characterization of end-user perceived performance in terms of throughput, latency, dropped packets, etc.

Ability to monitor / test packet network to support time sensitive applications, such as VoIP

Identification of links/components requiring maintenance and optimization, and monitoring of link and end-to-end performance

Enables precise data correlation across the entire network

Reduces resource requirements with an automated and remote controlled sniffer system

3.11.2 Enhanced Analysis Tool (EAT)

The EAT stores internal traces of up to 15 eNodeBs. EAT runs on a Linux PC which is connected to one or several eNodeBs via Ethernet (Figure 2). It connects to each eNodeB and configures the trace service by providing a destination IP address (i.e. its own IP address) and which traces to activate. EAT does not know the exact moment the traces start, so it has to listen via a socket server. Traces are received via UDP.
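The collection pattern described above (a socket server that receives trace records over UDP from the eNodeBs) can be illustrated with a short sketch. This is not the EAT implementation; the listening port and the way records are stored are assumptions for illustration only.

    # Minimal sketch of the trace-collection pattern described above: listen on a UDP
    # socket and append whatever trace records the eNodeBs send. Port and file layout
    # are hypothetical; EAT's actual wire format is not described in this plan.
    import socket

    LISTEN_ADDR = ("0.0.0.0", 50000)   # assumed port

    def collect_traces(outfile: str = "enb_traces.bin") -> None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(LISTEN_ADDR)
        with open(outfile, "ab") as f:
            while True:
                data, (src_ip, src_port) = sock.recvfrom(65535)
                # One datagram per trace record, tagged with the sending eNodeB's IP.
                f.write(src_ip.encode() + b"|" + data + b"\n")

    if __name__ == "__main__":
        collect_traces()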

Figure 2: EAT configuration in LTE Trial

3.11.3 Agilent tool

The Agilent DNA is a protocol analyzer and can be used for user-plane analysis. It offers a scalable, distributed probing architecture, and the raw data is collected at the S1 level. Owing to the true client-server architecture, each user client is able to test independently. To enable detailed protocol analysis, Hardware Intelligent Packet Processing (HI-PI2) at line rate will be required. At the user plane, the analysis will also require a separation of the signalling and payload packets to be able to correlate and analyze events at both levels.

3.11.4 Wireshark

Wireshark is a network protocol analyzer. Wireshark offers the capability to capture network data and provide metrics on the captured traffic.

3.11.5 SyncroTest

SyncroTest is an automated test tool to control test mobiles (called probes) remotely using a central Master Controller console (see Figure 3: SyncroTest Architecture below). SyncroTest probes control the functionality of LLDM/Agilent, WINDS and FTP to generate traffic and monitor the connections from the probe's point of view. The log data is then sent back to the master controller for analysis. Probes have also been developed to support remote control of EAT so that testing and data collection can be synchronized.

A single SyncroTest master controller can control at least 8 Agilent/LLDM/WINDS probes and an EAT probe simultaneously, eliminating the need for an RF engineer in each drive test vehicle. SyncroTest uses self-healing TCP connections with each probe to direct the probe, and it receives periodic heartbeats from the probe to update the probe's status.
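The probe-supervision pattern described above (a persistent TCP connection per probe, periodic heartbeats, and automatic reconnection when a probe goes silent) can be sketched as follows. SyncroTest's actual protocol, ports and message format are not specified in this plan, so everything below is an illustrative assumption.

    # Minimal sketch of the master-controller heartbeat pattern described above.
    # Host/port and the single-byte heartbeat are assumptions for illustration.
    import socket
    import time

    HEARTBEAT_TIMEOUT_S = 10.0

    def supervise_probe(host: str, port: int) -> None:
        """Keep a TCP connection to one probe, reconnecting if heartbeats stop."""
        while True:
            try:
                with socket.create_connection((host, port), timeout=5) as conn:
                    conn.settimeout(HEARTBEAT_TIMEOUT_S)
                    while True:
                        beat = conn.recv(1)          # probe sends periodic heartbeats
                        if not beat:
                            break                    # probe closed the connection
                        print(f"{time.strftime('%H:%M:%S')} probe {host}:{port} alive")
            except (socket.timeout, OSError):
                print(f"probe {host}:{port} silent or unreachable; reconnecting")
                time.sleep(2)                        # back off, then self-heal

    if __name__ == "__main__":
        supervise_probe("192.0.2.10", 6000)          # placeholder probe address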

Figure 3: SyncroTest Architecture

3.11.6 Data Analysis Tool (eDAT)

eDAT is a post-processing tool that allows a user to analyze RF performance KPIs. eDAT takes input from both UE logs and eNodeB logs. eDAT provides a standardized approach to analyze metrics in the Radio Access Network. eDAT uses the UE Logs generated by Agilent/LLDM as input from the UE perspective. The eNodeB logs are generated by EAT and provide an additional input to eDAT to analyze KPIs from the eNodeB perspective. The following diagram illustrates the eDAT configuration for the LTE Deployment.

Figure 4: eDAT LTE Trial Configuration

Once the data has been collected, eDAT post-processes the UE and eNodeB log files to generate the analyses of the KPIs. eDAT is a standalone tool that does not need to be connected to the fixed infrastructure. The output includes maps, graphs, plots, reports and message decoding. All these can be used to evaluate the RF performance of the LTE Trial network. eDAT makes use of event timestamps and locations of the UEs to geographically plot out the data.

4 Process Overview

4.1 Deployment Site Locations

(Deployment site location maps.)

4.2 Site Readiness

Site Readiness consists of health checks that ensure all cells are operating as required. These procedures are usually performed after deploying a new network or when introducing new cell sites required for a professional service. Once these health checks have been performed and satisfactory performance of all cells can be guaranteed, they are no longer a prerequisite of the RF Optimization.

4.2.1 Spectrum Clearance Verification

The spectrum clearance ensures that no external interference is present and that sufficient guard bands are observed.

The detection of interference can be a very time-consuming and difficult task once the LTE system is up and running. It is desirable to have a very high degree of confidence that the spectrum is cleared prior to any testing.

4.2.2 Antenna Audit

This phase involves a series of quality checks to ensure proper installation of the antenna system. The number of audited cell sites will depend on the customer contract. There is a recommended audit minimum of 25% of cell sites in a cluster. The selection of cell sites must be done with input from the customer.

If more than 50% of the audited antennas uncover installation errors, the remaining antennas in the cluster must also be audited. Based on the results and the confidence level of the antenna installations, the percentage of cell sites to be audited may vary for successive clusters. The audit process consists of various inspections of antenna height, antenna azimuth, antenna type, antenna mechanical down-tilt, cable length, etc.

4.2.3 Sector Verification

The sector tests include verification of basic call processing functions including origination, termination and handover tests. Measurements are made on LTE signal levels to verify that each sector is transmitting with the appropriate power levels and the correct Cell ID. These basic function tests are intended to detect hardware, software, configuration and parameter errors for each cell site in the cluster prior to further drive testing. Sector drives should be executed for each sector in the system or according to contractual obligations. Due to the simple nature of the drives, sector drives do not require customer approval.

4.2.4 Baseline Existing System

The objective of the Baseline Existing System activity is to collect the RF performance metrics of the existing LTE system equipment. Baseline driving should be performed prior to any RF Optimization activity and includes measurement of the Key Performance Indicators. Drive routes and Key Performance Indicators will be the same as the ones used later for System Verification. It is important to keep the drive routes and KPIs identical for performance validation and comparison purposes. Drive routes and KPIs must be agreed upon with the customer.

4.3 RF Optimization Planning

The Optimization planning phase ensures system and tool readiness for RF Optimization before beginning the actual drive testing.

4.3.1 Perform RF Parameter Audit

RF parameters must be inspected for consistency with the LTE parameter catalogue. The RF parameter settings used in the network can be obtained from the NDP project database. These settings are then audited against the LTE parameter catalogue in WPS.

4.3.2 Validate Initial Neighbor Lists

An important step within the RF Optimization preparation phase is associated with the neighbor list verification. The complete neighbor lists in the LTE network are required to compare the neighbor relations with network design plots. Neighbor relations need to be verified for recent updates, validity and appropriateness. The recommended strategy is to have a minimum number of neighbor relations in the neighbor lists. The neighbor lists used in the network can also be obtained from the WPS project database.

4.3.3 Tool Readiness

Appropriate drive test tools and post-processing tools need to be prepared for optimization.

4.3.4 Define Clusters

Approximately 15-19 eNodeBs should be combined into one cluster. The actual number used is based on the network expansion as well as on the topographical environment. The clusters are selected to provide a center eNodeB with two rings of surrounding eNodeBs.

4.3.5 Drive Route Planning

Drive routes need to be defined for Sector Verification, Cluster Optimization and System Verification. Coverage prediction plots, morphology and clusters can define all drive test routes.

The drive route should maintain a distance equal to of the cell site radius for sector verification.

The routes for Cluster Optimization shall consist of major roads, highways and hotspots. Total time to drive all routes in a typical cluster should be approximately 6 to 8 hours.

An additional border route is chosen so that it crosses the cluster borders without going into the cluster areas.

The System Verification drive routes are used to collect the metrics for the Exit Criteria. The routes are a combination of individual clusters.

4.4 RF Optimization Execution

The RF Optimization Execution consists of drive tests, problem area identification, verification drives, and final drives to ensure completion of the Exit Criteria. The core activity is to provide system tuning, as well as data collection and reporting. LTE network optimization will be performed under loaded network conditions.

4.4.1 Cluster Optimization

The Cluster Optimization consists of three phases:

Unloaded Cluster Optimization

Loaded Cluster Optimization

Cluster Performance Verification

During the first Cluster Optimization phase, a measurement drive is performed under unloaded network conditions using the optimization route. Once the data from the first phase are collected, problem spots are identified and optimized. The unloaded drive test identifies coverage holes, handover regions and multiple pilot coverage areas. It also spots overshooting sites from areas belonging to neighbor clusters (as interference is minimal). The first pass might lead to correction of neighbor lists and adjustments of fundamental RF parameters such as transmit powers and/or antenna azimuths and antenna tilts. The drive test information highlights fundamental flaws in the RF design under best-case conditions.

The second Cluster Optimization phase is performed under loaded conditions. The drive routes for the loaded Cluster Optimization will be exactly the same routes as those used for the unloaded measurement drives. Loading the cell will cause an increase of negative SNR values, expose potential coverage holes, result in higher BLER, result in lower mobility throughput, and cause more dropped calls. The objective is to fix the problems observed by the field teams. This involves the fine-tuning of RF parameters such as the transmit power or handover parameters. Antenna re-adjustments (e.g. down-tilts, azimuths, patterns/types or heights) are also occasionally performed.

The Cluster performance is measured against the cluster Exit Criteria. The exit drive's purpose is to verify and confirm the specific Exit Criteria demanded by the customer.

The final statistics from the cluster exit drive are presented to the customer for approval. These statistics contain plots as well as data in tabular form.

4.4.2 System Verification

System Verification is the final phase of the RF Drive Test Based Optimization activity and it focuses specifically on collecting overall performance statistics. It is performed under loaded conditions with all cells activated. System Verification involves fusion of the previously optimized clusters and once again is required to demonstrate that Exit Criteria are met system-wide.

The final statistics from the System Verification are presented to the customer for approval.

4.5 Test Cases

The default test mode in the DL will be open-loop spatial multiplexing (OLSM), and for the UL it will be SIMO. DLLS (down-link load simulator) will be used to generate interference on the DL of the neighboring cells for loading purposes.

4.6 Single User Throughput Test Peak

4.6.1 Single User Downlink Airlink Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited-mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, and under 50% cell loading conditions.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC). See the defined SNR ranges. For the mobility tests, the test UE will be driven according to the predefined drive route.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RBs). For example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Agilent Tool, Backhaul Bandwidth: 100 Mbps, SINR: 17 to 21 dB, Loaded Conditions: 50%.
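As a concrete example of the RB-based loading: the trial uses a single 10 MHz carrier (Section 3.2), which corresponds to 50 downlink resource blocks, so a 50% cell load means DLLS occupies 25 RBs. A small sketch of that arithmetic:

    # Small sketch: number of DL resource blocks DLLS must occupy for a given cell load.
    # 10 MHz LTE = 50 resource blocks; this trial uses a single 10 MHz carrier.
    RBS_PER_BANDWIDTH_MHZ = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

    def rbs_to_occupy(load_fraction: float, bandwidth_mhz: float = 10) -> int:
        total_rbs = RBS_PER_BANDWIDTH_MHZ[bandwidth_mhz]
        return round(load_fraction * total_rbs)

    if __name__ == "__main__":
        for load in (0.0, 0.5, 1.0):
            print(f"{load:.0%} load -> occupy {rbs_to_occupy(load)} of 50 RBs")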


Test Routes:

Near Cell, SNR 17 dB to 21 dB

Terminal Speeds:

Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop-mounted antennas.

1. Ensure that a 500 MB file is available at the servers for downloads

2. Ensure that Tx/Rx antenna correlation is below 20%

3. Ensure that the UE is reporting Rank 2

4. Ensure that the antennas on the test van are cross-polarized

5. Ensure that the TCP window size of the client laptop is set to 512 KB

Procedure:

1. Park the test van at the predetermined location on the NC route.

2. Open Agilent and connect the UE to the Agilent tool.

3. Power on the UE and ensure that the right port is assigned to the UE. Check that GPS is working on Agilent and WINDS.

4. Open LGE LTE CM and click Connect. (Response: the UE starts the attach process.)

5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)

6. Open WINDS UDP and configure the right adapter. Populate the fields with the right values.

7. Start logging on Agilent and click Request on WINDS.

8. Log data for 3 minutes. (Response: UE log files collected.)

9. Stop the WINDS UDP sessions. Stop logging on Agilent.

10. Repeat steps 7-9 for two more runs.

11. Save the UE and WINDS files.

Key Metrics:

1. Physical Layer Downlink Airlink Throughput Peak

2. Application Layer Throughput

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

Expected Result:

Peak throughput should be 60 Mbps.

Expected Test Duration: 0.5 day
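Post-processing the peak-throughput metric reduces to summing the delivered bits over a sliding time window of the logged data. The sketch below is a generic illustration assuming a hypothetical per-TTI log of transport block sizes exported from the UE/eNodeB logging tools; it is not tied to any specific Agilent or eDAT log format.

    # Minimal sketch: peak physical-layer throughput from a per-TTI log of transport
    # block sizes. The log format (one TBS value in bits per 1 ms TTI) is hypothetical.

    def peak_throughput_mbps(tbs_bits_per_tti: list, window_ms: int = 1000) -> float:
        """Peak throughput over any sliding window of `window_ms` TTIs (1 TTI = 1 ms)."""
        if len(tbs_bits_per_tti) < window_ms:
            return sum(tbs_bits_per_tti) / (len(tbs_bits_per_tti) / 1000) / 1e6
        window_sum = sum(tbs_bits_per_tti[:window_ms])
        peak = window_sum
        for i in range(window_ms, len(tbs_bits_per_tti)):
            window_sum += tbs_bits_per_tti[i] - tbs_bits_per_tti[i - window_ms]
            peak = max(peak, window_sum)
        return peak / (window_ms / 1000) / 1e6   # bits per second -> Mbps

    if __name__ == "__main__":
        # Hypothetical example: 3 minutes of TTIs carrying 61,000 bits each (~61 Mbps).
        log = [61000] * (3 * 60 * 1000)
        print(f"Peak throughput: {peak_throughput_mbps(log):.1f} Mbps")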

4.6.2 Single User Uplink Airlink Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited-mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, and under no loading.

Test Description:

Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM).

For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC). See the defined SNR ranges. For the mobility tests, the test UE will be driven according to the predefined drive route.

Agilent Tool, Backhaul Bandwidth: 100 Mbps.

Test Routes:

Near Cell, SNR 17 dB to 21 dB

Terminal Speeds:

Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop-mounted antennas.

1. Ensure that a 500 MB file is available at the client laptop for uploads

2. Ensure that the SIR target is set to 18 dB at the eNodeB

Procedure:

1. Park the test van at the predetermined location on the NC route.

2. Open Agilent and connect the UE to the Agilent tool.

3. Power on the UE and ensure that the right port is assigned to the UE. Check that GPS is working on Agilent and WINDS.

4. Open LGE LTE CM and click Connect. (Response: the UE starts the attach process.)

5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)

6. Open WINDS UDP and configure the right adapter. Populate the fields with the right values.

7. Start logging on Agilent and click Send on WINDS.

8. Log data for 3 minutes. (Response: UE log files collected.)

9. Stop the WINDS UDP sessions. Stop logging on Agilent.

10. Repeat steps 7-9 for two more runs.

11. Save the UE and WINDS files.

Key Metrics:

1. Physical Layer Uplink Airlink Throughput Peak

2. Application Layer Throughput

3. SIR target

4. Initial Block Error Rates

5. Residual Block Error Rates

6. Scheduled Transport Format distribution

Output:

Physical Layer Uplink Airlink Throughput Peak should be recorded.

Expected Result:

Peak throughput should be 20 Mbps.

Expected Test Duration: 0.5 day

4.6.3 Single User Downlink RLC Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited-mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, and under 50% cell loading conditions.

Test Description:

Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM).

For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC). See the defined SNR ranges. For the mobility tests, the test UE will be driven according to the predefined drive route.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RBs). For example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Agilent Tool, Backhaul Bandwidth: 100 Mbps, SINR: 17 to 21 dB, Loaded Conditions: 50%.

Test Routes:

Near Cell, SNR 17 dB to 21 dB

Terminal Speeds:

Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop-mounted antennas.

1. Ensure that a 500 MB file is available at the servers for downloads

2. Ensure that Tx/Rx antenna correlation is below 20%

3. Ensure that the UE is reporting Rank 2

4. Ensure that the antennas on the test van are cross-polarized

5. Ensure that the TCP window size of the client laptop is set to 512 KB

Procedure:

1. Park the test van at the predetermined location on the NC route.

2. Open Agilent and connect the UE to the Agilent tool.

3. Power on the UE and ensure that the right port is assigned to the UE. Check that GPS is working on Agilent and WINDS.

4. Open LGE LTE CM and click Connect. (Response: the UE starts the attach process.)

5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)

6. Open WINDS UDP and configure the right adapter. Populate the fields with the right values.

7. Start logging on Agilent and click Request on WINDS.

8. Log data for 3 minutes. (Response: UE log files collected.)

9. Stop the WINDS UDP sessions. Stop logging on Agilent.

10. Repeat steps 7-9 for two more runs.

11. Save the UE and WINDS files.

Key Metrics:

1. Physical Layer Downlink RLC Throughput Peak

2. Application Layer Throughput

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

Output:

Peak Downlink RLC layer throughput should be recorded.

Expected Result:

Peak throughput should be 55 Mbps.

Expected Test Duration: 0.5 day

4.6.4 Single User Uplink RLC Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited-mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, and under no loading.

Test Description:

Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM).

For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC). See the defined SNR ranges. For the mobility tests, the test UE will be driven according to the predefined drive route.

Agilent Tool, Backhaul Bandwidth: 100 Mbps.

Test Routes:

Near Cell, SNR 17 dB to 21 dB

Terminal Speeds:

Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop-mounted antennas.

1. Ensure that a 500 MB file is available at the client laptop for uploads

2. Ensure that the SIR target is set to 18 dB at the eNodeB

Procedure:

1. Park the test van at the predetermined location on the NC route.

2. Open Agilent and connect the UE to the Agilent tool.

3. Power on the UE and ensure that the right port is assigned to the UE. Check that GPS is working on Agilent and WINDS.

4. Open LGE LTE CM and click Connect. (Response: the UE starts the attach process.)

5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)

6. Open WINDS UDP and configure the right adapter. Populate the fields with the right values.

7. Start logging on Agilent and click Send on WINDS.

8. Log data for 3 minutes. (Response: UE log files collected.)

9. Stop the WINDS UDP sessions. Stop logging on Agilent.

10. Repeat steps 7-9 for two more runs.

11. Save the UE and WINDS files.

Key Metrics:

1. Physical Layer Uplink RLC Throughput Peak

2. Application Layer Throughput

3. SIR target

4. Initial Block Error Rates

5. Residual Block Error Rates

6. Scheduled Transport Format distribution

Output:

Peak Uplink RLC layer throughput should be recorded.

Expected Result:

Peak throughput should be 18 Mbps.

Expected Test Duration: 0.5 day

4.6.5 Single User Downlink Physical Layer Throughput Mean

Test Objectives: Evaluate the average Downlink Physical Layer Throughput in a cluster of 15 to 20 eNodeBs.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM).

For the cluster test, the test UE will be driven according to the predefined drive route. The route definition will cover an SNR distribution that reflects NC, MC and CE conditions.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB). For example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Agilent Tool, Backhaul Bandwidth: 50 Mbps.


Test Routes:

Near Cell, SNR 17 dB to 21 dB

Terminal Speeds:

Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop-mounted antennas.

1. Ensure that a 500 MB file is available at the client laptop for uploads

2. Ensure that the SIR target is set to 18 dB at the eNodeB

Procedure:

1. Park the test van at the predetermined location on the NC route.

2. Open Agilent and connect the UE to the Agilent tool.

3. Power on the UE and ensure that the right port is assigned to the UE. Check that GPS is working on Agilent and WINDS.

4. Open LGE LTE CM and click Connect. (Response: the UE starts the attach process.)

5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)

6. Open WINDS UDP and configure the right adapter. Populate the fields with the right values.

7. Start logging on Agilent and click Send on WINDS.

8. Log data for 3 minutes. (Response: UE log files collected.)

9. Stop the WINDS UDP sessions. Stop logging on Agilent.

10. Repeat steps 7-9 for two more runs.

11. Save the UE and WINDS files.


Key Metrics:

1. Physical Layer Downlink Throughput Mean

2. Application Layer Throughput

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.7 Best Effort Sector Throughput Tests

4.7.1 Downlink Best Effort Sector Throughput

Test Objectives: Evaluate the sector throughput for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description:

Three different scenarios will be tested: Open-Loop Spatial Multiplexing (OLSM), SFBC and SIMO. For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in a (2, 4, 2) configuration: 2 UEs at NC, 4 UEs at MC and 2 UEs at CE locations (shown as 242 in the tables below). The 8 UEs will be placed in four different vans (V1-V4 in the tables below) with 2 UEs in each van. See the defined SNR ranges. Four different sets of (2, 4, 2) configurations will be tested (Loc1-4 in the tables below).

For the mobility tests the test UEs will be driven according to a predefined limited mobility (single sector) drive route.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB). For example, to generate a cell loading of X% (CLX in Tables below), X% of the total DL RBs will be occupied. Three different loading conditions will be used: 0%, 50% and 100%.

Three different scenarios of active Sectors will be tested: only target sector active (1S in Tables below), all three sectors of target cell active (3S), and all cells in the cluster active

All tests will be conducted using the default scheduler setting.

Test Setup:

1. One or more drive test vans will be used with rooftop-mounted antennas

Key Metrics:

1. Physical Layer Throughput

2. Application Layer Throughput (UDP/FTP)

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.7.2 Uplink Best Effort Sector Throughput

Test Objectives: Evaluate the sector throughput for multiple UEs at stationary locations in an embedded sector and for limited mobility drive tests within the same sector. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description: For the stationary tests the test UEs will be located at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE). See the defined SNR ranges. For the mobility tests the test UEs will be driven according to a predefined limited mobility (single sector) drive route. Loading is generated by placing loading UEs in neighboring cells at pre-selected locations. Loading of 100% will result in an IoT of TBD dB in the target cell, while a loading of 50% will result in an IoT of TBD dB in the target cell.

All tests will be conducted with the default scheduler setting.

Test Setup:

1. One or more drive test vans will be used with rooftop-mounted antennas

2. Each stationary location consists of a unique set of (NC, MC, CE) locations

Key Metrics:

1. Physical Layer Throughput

2. Application Layer Throughput (UDP/FTP)

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.7.3 Uplink MU-MIMO Sector Throughput

Test Objectives: Evaluate the sector throughput with MU-MIMO at stationary locations and for limited mobility drive tests. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description: For the stationary tests the test UEs will be located at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE). See the defined SNR ranges. For the mobility tests the test UEs will be driven according to a predefined limited mobility (single sector) drive route.

Loading is generated by placing loading UEs in neighboring cells at pre-selected locations. Loading of 100% will result in an IoT of TBD dB in the target cell, while a loading of 50% will result in an IoT of TBD dB in the target cell.

All tests will be conducted using the default scheduler setting.

MU-MIMO implementation allows for up to 4 UEs paired (2 pairs).

Test Setup:

1. One or more drive test vans will be used with rooftop-mounted antennas

2. Each stationary location consists of a unique set of (NC, MC, CE) locations

Key Metrics:

1. Physical Layer Throughput

2. Application Layer Throughput (UDP/FTP)

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.8 Downlink Scheduler

Test Objectives: Evaluate the Scheduler performance for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under loaded conditions with UDP and FTP applications. Three Scheduler settings will be tested: proportional-fair (PF), conservative (CO), and aggressive (AG).

Test Description:

Settings common to all tests:

Open-Loop Spatial multiplexing (OLSM) mode

100% loading on DL of neighbor cells

For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in four different vans with 2 UEs in each van. The number of active UEs will be increased incrementally to illustrate the scheduling gain. For the stationary cases, each combination of a subset of the 8 UEs will be depicted as a triplet (x, y, z) in the tables below to represent the number of active UEs at each of the three locations. For the cases with a mix of stationary and mobility UEs, each combination of a subset of the 8 UEs will be depicted as a quartet (x, y, z, m), where m denotes the number of mobility UEs. See the defined SNR ranges.

For the mobility tests the test UEs will be driven according to a pre-defined limited mobility (single sector) drive route.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading of 100% will be used for the tests.
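The (x, y, z) and (x, y, z, m) notation above counts the active UEs at each location. As an illustration, and assuming the (2, 4, 2) NC/MC/CE placement of the 8 UEs described in Section 4.7.1, the possible stationary combinations can be enumerated as follows (the actual subset to be executed is defined by the test tables):

    # Sketch: enumerate the (x, y, z) combinations of active stationary UEs, assuming
    # the (2, 4, 2) NC/MC/CE placement of the 8 UEs described in Section 4.7.1.
    from itertools import product

    MAX_PER_LOCATION = {"NC": 2, "MC": 4, "CE": 2}

    def stationary_combinations():
        """All (x, y, z) triplets with at least one active UE."""
        ranges = [range(n + 1) for n in MAX_PER_LOCATION.values()]
        return [combo for combo in product(*ranges) if sum(combo) > 0]

    if __name__ == "__main__":
        combos = stationary_combinations()
        print(f"{len(combos)} possible (NC, MC, CE) combinations")   # 3*5*3 - 1 = 44
        print(combos[:5])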

Test Setup:

1. One or more drive test vans will be used with rooftop-mounted antennas

Key Metrics:

1. Physical Layer Throughput

2. Application Layer Throughput

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.9 Uplink Scheduler

Test Objectives: Evaluate the uplink scheduler performance for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under loaded conditions with UDP and FTP applications. Three Scheduler settings will be tested: proportional-fair (PF), conservative (CO), and aggressive (AG).

Test Description:

For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in four different vans with 2 UEs in each van. The number of active UEs will be increased incrementally to illustrate the scheduling gain. For the stationary cases, each combination of a subset of the 8 UEs will be depicted as a triplet (x, y, z) in the tables below to represent the number of active UEs at each of the three locations. For the cases with a mix of stationary and mobility UEs, each combination of a subset of the 8 UEs will be depicted as a quartet (x, y, z, m), where m denotes the number of mobility UEs. See the defined SNR ranges.

For the mobility tests the test UEs will be driven according to a pre-defined limited mobility (single sector) drive route.

All tests will be executed with 100% UL loading. The uplink loading will be generated by placing loading UEs in the neighboring cells to generate an IoT corresponding to 100% loading in the target sector.

Test Setup:

1. One or more drive test vans will be used with rooftop-mounted antennas

Key Metrics:

1. Physical Layer Throughput

2. Application Layer Throughput

3. Initial Block Error Rates

4. Residual Block Error Rates

5. Scheduled Transport Format distribution

4.10 Latency C-plane

Test Objectives: To assess the control plane latency associated with call setup events.

Test Description:

This test will determine the call setup time. The delay will be measured from the first RACH attempt to the time the UE completes traffic channel setup. This test will be executed with UE in BE and GBR modes. Tests will also be executed to measure mobile terminated connection setup time. These tests will also be executed with UE in BE and GBR modes.

5.5.1 C-Plane Latency

Test Case | Priority | Test Case Description | Call Setup | QoS of the Test UE | Number of UEs
5.5.1.1 | H | UE_Init_NC_BE | UE Initiated | BE | 1
5.5.1.2 | H | UE-Init_NC_QoS | UE Initiated | GBR | 1
5.5.1.3 | H | UE_Term_NC_BE | UE Terminated | BE | 1
5.5.1.4 | H | UE_Term_NC_QoS | UE Terminated | GBR | 1

Procedure:

BE C-Plane Latency

1. Set the log mask for the DM tool to include the debug messages.

2. Initiate a call from the test UE with the UE in BE mode

3. Initiate a call from the network side to the UE with UE in BE mode

4. Repeat 100 times each

QoS C-Plane Latency

5. Initiate a call from the test UE with UE in GBR QoS mode

6. Initiate a call from the network side to the UE with UE in GBR QoS mode

7. Repeat 100 times each
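The call setup time itself is simply the difference between two logged timestamps, from the first RACH attempt to completion of traffic channel setup, summarized over the 100 repetitions. A minimal sketch follows; the event timestamps are hypothetical log fields, not a defined tool output.

    # Minimal sketch: C-plane call setup time per attempt and summary statistics over
    # the repetitions. Event names/timestamps are hypothetical log fields.
    from statistics import mean, median

    def setup_time_ms(first_rach_ts_ms: float, tch_setup_complete_ts_ms: float) -> float:
        """Delay from the first RACH attempt to traffic channel setup completion."""
        return tch_setup_complete_ts_ms - first_rach_ts_ms

    def summarize(setup_times_ms: list) -> dict:
        ordered = sorted(setup_times_ms)
        return {
            "attempts": len(ordered),
            "mean_ms": mean(ordered),
            "median_ms": median(ordered),
            "95th_pct_ms": ordered[max(0, int(0.95 * len(ordered)) - 1)],
        }

    if __name__ == "__main__":
        # Hypothetical example timestamps (milliseconds) for three attempts.
        attempts = [(1000.0, 1085.0), (5000.0, 5102.0), (9000.0, 9078.0)]
        times = [setup_time_ms(rach, done) for rach, done in attempts]
        print(summarize(times))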

Key Metrics (per UE):

1. Call Setup Time

4.11 Latency U-plane

Test Objectives: To assess the end-user experienced latency; to measure the round-trip delay from the time a packet is generated at the IP level to the time a response is received.

Test Description:

This test will be conducted with a total of 8 UEs placed at different sector locations (NC, MC, EC). Ping tests with 32 bytes/1462 bytes will be executed on the test UE (BE/GBR QoS modes) while bi-directional IP traffic will be run on the other UEs to generate DL and UL loading. The number of loading UEs will be varied in the tests.

Procedure:

U-Plane Latency

1. Execute 32 byte Ping tests on the 8 UEs one at a time for 30 seconds each.

2. Execute 32-byte Ping tests on test UE1 with bi-directional IP traffic running on the other loading UEs (3 UEs, 5 UEs and 7 UEs)

3. Repeat steps 1 and 2 with a ping payload size of 1462 bytes.

4. Repeat steps 1-3 with UE in GBR mode
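The per-UE ping runs can be scripted from the data laptop. A minimal sketch is shown below, assuming a Windows laptop (Section 3.9) so the built-in ping's -n/-l options are used; the server address is a placeholder.

    # Minimal sketch: run a timed ping test from the Windows data laptop and pull the
    # round-trip statistics out of the command output. Server address is a placeholder.
    import re
    import subprocess

    def ping_stats(server: str, payload_bytes: int, count: int):
        """Run `count` pings with the given payload; return (min, max, avg) RTT in ms."""
        out = subprocess.run(
            ["ping", "-n", str(count), "-l", str(payload_bytes), server],
            capture_output=True, text=True, check=False,
        ).stdout
        # Windows ping summary line: "Minimum = 27ms, Maximum = 41ms, Average = 30ms"
        m = re.search(r"Minimum = (\d+)ms, Maximum = (\d+)ms, Average = (\d+)ms", out)
        return tuple(int(x) for x in m.groups()) if m else None

    if __name__ == "__main__":
        for size in (32, 1462):                      # payload sizes from the procedure
            stats = ping_stats("192.0.2.50", size, 30)
            print(f"{size}-byte ping (min, max, avg ms):", stats)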

Key Metrics (per UE):

1. Resource Utilization

2. Transport Format Distribution

3. Latency

4. Physical Layer Throughput

5. Application Layer Throughput {UDP}

6. Initial Block Error Rates

7. Residual Block Error Rates

4.12 Quality of Service

Test Objectives: To assess the QoS performance of an LTE UE with VoIP and HTTP applications in various multi-UE loading scenarios

Test Description:

VoIP QoS Test:

This test will be executed with the test UE in VoIP mode. Tests will be executed under different loading conditions. The loading UEs will be executing BE traffic. The number of loading UEs will be varied from 3 to 7 and they will be placed at a Near Cell location.

HTTP QoS Test:

This test will be executed with the test UE in GBR mode running HTTP application. Tests will be executed under different loading conditions. The loading UEs will be executing BE traffic. The number of loading UEs will be varied from 3 to 7 and they will be placed at a Near Cell location.

Procedure:

GBR VOIP QoS Test

1. Configure the test UE's MAC Downlink Scheduler with the following settings:

a. VoIP Flag = True

b. Initial MCS: 5 (QPSK, Code Rate 0.438)

c. HARQ Max Number of Transmissions: 1

2. Configure test UE with the following UL/DL TFT information:

a. Remote IP address/subnet mask

b. Port Range for RTP/RTCP: 10000-10010

c. Protocol: UDP

3. Configure test UE with the following QoS Information:

a. QoS Class Identifier (QCI): 1

b. UL/DL MBR: (not used)

c. UL/DL GBR: (not used)

4. Initiate a call from UE1 to IxChariot to simulate voice traffic. Incrementally add UEs with best effort IP transfers until you have 7 active UEs.

5. Repeat steps 1-4 for the Mid Cell and the Cell Edge geometries

GBR HTTP QoS Test

6. Repeat steps 1-3 and activate IxChariot. Initiate an HTTP session at the Near Cell location. Incrementally add UEs with best effort IP transfers until you have 7 active UEs. Repeat the tests for both the Mid Cell and the Cell Edge locations
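The scheduler, TFT and QoS settings from steps 1-3 can be captured as a single configuration record for the test log. The sketch below uses only the values listed above; the structure is illustrative and is not an eNodeB or tool API.

    # Sketch: the GBR VoIP test configuration from steps 1-3 captured as one record
    # for the test log. The dataclass is illustrative, not an eNodeB or tool API.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class VoipQosTestConfig:
        # MAC downlink scheduler settings (step 1)
        voip_flag: bool = True
        initial_mcs: int = 5                     # QPSK, code rate 0.438
        harq_max_transmissions: int = 1
        # UL/DL TFT information (step 2); remote IP/subnet is a placeholder
        remote_ip_subnet: str = "198.51.100.0/24"
        rtp_rtcp_port_range: Tuple[int, int] = (10000, 10010)
        protocol: str = "UDP"
        # QoS information (step 3)
        qci: int = 1
        ul_dl_mbr: Optional[int] = None          # not used
        ul_dl_gbr: Optional[int] = None          # not used

    if __name__ == "__main__":
        print(VoipQosTestConfig())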

Key Metrics (per UE):

1. Resource Utilization

2. Transport Format Distribution

3. Latency

4. Mean Opinion Score

5. Physical Layer Throughput

6. Application Layer Throughput {UDP}

7. Initial Block Error Rates

8. Residual Block Error Rates

4.13 Coverage Testing

Test Objectives: Validate the coverage for single-UE tests on the pre-selected drive route. Tests will be conducted under interfered and non-interfered conditions for the UDP application.

Test Description: The test UE will be driven according to the pre-selected drive route from Near Cell (NC) to Cell Edge (CE) until the call drops. Uplink (UL) interference is generated by placing loading UEs in neighboring cells at pre-selected locations. DLLS will be used to generate DL interference in the neighboring cells.

Interference of 100% will result in an Interference over Thermal (IoT) of TBD dB in the target device (i.e., cell for UL and UE for DL), while interference of 50% will result in an IoT of TBD dB in the target device. Both UL and DL physical-layer data rate and Signal to Interference plus Noise Ratio (SINR) will be measured and signaling will be recorded in the tests.

Procedure:

UL Tests

1. Set SINR target in neighboring cells to control the power of loading UEs

2. Place loading UEs in neighboring cells at pre-selected locations to generate desired IoT in the target cell

3. Make a UDP Best Effort (BE) call to the target cell on the test UE and measure both UL and DL physical-layer data rates, SINR, and record the signaling messages

4. Place the test UE in a van and drive on a pre-selected route from NC to CE of the target cell until the call drops

5. Repeat steps 1 to 4 for each interference condition

DL Tests

1. Set DLLS in neighboring cells to generate desired DL interference levels

2. Make a UDP BE call on the test UE from the target cell and measure both UL and DL physical-layer data rates, SINR, and record the signaling messages

3. Place the test UE in a van and drive on a pre-selected route from NC to CE of the target cell until the call drops

4. Repeat steps 1 to 3 for each interference condition

Key Metrics:

1. Physical Layer Throughput

2. SINR

3. PDCCH error rate

4.14 Handover

Test Objectives: Evaluate handover performance in the following scenarios:

- Intra-Site (different sectors within one eNodeB)

- Inter-Site (different eNodeBs)

- Loaded and Unloaded Destination eNodeBs

Test Description: The test UE will be driven along two routes:

Handover Route, comprising intra- and inter-eNodeB handovers between 3-4 sectors. On this route, additional UEs will be stationed in each sector along the drive route. These UEs will load both the downlink and uplink of their respective sectors with BE traffic.

Cluster Route, comprising intra- and inter-eNodeB handovers across the entire 10-eNodeB cluster. Only DL loading will be generated, via DLLS, on this route.

Non-guaranteed and guaranteed Quality of Service (QoS) tests will be conducted via Best Effort (BE) and Guaranteed Bit Rate (GBR) QoS classes, respectively.

Application performance will be measured quantitatively and subjectively. While the quantitative measurements are throughput (physical-layer data rate) and latency, subjective performance will be based on the user's perception of the application. For example, the quality of a Voice over IP (VoIP) call can be either clear, choppy but audible, or not audible. Both UL and DL SINR (Signal to Interference plus Noise Ratio) will be measured and signaling messages will be recorded in the tests.
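The 100 ms interruption-time target from Section 2 can be checked from the user-plane logs by measuring the largest gap in packet arrivals around each handover event. A minimal sketch, assuming hypothetical lists of packet arrival timestamps and handover completion times:

    # Minimal sketch: estimate handover interruption time as the largest gap in
    # user-plane packet arrivals around each handover event. The timestamp lists are
    # hypothetical log extracts; 100 ms is the plan's interruption-time target.
    from bisect import bisect_left

    def interruption_ms(arrival_ts_ms: list, handover_ts_ms: float, window_ms: float = 500) -> float:
        """Largest inter-packet gap within +/- window_ms of the handover event."""
        lo = bisect_left(arrival_ts_ms, handover_ts_ms - window_ms)
        hi = bisect_left(arrival_ts_ms, handover_ts_ms + window_ms)
        local = arrival_ts_ms[lo:hi]
        if len(local) < 2:
            return float("nan")
        return max(b - a for a, b in zip(local, local[1:]))

    if __name__ == "__main__":
        # Hypothetical: packets every 10 ms with an 80 ms gap at the handover (t = 1000 ms).
        arrivals = list(range(0, 1000, 10)) + list(range(1070, 2000, 10))
        gap = interruption_ms([float(t) for t in arrivals], 1000.0)
        print(f"Interruption: {gap:.0f} ms (target: 100 ms)")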

Key Metrics:

1. Physical Layer Throughput

2. SINR

3. Latency

4.15 V-Pol vs. Cross-Pol

Test Objectives: Compare the performance of Vertically and Cross polarized antenna configurations

Test Description:

Settings common to all tests:

Open-Loop Spatial multiplexing (OLSM) mode

100% loading on DL of neighbor cells

Stationary

A single UE will be used for all the tests. Tests will be executed in NC, MC and EC locations with UDP and FTP applications on both DL and UL.

For the V-pol (vertically polarized) tests, both the eNodeB and UE antennas will be set to the V-pol configurations. Similarly, for the X-pol (cross polarized) tests, both the eNodeB and UE antennas will be set to the X-pol configurations.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading of 100% will be used for all the tests.

Test Setup:

1. One test van will be used with rooftop-mounted antennas

Procedure:

1. Set the eNB and UE to the V-pol configuration

2. Run DL/UL UDP/FTP tests in each of the NC, MC and EC locations

3. Repeat for the X-pol configuration setting

Key Metrics:

1. Channel Correlation Statistics at UE for DL tests

2. Physical Layer Throughput

3. Application Layer Throughput

4. Initial Block Error Rates

5. Residual Block Error Rates

6. Scheduled Transport Format distribution

Appendix A: Performance Metrics

The following metrics will be collected during the trial execution phase. The list shall include (but is not necessarily limited to):

Air Interface

UE Tx power

RSSI

SINR

BLER

Retransmission statistics (HARQ and RLC)

Transport Format

Number of resource blocks (DL/UL)

Channel rank statistics

MIMO mode (Tx diversity or Spatial Multiplexing)

Serving sector

Location (GPS)

UE Velocity

Throughput

Individual user throughput and aggregated sector throughput

UDP individual user throughput and aggregated sector throughput

TCP individual user throughput and aggregated sector throughput

User statistics (peak rates, average rates, standard deviations)

Latency

U-plane latency

Connection set-up times

Handover interruption time within the same site and across different sites