After four years of rapid deployment in the US, 5G would be expected to have significantly improved the performance and overall user experience of mobile networks. However, recent measurement studies have focused either on static performance or on a single aspect of 5G under driving conditions (e.g., handovers), and thus do not provide a complete picture of how cellular networks perform today under driving conditions – a major use case of mobile networks.
Researchers at WIoT, in collaboration with Purdue University, conducted the first in-depth measurement study of the mobile networks of all three major US carriers while driving across the continental US (5,700+ km, from Los Angeles, CA to Boston, MA). The study spans all cellular technologies available today (LTE/LTE-A, 5G low-band/mid-band/mmWave), all layers of the protocol stack, and multiple 5G “killer” apps (augmented reality, connected autonomous vehicles, 360° video streaming, and mobile cloud gaming). The measurement dataset is publicly available.
The results show disappointingly low and fragmented 5G coverage and poor network performance, even in areas with full high-speed 5G coverage, which in turn results in poor user QoE for major “5G killer” apps compared to static conditions. While high-speed 5G can improve the worst-case performance of these apps compared to LTE, and the combination of 5G and edge computing can boost performance further, QoE remains disappointingly low and often no better than over LTE. On the other hand, the study reveals significant diversity in cellular network performance across operators at a given location and time, suggesting that performance under driving conditions can benefit significantly from multi-connectivity solutions that aggregate links from multiple operators. The work also examines how per-technology coverage and performance correlate with geo-location and vehicle speed, and analyzes the impact of several lower-layer key performance indicators (KPIs) on network performance.