Experimental Estimation of LTE-A Performance
Abstract
In cellular networks, the emergence of machine-type communications such as connected vehicles increases the demand for uplink transmissions, thus degrading the quality of service per user equipment. Enforcing quality of service in such cellular networks is challenging, as radio phenomena, as well as the mobility and dynamics of users and their devices, are uncontrolled. To address this issue, it is essential to estimate what the quality of a connected user's transmissions will be in the near future. For that purpose, we argue that lower-layer metrics are a key feature whose evolution can help predict the bandwidth available to the considered connections over the following hundreds of milliseconds. This paper describes how a 4G testbed was deployed to investigate uplink throughput prediction at a small time granularity of 100 ms. Based on lower-layer metrics (physical and MAC layers), standard supervised machine learning algorithms, such as Linear Regression and Random Forest, are used to predict the received uplink bandwidth under different radio conditions. This enables an in-depth investigation of the impact of radio impairments on bandwidth prediction. Our evaluation shows that the prediction is highly accurate: at the time granularity of 100 ms, the average prediction error ranges from 6% to 12% across all the scenarios we explored.
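The abstract describes the pipeline only at a high level. The following is a minimal sketch, not the authors' implementation, of how such a throughput predictor could be trained with scikit-learn: the specific lower-layer feature names (CQI, SINR, MCS, PRBs, BSR) and the synthetic data are assumptions standing in for the testbed traces.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

# Hypothetical dataset: one row per 100 ms window of lower-layer metrics
# (e.g., CQI, SINR, MCS, allocated PRBs, buffer status report), with the
# target being the uplink throughput received in the next window.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 5))                                  # placeholder PHY/MAC features
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=5000)   # placeholder throughput

# shuffle=False keeps the chronological order of windows, so the model is
# evaluated on "future" samples it has never seen, as in online prediction.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False
)

for name, model in [
    ("LinearRegression", LinearRegression()),
    ("RandomForest", RandomForestRegressor(n_estimators=100, random_state=0)),
]:
    model.fit(X_train, y_train)
    err = mean_absolute_percentage_error(y_test, model.predict(X_test))
    print(f"{name}: mean absolute percentage error = {err:.1%}")
```

On real traces, the percentage error printed here would correspond to the 6% to 12% average prediction error reported in the abstract; the feature set and train/test split above are illustrative only.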