An increasing number of adaptive protocols in wireless communication networks use training data to learn optimal parameter choices for adaptation. For instance, several recent papers have studied link adaptation protocols driven by context information such as node velocity and SNR. However, the embedded sensors that provide this context information frequently report erroneous values; GPS errors and accelerometer lag, for example, produce incorrect motion estimates. As a result, the relationship between context information and optimal parameter choices that the adaptive algorithm attempts to learn becomes corrupted. In this paper, we propose an outlier detection algorithm, based on an alternating minimization approach, that detects context information corrupted by such system errors. To evaluate its performance, we apply the proposed algorithm to a link-level context-aware rate adaptation system. Numerical results on emulated channels and in-field tests demonstrate that the proposed algorithm increases the prediction accuracy of the optimal transmission mode by 87% and the throughput by 18%.
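The alternating-minimization idea named above can be illustrated with a minimal sketch. The abstract does not give the paper's exact formulation, so the following is an assumed toy instance: the context-to-parameter relation is modeled as a line, and the algorithm alternates between (1) fitting the model on the samples currently presumed to be inliers and (2) re-flagging the k samples with the largest residuals as outliers, until the flagged set stops changing. The data generation, the linear model, and the known outlier budget k are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic context-to-parameter data: y = 2*x + 1 plus small noise,
# with a few samples corrupted to mimic faulty sensor readings.
n, n_out = 100, 5
x = rng.uniform(0, 10, n)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, n)
corrupt = rng.choice(n, n_out, replace=False)
y[corrupt] += rng.uniform(8, 12, n_out)  # large sensor-style errors

def alt_min_outliers(x, y, k, iters=20):
    """Alternating minimization: fit a line on presumed inliers,
    then re-flag the k samples with the largest residuals."""
    outliers = np.zeros(len(x), dtype=bool)
    for _ in range(iters):
        inliers = ~outliers
        # Step 1: least-squares fit of slope/intercept on inliers only.
        A = np.vstack([x[inliers], np.ones(inliers.sum())]).T
        coef, *_ = np.linalg.lstsq(A, y[inliers], rcond=None)
        # Step 2: recompute residuals over ALL samples and flag the
        # k largest as outliers.
        resid = np.abs(y - (coef[0] * x + coef[1]))
        new = np.zeros(len(x), dtype=bool)
        new[np.argsort(resid)[-k:]] = True
        if np.array_equal(new, outliers):  # converged
            break
        outliers = new
    return outliers

flagged = alt_min_outliers(x, y, k=n_out)
print(sorted(np.flatnonzero(flagged)))
```

Because each step (refitting, re-flagging) cannot increase the inlier fitting error, the loop converges quickly in practice; the cleaned training set can then be fed to the downstream rate-adaptation learner.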