The autocovariance least squares (ALS) method is a computationally efficient approach for estimating noise covariances in Kalman filters without requiring specific noise models. However, conventional ALS and its variants rely on the classic least mean squares (LMS) criterion, making them highly sensitive to measurement outliers and prone to severe performance degradation.
To overcome this limitation, this paper proposes a novel outlier-robust ALS algorithm, termed ALS-IRLS, based on the iteratively reweighted least squares (IRLS) framework. Specifically, the proposed approach introduces a two-tier robustification strategy. First, an innovation-level adaptive thresholding mechanism is employed to filter out heavily contaminated data. Second, the outlier-contaminated autocovariance is formulated using an \( \epsilon \)-contamination model, where the standard LMS criterion is replaced by the Huber cost function.
The IRLS method is then utilized to iteratively adjust data weights based on estimation deviations, effectively mitigating the influence of residual outliers. Comparative simulations demonstrate that ALS-IRLS reduces the root-mean-square error (RMSE) of noise covariance estimates by over two orders of magnitude compared to standard ALS, and significantly enhances downstream state estimation accuracy, outperforming existing outlier-robust Kalman filters.
Our MATLAB implementation demonstrates how ALS-IRLS addresses the vulnerabilities of standard ALS at two distinct levels:
Before computing the empirical autocovariance, ALS-IRLS applies a median absolute deviation (MAD) robust scale estimator to the innovations \( e_k \). As shown in the detection plot, normal innovations (blue) fall within the adaptive threshold, while the severe outliers (red triangles) injected via the \( \epsilon \)-contamination model are flagged and excluded. This prevents catastrophic inflation of the lag-0 autocovariance \( \hat{C}_{e,0} \).
Fig A: Outlier detection in innovations.
Fig B: IRLS Huber weights assigned to regression equations.
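The detection step can be sketched in a few lines. The snippet below is a Python illustration (the reference implementation is in MATLAB); the gate factor \( k = 3 \) and the Gaussian consistency constant 1.4826 are conventional choices, not values taken from the paper:

```python
import numpy as np

def flag_outliers_mad(e, k=3.0):
    """Flag innovations whose deviation from the median exceeds
    k robust standard deviations, where the scale is estimated via MAD."""
    e = np.asarray(e, dtype=float)
    med = np.median(e)
    mad = np.median(np.abs(e - med))
    sigma = 1.4826 * mad  # consistency factor for Gaussian data
    return np.abs(e - med) > k * sigma

# usage: clean the innovation sequence before forming the
# empirical autocovariance (illustrative synthetic data)
rng = np.random.default_rng(0)
e = rng.normal(0.0, 1.0, 500)
e[[10, 50, 90]] += 25.0            # inject gross outliers
mask = flag_outliers_mad(e)
e_clean = e[~mask]                 # outlier-free innovations
```

Because the MAD is computed from the median rather than the mean, the threshold itself is not dragged upward by the very outliers it is meant to catch.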
Even after initial filtering, residual outliers may still distort the linear regression \( A\theta = b \). We replace the standard \( \ell_2 \)-norm LMS objective with the robust Huber loss function. Optimized via Iteratively Reweighted Least Squares (IRLS), the algorithm dynamically assigns lower weights to contaminated autocovariance equations based on their regression residuals. Normal equations retain a weight near \(1.0\), while corrupted ones are heavily down-weighted, ensuring an unbiased covariance estimate.
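A minimal Python sketch of a Huber-IRLS solver for a generic regression \( A\theta = b \) follows; the helper names, the tuning constant \( \delta = 1.345 \), and the synthetic data are illustrative assumptions, not the paper's MATLAB code:

```python
import numpy as np

def huber_weights(r, delta):
    """Huber weight: 1 inside the delta band, delta/|r| outside."""
    a = np.abs(r)
    w = np.ones_like(a)
    big = a > delta
    w[big] = delta / a[big]
    return w

def irls_huber(A, b, delta=1.345, iters=20, tol=1e-8):
    """Minimize the Huber loss of b - A @ theta via IRLS."""
    theta = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary LS init
    w = np.ones(len(b))
    for _ in range(iters):
        r = b - A @ theta
        # robust scale of the residuals (MAD), guarded against zero
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        w = huber_weights(r / s, delta)
        sw = np.sqrt(w)
        theta_new = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta, w

# usage: 10% of the equations are corrupted; IRLS recovers theta anyway
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 2))
theta_true = np.array([5.0, 3.0])          # stand-in for (Q, R)
b = A @ theta_true + 0.01 * rng.normal(size=200)
b[:20] += 40.0                             # contaminated equations
theta_ls = np.linalg.lstsq(A, b, rcond=None)[0]
theta_irls, w = irls_huber(A, b)
```

After convergence the corrupted rows carry near-zero weight, so the weighted normal equations are driven almost entirely by the clean data.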
We evaluated the algorithm via 100 Monte Carlo trials on a 3rd-order LTI system. Outliers were injected into the measurement model using an \( \epsilon = 15\% \) contamination rate with a magnitude multiplier of \( 8\times \).
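The injection can be reproduced with a small sketch, assuming the common multiplicative form of the \( \epsilon \)-contamination model (each nominal noise sample is inflated by the magnitude multiplier with probability \( \epsilon \)); `contaminate` is an illustrative helper, not part of the released code:

```python
import numpy as np

def contaminate(v, eps=0.15, mult=8.0, seed=None):
    """With probability eps, replace a nominal noise sample by one
    with mult times its standard deviation (epsilon-contamination)."""
    rng = np.random.default_rng(seed)
    v = np.asarray(v, dtype=float).copy()
    hit = rng.random(v.shape) < eps
    v[hit] *= mult
    return v, hit

# usage: nominal measurement noise with R_v = 3, as in the simulations
rng = np.random.default_rng(2)
v = rng.normal(0.0, np.sqrt(3.0), 2000)
v_c, hit = contaminate(v, eps=0.15, mult=8.0, seed=3)
```

The contaminated sequence has a sharply inflated sample variance, which is exactly what corrupts the empirical autocovariances that standard ALS consumes.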
Left panel: The joint scatter plot of estimated \( (\hat{Q}_w, \hat{R}_v) \). Standard ALS estimates (blue dots) are highly dispersed and severely biased towards inflated values due to the outlier contamination. Conversely, the proposed ALS-IRLS estimates (red dots) are tightly clustered around the true covariance values (black asterisk, \( Q=5, R=3 \)).
Right panel: The marginal histogram of the estimated process noise covariance \( \hat{Q}_w \). Standard ALS fails outright, spreading its estimates far beyond the true value, whereas ALS-IRLS consistently recovers the correct noise statistics with very low variance.
To test the limits of the algorithm, we swept the outlier rate \( \epsilon \) from \( 0\% \) up to \( 30\% \). As \( \epsilon \) grows, the RMSE of standard ALS increases rapidly because the contaminated data come to dominate the least-squares solution. ALS-IRLS, by contrast, maintains a near-constant low RMSE across all tested contamination levels, consistent with the 50% theoretical breakdown point inherited from the MAD-based scale estimator.
Fig C: Covariance RMSE vs. Contamination Rate (\( \epsilon \)).
The ultimate goal of covariance estimation is to supply accurate parameters to the Kalman filter. In an outlier-free evaluation phase, we therefore compared state estimation RMSE across five filtering paradigms.
Because the true noise statistics are unknown to them, filter-level robust methods (such as the Student-t KF [1] and the MCKF [2]) operating with misspecified noise covariances suffer severe RMSE inflation. The KF driven by standard ALS performs worst of all, owing to the catastrophic inflation of \( \hat{Q} \).
By accurately identifying the true noise statistics online, KF + ALS-IRLS (red bar) substantially outperforms all non-Oracle baselines, achieving an RMSE within 9% of the ideal Oracle Lower Bound (black bar).
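To illustrate how the identified covariances plug into the downstream filter, the sketch below runs a textbook Kalman filter on a hypothetical constant-velocity system (not the paper's 3rd-order plant); the `Q` and `R` passed in stand in for the ALS-IRLS estimates \( \hat{Q} \) and \( \hat{R} \):

```python
import numpy as np

def kalman_filter(F, H, Q, R, ys, x0, P0):
    """Standard Kalman filter; Q and R would be supplied by ALS-IRLS."""
    x, P = x0.astype(float), P0.astype(float)
    I = np.eye(len(x0))
    xs = []
    for y in ys:
        x = F @ x                        # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R              # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ (y - H @ x)          # update
        P = (I - K @ H) @ P
        xs.append(x.copy())
    return np.array(xs)

# hypothetical 2-state position/velocity plant (illustrative only)
rng = np.random.default_rng(4)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.1 * np.eye(2)                      # stand-in for the estimated Q_hat
R = np.array([[3.0]])                    # stand-in for the estimated R_hat
x_true, truth, ys = np.zeros(2), [], []
for _ in range(300):
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    truth.append(x_true.copy())
    ys.append(H @ x_true + rng.normal(0.0, np.sqrt(3.0), 1))
xs = kalman_filter(F, H, Q, R, ys, np.zeros(2), 10.0 * np.eye(2))
truth = np.array(truth)
rmse_kf = np.sqrt(np.mean((xs[:, 0] - truth[:, 0]) ** 2))
rmse_raw = np.sqrt(np.mean((np.array(ys)[:, 0] - truth[:, 0]) ** 2))
```

With correctly identified \( (Q, R) \) the filter's position RMSE falls well below the raw measurement noise, which is the mechanism behind the near-Oracle performance of KF + ALS-IRLS.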
@article{li2026outlier,
title={Outlier-robust Autocovariance Least Square Estimation via Iteratively Reweighted Least Square},
author={Li, Jiahong and Deng, Fang},
journal={arXiv preprint arXiv:2603.08158},
year={2026}
}