ON ROBUST SIGNAL RECOVERY FROM INDIRECT OBSERVATIONS

A linear inverse problem with uncertainty is considered, in which an unknown signal must be recovered from noisy indirect observations. Properties of robust polyhedral estimates are studied for the cases of bounded and sparse contamination, and it is shown how such estimates can be constructed by means of convex optimization procedures.
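To illustrate the kind of convex program involved, here is a minimal sketch (not the paper's actual construction) of a polyhedral-style estimate with identity contrasts: recover a signal from observations y = Ax + noise by minimizing the l_inf residual over an l1 ball, cast as a linear program. The sensing matrix, radius R, and function name are illustrative assumptions; SciPy's `linprog` stands in for the CVX/MOSEK solvers cited in the references.

```python
import numpy as np
from scipy.optimize import linprog

def polyhedral_estimate(A, y, R):
    """Minimize ||y - A w||_inf subject to ||w||_1 <= R, as an LP.

    LP variables z = [w (n), u (n), t (1)], where u >= |w| componentwise
    and t >= ||y - A w||_inf; the objective is simply t.
    """
    m, n = A.shape
    c = np.zeros(2 * n + 1)
    c[-1] = 1.0  # minimize t

    Z = np.zeros((m, n))
    I = np.eye(n)
    ones_t = np.ones((m, 1))
    #  A w - t <= y  and  -A w - t <= -y   encode  |y - A w| <= t
    row1 = np.hstack([A, Z, -ones_t])
    row2 = np.hstack([-A, Z, -ones_t])
    #  w - u <= 0  and  -w - u <= 0        encode  |w| <= u
    row3 = np.hstack([I, -I, np.zeros((n, 1))])
    row4 = np.hstack([-I, -I, np.zeros((n, 1))])
    #  sum(u) <= R                          encodes ||w||_1 <= R
    row5 = np.hstack([np.zeros(n), np.ones(n), [0.0]]).reshape(1, -1)

    A_ub = np.vstack([row1, row2, row3, row4, row5])
    b_ub = np.concatenate([y, -y, np.zeros(2 * n), [R]])
    bounds = [(None, None)] * n + [(0, None)] * n + [(0, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n], res.fun  # estimate and optimal l_inf residual

# Toy instance: a 1-sparse signal observed through a random sensing matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 8))
x = np.zeros(8); x[2] = 1.5
y = A @ x  # noiseless observations, so the true x is feasible with residual 0
w_hat, t_opt = polyhedral_estimate(A, y, R=np.abs(x).sum())
```

Since the true signal is feasible here, the optimal residual is zero; with contaminated observations one would instead obtain a residual reflecting the noise level.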

References

  1. Juditsky A., Nemirovski A. On polyhedral estimation of signals via indirect observations // Electronic Journal of Statistics. 2020. V. 14. No. 1. P. 458–502.
  2. Juditsky A., Nemirovski A. Statistical Inference via Convex Optimization. Princeton: Princeton University Press, 2020.
  3. Tukey J.W. A survey of sampling from contaminated distributions / Contributions to Probability and Statistics. 1960. P. 448–485.
  4. Huber P.J. Robust Statistics. New York: Wiley, 1981.
  5. Yu C., Yao W. Robust linear regression: A review and comparison // Communications in Statistics – Simulation and Computation. 2017. V. 46. No. 8. P. 6261–6282.
  6. Polyak B.T., Tsypkin Ya.Z. Adaptive estimation algorithms (convergence, optimality, stability) // Avtomat. i Telemekh. 1979. No. 3. P. 71–84.
  7. Polyak B.T., Tsypkin Ya.Z. Robust identification // Automatica. 1980. V. 16. No. 1. P. 53–63.
  8. Polyak B.T., Tsypkin Ya.Z. Robust pseudogradient adaptation algorithms // Avtomat. i Telemekh. 1980. No. 10. P. 91–97.
  9. Polyak B.T., Tsypkin Ya.Z. Optimal and robust methods for unconditional optimization // IFAC Proceedings Volumes. 1981. V. 14. No. 2. P. 519–523.
  10. Polyak B.T., Tsypkin Ya.Z. Criterial algorithms of stochastic optimization // Avtomat. i Telemekh. 1984. No. 6. P. 95–104.
  11. Polyak B.T., Tsypkin Ya.Z. Optimal recurrent algorithms for identification of nonstationary plants // Computers & Electrical Engineering. 1992. V. 18. No. 5. P. 365–371.
  12. Chen Y., Caramanis C., Mannor Sh. Robust sparse regression under adversarial corruption // International Conference on Machine Learning. PMLR, 2013. P. 774–782.
  13. Balakrishnan S., Du S.S., Li J., Singh A. Computationally efficient robust sparse estimation in high dimensions // Conference on Learning Theory. PMLR, 2017. P. 169–212.
  14. Diakonikolas I., Kong W., Stewart A. Efficient algorithms and lower bounds for robust linear regression // Proceedings of the Thirtieth Annual ACM–SIAM Symposium on Discrete Algorithms. SIAM, 2019. P. 2745–2754.
  15. Liu L., Shen Y., Li T., Caramanis C. High dimensional robust sparse regression // International Conference on Artificial Intelligence and Statistics. PMLR, 2020. P. 411–421.
  16. Minsker S., Ndaoud M., Wang L. Robust and tuning-free sparse linear regression via square-root slope // SIAM J. Math. Data Sci. 2024. V. 6. No. 2. P. 428–453.
  17. Foygel R., Mackey L. Corrupted sensing: Novel guarantees for separating structured signals // IEEE Transactions on Information Theory. 2014. V. 60. No. 2. P. 1223–1247.
  18. Dalalyan A., Thompson Ph. Outlier-robust estimation of a sparse linear model using l1-penalized Huber’s M-estimator // Advances in Neural Information Processing Systems. 2019. V. 32.
  19. Bruce A.G., Donoho D.L., Gao H.-Y., Martin R.D. Denoising and robust nonlinear wavelet analysis // Wavelet Applications. SPIE, 1994. V. 2242. P. 325–336.
  20. Sardy S., Tseng P., Bruce A. Robust wavelet denoising // IEEE Transactions on Signal Processing. 2001. V. 49. No. 6. P. 1146–1152.
  21. Diakonikolas I., Kane D.M. Algorithmic High-Dimensional Robust Statistics. Cambridge: Cambridge University Press, 2023.
  22. Juditsky A., Nemirovski A. Near-optimality of linear recovery from indirect observations // Mathematical Statistics and Learning. 2018. V. 1. No. 2. P. 171–225.
  23. Donoho D.L. Statistical estimation and optimal recovery // The Annals of Statistics. 1994. V. 22. No. 1. P. 238–270.
  24. Juditsky A.B., Nemirovski A.S. Nonparametric estimation by convex programming // The Annals of Statistics. 2009. V. 37. No. 5A. P. 2278–2300.
  25. Juditsky A., Nemirovski A. Near-optimality of linear recovery in Gaussian observation scheme under ∥·∥2-loss // The Annals of Statistics. 2018. V. 46. No. 4. P. 1603–1629.
  26. Grant M., Boyd S. The CVX Users’ Guide. Release 2.1, 2014. https://web.cvxr.com/cvx/doc/CVX.pdf
  27. Micchelli C.A., Rivlin T.J. A Survey of optimal recovery / Optimal Estimation in Approximation Theory. Micchelli C.A., Rivlin T.J. (Eds.). Boston, MA: Springer, 1977. P. 1–54.
  28. Micchelli C.A., Rivlin T.J. Lectures on optimal recovery / Numerical Analysis Lancaster 1984. Lecture Notes in Mathematics. Turner P.R. (Ed.). Berlin–Heidelberg: Springer, 1985. V. 1129. P. 21–93.
  29. Chernousko F.L. Estimation of the Phase State of Dynamical Systems. Moscow: Nauka, 1988.
  30. Fogel E., Huang Y.-F. On the value of information in system identification-bounded noise case // Automatica. 1982. V. 18. No. 2. P. 229–238.
  31. Granichin O.N., Polyak B.T. Randomized Algorithms of Estimation and Optimization under Almost Arbitrary Noise. Moscow: Nauka, 2003.
  32. Kurzhanski A.B. Identification – a theory of guaranteed estimates // From Data to Model. Willems J.C. (Ed.). Berlin–Heidelberg: Springer, 1989. P. 135–214.
  33. Kurzhanski A., Vályi I. Ellipsoidal Calculus for Estimation and Control. Boston, MA: Birkhäuser, 1997.
  34. Milanese M., Vicino A. Optimal estimation theory for dynamic systems with set membership uncertainty: An overview // Automatica. 1991. V. 27. No. 6. P. 997–1009.
  35. Schweppe F.C. Uncertain Dynamic Systems. Englewood Cliffs, NJ: Prentice-Hall, 1973.
  36. Juditsky A., Nemirovski A. On design of polyhedral estimates in linear inverse problems // SIAM Journal on Mathematics of Data Science. 2024. V. 6. No. 1. P. 76–96.
  37. Candes E.J., Tao T. Decoding by linear programming // IEEE Transactions on Information Theory. 2005. V. 51. No. 12. P. 4203–4215.
  38. Bickel P.J., Ritov Ya., Tsybakov A.B. Simultaneous analysis of Lasso and Dantzig selector // The Annals of Statistics. 2009. V. 37. No. 4. P. 1705–1732.
  39. Donoho D.L., Huo X. Uncertainty principles and ideal atomic decomposition // IEEE Transactions on Information Theory. 2001. V. 47. No. 7. P. 2845–2862.
  40. van de Geer S. Estimation and Testing under Sparsity. Cham: Springer, 2016.
  41. MOSEK ApS. The MOSEK optimization toolbox for MATLAB manual. Version 8.0, 2015. http://docs.mosek.com/8.0/toolbox/

版权所有 © The Russian Academy of Sciences, 2025