Volume 19, 2015
Pages: 1 - 23
Published online: 17 March 2015
Iterative isotonic regression
1 Université Rennes 2, INRIA and IRMAR, Campus de
2 Los Alamos National Laboratory, Los Alamos, NM 87545, USA
3 Université Rennes 2, Campus de Villejean, 35043 Rennes, France
Revised: 23 January 2014
This article explores some theoretical aspects of a recent nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition, which states that a function of bounded variation can be written as the sum of a non-decreasing function and a non-increasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main result of this paper states that the estimator is consistent if the number of iterations k_n grows appropriately with the sample size n. The proof requires two auxiliary results that are of independent interest: first, we generalize the well-known consistency property of isotonic regression to the framework of a non-monotone regression function, and second, we relate the backfitting algorithm to von Neumann's algorithm in convex analysis. We also analyse how the algorithm can be stopped in practice using a data-splitting procedure.
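The core idea described in the abstract can be sketched in a few lines: isotonic regression is the L2 projection onto non-decreasing sequences (computed by the Pool Adjacent Violators Algorithm), and the I.I.R. iteration alternates an isotonic fit and an antitonic fit on the residuals, backfitting-style. The sketch below is illustrative only, assuming equally spaced design points and a fixed iteration count; the function names (`pava`, `iir`) and the stopping rule are not from the paper, which selects the number of iterations by data splitting.

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: L2 projection onto non-decreasing sequences."""
    sums, counts = [], []          # block sums and block sizes
    for v in np.asarray(y, dtype=float):
        sums.append(v)
        counts.append(1)
        # merge blocks while the previous block mean exceeds the last one
        while len(sums) > 1 and sums[-2] * counts[-1] > sums[-1] * counts[-2]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    return np.concatenate([np.full(c, s / c) for s, c in zip(sums, counts)])

def iir(y, n_iter=10):
    """Iterative Isotonic Regression sketch: backfit a non-decreasing
    component u and a non-increasing component d so that u + d fits y."""
    y = np.asarray(y, dtype=float)
    u = np.zeros_like(y)           # non-decreasing part
    d = np.zeros_like(y)           # non-increasing part
    for _ in range(n_iter):
        u = pava(y - d)            # isotonic fit to the current residual
        d = -pava(-(y - u))        # antitonic fit via a sign flip
    return u + d, u, d
```

For a monotone signal the first isotonic pass already absorbs everything, so the antitonic part stays at zero; for a signal with one increasing and one decreasing stretch, the two components split the variation between them.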
Mathematics Subject Classification: 52A05 / 62G08 / 62G20
Key words: Nonparametric statistics / isotonic regression / additive models / metric projection onto convex cones
© EDP Sciences, SMAI, 2015