Change-point for regression with non-stationary errors

Speaker
Samir Ben Hariz
Speaker's institution
Université du Maine
Date and time of the talk
Location of the talk
Seminar room

We consider the regression model
\begin{equation}
Y_{i}=g(x_{i})+\varepsilon_{i},\qquad i=0,1,2,\ldots,n,
\end{equation}
where the derivative of the regression function has a jump at an unknown location $\theta$. We propose a nonparametric kernel-based estimator of the jump location $\theta$. Assume that $\sup_{\left|i-j\right|\geq k}\left|\operatorname{Cov}\left(\varepsilon_{i},\varepsilon_{j}\right)\right|\leq Ck^{-\rho}$ for some $0<\rho\leq 1$. Under very general conditions, we prove that the estimator converges at the rate $(nh)^{-\rho/2}$, where $h$ is the bandwidth of the kernel. This covers short-range dependent as well as long-range dependent and even non-stationary errors. Finally, we give conditions on the bandwidth $h$ that yield the best rate of convergence. The resulting rate is known to be optimal for i.i.d. errors as well as for long-range dependent (LRD) errors.
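The abstract does not spell out the estimator, so the following is only a minimal illustrative sketch of one common kernel-based construction for a jump in the derivative: compare local linear slopes fitted on one-sided windows of width $h$ (a uniform kernel, for simplicity) and take the candidate point where they differ most. The AR(1) noise, the uniform kernel, and all function names here are assumptions for illustration, not the speaker's method.

```python
# Hypothetical sketch: locate a jump in g' by comparing one-sided local
# linear slopes. Not the talk's estimator, just one standard construction.
import numpy as np

def one_sided_slope(x, y, t, h, side):
    """Local linear slope on (t-h, t] if side=-1, or [t, t+h) if side=+1."""
    mask = (x > t - h) & (x <= t) if side < 0 else (x >= t) & (x < t + h)
    if mask.sum() < 2:
        return np.nan
    # OLS line on the one-sided window; the slope estimates g'(t-) or g'(t+).
    slope, _ = np.polyfit(x[mask], y[mask], 1)
    return slope

def estimate_jump_location(x, y, h):
    """Candidate maximizing the one-sided slope difference |g'(t+) - g'(t-)|."""
    candidates = x[(x > x[0] + h) & (x < x[-1] - h)]
    diffs = np.array([
        abs(one_sided_slope(x, y, t, h, +1) - one_sided_slope(x, y, t, h, -1))
        for t in candidates
    ])
    return candidates[np.nanargmax(diffs)]

# Toy example: g(x) = |x - theta| has a kink (jump in g') at theta = 0.6.
rng = np.random.default_rng(0)
n, h, theta = 1000, 0.05, 0.6
x = np.linspace(0.0, 1.0, n)
# AR(1) errors as a simple stand-in for dependent (here short-range) noise.
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.5 * eps[i - 1] + 0.05 * rng.standard_normal()
y = np.abs(x - theta) + eps
print(estimate_jump_location(x, y, h))  # should be close to 0.6
```

In this sketch the bandwidth $h$ plays the role described in the abstract: it controls how many observations enter each one-sided slope fit, and hence the trade-off behind the rate conditions on $h$.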