We consider the regression model\begin{equation} Y_{i}=g(x_{i})+\varepsilon _{i},\quad i=0,1,2,\ldots,n, \end{equation}where the derivative of the regression function $g$ has a jump at an unknown location $\theta$. We propose a nonparametric kernel-based estimator of the jump location $\theta$. Assume that $\sup_{\left| i-j\right| \geq k}\left| \operatorname{Cov}\left( \varepsilon _{i},\varepsilon _{j}\right) \right|\leq Ck^{-\rho }$ for some $0<\rho \leq 1$. Under very general conditions, we prove that the estimator converges at the rate $(nh)^{-\rho/2}$, where $h$ is the bandwidth of the kernel. This covers short-range dependent, long-range dependent, and even non-stationary errors. Finally, we give conditions on the bandwidth $h$ that yield the best rate of convergence. The resulting rate is known to be optimal for i.i.d. errors as well as for long-range dependent (LRD) errors.
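As a rough illustration of the idea (not the paper's exact estimator), a jump in the derivative $g'$ at $\theta$ can be located by comparing one-sided slope estimates over windows of width $h$ on either side of each candidate point and maximizing their difference. The following minimal sketch, with hypothetical names `local_slope` and `jump_location` and a simple least-squares slope in place of a kernel-weighted one, shows the mechanism on synthetic data:

```python
import numpy as np

def local_slope(x, y, lo, hi):
    """Least-squares slope of y on x restricted to the window [lo, hi]."""
    m = (x >= lo) & (x <= hi)
    if m.sum() < 2:
        return 0.0
    xs, ys = x[m], y[m]
    xc = xs - xs.mean()
    return float(np.dot(xc, ys) / np.dot(xc, xc))

def jump_location(x, y, h):
    """Estimate the jump point of g' by maximizing the difference between
    the right-sided and left-sided slope estimates over candidate points."""
    cands = x[(x >= x.min() + h) & (x <= x.max() - h)]
    diffs = [abs(local_slope(x, y, t, t + h) - local_slope(x, y, t - h, t))
             for t in cands]
    return float(cands[int(np.argmax(diffs))])

# Synthetic example: g(x) = |x - 0.5| has a derivative jump at theta = 0.5
# (slope -1 to the left, +1 to the right).
rng = np.random.default_rng(0)
n, h, theta = 2000, 0.05, 0.5
x = np.linspace(0.0, 1.0, n)
y = np.abs(x - theta) + 0.05 * rng.standard_normal(n)
print(jump_location(x, y, h))  # close to 0.5
```

The candidate grid is trimmed by $h$ at both ends so each one-sided window stays inside the design; in practice the paper's kernel and bandwidth conditions govern how $h$ should shrink with $n$ to achieve the $(nh)^{-\rho/2}$ rate.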