Abstract: In this paper, we consider the estimation of a panel data model in which the heterogeneity term is arbitrarily correlated with the covariates and the coefficients are unknown functions of some explanatory variables. The estimator is based on a deviation-from-the-mean transformation of the regression model, after which a local linear regression is applied to estimate the unknown varying coefficient functions. It turns out that the standard use of this technique yields a non-negligible asymptotic bias. To avoid it, we introduce a high-dimensional kernel weight into the estimation procedure. As a consequence, the resulting estimator has a bias that tends to zero asymptotically at the usual nonparametric rate. However, its variance is inflated, and the estimator therefore converges at a very slow rate. To achieve the optimal rate, we propose a one-step backfitting algorithm. The resulting two-step estimator is shown to be asymptotically normal, and its rate of convergence is optimal within its class of smoothness functions. Furthermore, the estimator is oracle efficient. Finally, we report Monte Carlo results that confirm the theoretical findings.
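For concreteness, a minimal sketch of the type of specification and transformation described above; the notation below is an illustrative assumption and is not taken verbatim from the paper. Consider a varying-coefficient panel data model
\[
Y_{it} = X_{it}^{\top} m(Z_{it}) + \mu_i + v_{it}, \qquad i = 1, \dots, N, \quad t = 1, \dots, T,
\]
where $m(\cdot)$ collects the unknown coefficient functions and $\mu_i$ is an individual effect that may be arbitrarily correlated with $(X_{it}, Z_{it})$. The deviation-from-the-mean transformation removes $\mu_i$:
\[
Y_{it} - \bar{Y}_{i} = X_{it}^{\top} m(Z_{it}) - \frac{1}{T} \sum_{s=1}^{T} X_{is}^{\top} m(Z_{is}) + \left( v_{it} - \bar{v}_{i} \right),
\]
with $\bar{Y}_{i} = T^{-1} \sum_{t=1}^{T} Y_{it}$ and $\bar{v}_{i}$ defined analogously. The transformed regression function involves $m(\cdot)$ evaluated at every $Z_{is}$, which is, intuitively, why a standard local linear fit at a single point can incur a non-negligible bias and why the procedure summarized above resorts to a high-dimensional kernel weight followed by a backfitting step.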