Econometrica: Mar 2021, Volume 89, Issue 2

Errors in the Dependent Variable of Quantile Regression Models
pp. 849–873

Jerry Hausman, Haoyang Liu, Ye Luo, Christopher Palmer

We study the consequences of measurement error in the dependent variable of random‐coefficients models, focusing on the particular case of quantile regression. The popular quantile regression estimator of Koenker and Bassett (1978) is biased if there is an additive error term. Approaching this problem as an errors‐in‐variables problem in which the dependent variable suffers from classical measurement error, we present a sieve maximum likelihood approach that is robust to left‐hand‐side measurement error. After providing sufficient conditions for identification, we demonstrate that when the number of knots in the quantile grid is chosen to grow at an adequate speed, the sieve maximum likelihood estimator is consistent and asymptotically normal, permitting inference via bootstrapping. Monte Carlo evidence verifies that our method outperforms quantile regression in mean bias and MSE. Finally, we illustrate our estimator with an application to the returns to education, highlighting changes over time in the returns to education that have previously been masked by measurement‐error bias.
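The bias the abstract describes can be seen in a small simulation. The sketch below is purely illustrative and is not the paper's estimator or Monte Carlo design: it assumes a hypothetical random-coefficient model y* = (1 + u)·x with u ~ Uniform(0, 1), so the true τ-th conditional quantile has slope 1 + τ. Fitting the Koenker–Bassett check loss on the error-free outcome recovers that slope, while adding classical measurement error to the dependent variable compresses the estimated slope toward the median coefficient.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data-generating process (not from the paper):
# y* = (1 + u) * x, u ~ Uniform(0, 1), so the tau-th conditional
# quantile of y* given x is (1 + tau) * x: slope 1 + tau, intercept 0.
rng = np.random.default_rng(0)
n = 20_000
x = rng.uniform(1.0, 2.0, n)
u = rng.uniform(0.0, 1.0, n)
y_star = (1.0 + u) * x                     # error-free outcome
y_obs = y_star + rng.normal(0.0, 1.0, n)   # classical LHS measurement error

def pinball_fit(y, x, tau):
    """Fit (intercept, slope) by minimizing the Koenker-Bassett check loss."""
    def loss(theta):
        resid = y - theta[0] - theta[1] * x
        return np.mean(np.where(resid >= 0.0, tau * resid, (tau - 1.0) * resid))
    return minimize(loss, x0=[0.0, 1.5], method="Nelder-Mead").x

tau = 0.9
_, slope_clean = pinball_fit(y_star, x, tau)
_, slope_noisy = pinball_fit(y_obs, x, tau)
print(f"true slope at tau={tau}: {1 + tau:.2f}")
print(f"QR slope, error-free y: {slope_clean:.2f}")  # near 1.90
print(f"QR slope, noisy y:      {slope_noisy:.2f}")  # pulled toward 1.5
```

In this sketch the noise smears the upper tail of y across observations, so the fitted 0.9-quantile slope understates the true heterogeneity — the attenuation the paper's sieve MLE is designed to correct.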


Supplemental Material

Supplement to "Errors in the Dependent Variable of Quantile Regression Models"

This zip file contains the replication files for the manuscript.


Supplement to "Errors in the Dependent Variable of Quantile Regression Models"

In this appendix, we present implementation details for our maximum-likelihood estimator.
