Summary
The paper proposes a test for constant correlations that allows for breaks at unknown times in the marginal means and variances. Theoretically and in an application to US and German stock returns, we find that not accounting for changes in the marginal moments has severe consequences. This is because incorrect standardization of the series carries over to the sample correlations on which the tests are built. Correcting for variance breaks at unknown times has an asymptotic effect. To discuss adjustments, we tackle the issue more generally by considering partial-sums-based inference on moment properties of unobserved processes that is conducted on the basis of estimated counterparts obtained in a preliminary step. The paper gives a characterization of the conditions under which the effect of filtering does not vanish asymptotically. The analysis extends to models with breaks in parameters at estimated times.
Summary
This study examines high-dimensional forecasting and variable selection via folded-concave penalized regressions. The penalized regression approach leads to sparse estimates of the regression coefficients and allows the dimensionality of the model to be much larger than the sample size. First, we discuss the theoretical aspects of a penalized regression in a time series setting. Specifically, we show the oracle inequality with ultra-high-dimensional time-dependent regressors. Then we show the validity of the penalized regression using two empirical applications. First, we forecast quarterly US gross domestic product data using a high-dimensional monthly data set and the mixed data sampling (MIDAS) framework with penalization. Second, we examine how well the penalized regression screens a hidden portfolio based on a large New York Stock Exchange stock price data set. Both applications show that a penalized regression provides remarkable results in terms of forecasting performance and variable selection.
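The p ≫ n sparsity mechanism described above can be illustrated with a minimal sketch. It uses a plain lasso fitted by coordinate descent as a convex stand-in for the folded-concave (SCAD/MCP) penalties the paper actually studies; the data-generating process, penalty level `lam`, and the helper `lasso_cd` are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: min_b (1/2n)||y - Xb||^2 + lam * ||b||_1.

    A convex stand-in for folded-concave penalties; soft-thresholding
    produces exact zeros, giving sparse coefficient estimates.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n     # per-column mean squares
    r = y.copy()                          # current residual y - X @ b
    for _ in range(n_iter):
        for j in range(p):
            # univariate problem for coordinate j, holding the rest fixed
            rho = X[:, j] @ r / n + col_ss[j] * b[j]
            new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
            r += X[:, j] * (b[j] - new)   # update residual incrementally
            b[j] = new
    return b

rng = np.random.default_rng(1)
n, p = 100, 500                           # far more regressors than observations
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]    # only 5 truly active coefficients
y = X @ beta + rng.normal(size=n)

b_hat = lasso_cd(X, y, lam=0.2)
print(np.flatnonzero(b_hat))              # indices of the selected regressors
```

The penalty zeroes out the vast majority of the 500 candidate coefficients while retaining the active ones, which is the screening behaviour the empirical applications rely on.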
Summary
The two-stage least-squares (2SLS) instrumental-variables (IV) estimator for the parameters in linear models with a single endogenous variable is shown to be identical to an optimal minimum-distance (MD) estimator based on the individual instrument-specific IV estimators. The 2SLS estimator is a linear combination of the individual estimators, with the weights determined by their variances and covariances under conditional homoskedasticity. It is further shown that the Sargan test statistic for overidentifying restrictions is the same as the MD criterion test statistic. This provides an intuitive interpretation of the Sargan test. The equivalence results also apply to the efficient two-step generalized method of moments and robust optimal MD estimators and criterion functions, allowing for general forms of heteroskedasticity. It is further shown how these results extend to the linear overidentified IV model with multiple endogenous variables.
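The 2SLS/MD equivalence described above can be checked numerically. A minimal sketch, assuming a single endogenous regressor, two instruments, and no exogenous controls; the simulated data and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
Z = rng.normal(size=(n, 2))               # two instruments
v = rng.normal(size=n)
u = 0.5 * v + rng.normal(size=n)          # error correlated with x: endogeneity
x = Z @ np.array([1.0, 0.7]) + v          # endogenous regressor
y = 2.0 * x + u                           # true coefficient is 2

# instrument-specific (just-identified) IV estimates: b_j = (z_j'y) / (z_j'x)
d = Z.T @ x
b = (Z.T @ y) / d

# variance matrix of b under homoskedasticity, up to sigma^2
# (which cancels in the optimal MD weights)
S = (Z.T @ Z) / np.outer(d, d)
W = np.linalg.inv(S)                      # optimal MD weight matrix
ones = np.ones(2)
beta_md = (ones @ W @ b) / (ones @ W @ ones)

# standard 2SLS via projection on the instrument space
P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
beta_2sls = (x @ P @ y) / (x @ P @ x)

print(beta_md, beta_2sls)                 # identical up to floating-point error
```

The identity is exact in finite samples, not merely asymptotic: the unknown error variance scales the weight matrix and cancels in the MD ratio.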
Summary
This paper provides asymptotic optimality results for panel unit root tests with covariates by deriving the Gaussian power envelope. The main conclusion is that the use of covariates holds considerable promise in the panel data context, much more so than in the time series context. In fact, the use of covariates not only leads to increased power, but can actually have an order effect on the shrinking neighbourhoods around unity for which power is non-negligible.
Summary
This paper considers a moderately explosive AR(1) process where the autoregressive root approaches unity from the right at a certain rate. We first develop a test for the null of moderate explosiveness under independent and identically distributed errors. We show that the t statistic is asymptotically standard normal regardless of whether the true process is dominated by the stochastic moderately explosive trend or the deterministic nonlinear drift trend. This result is in sharp contrast with the existing literature, wherein nonstandard limiting distributions are obtained under different model assumptions. When the errors are weakly dependent, we show that the t statistic based on a heteroskedasticity and autocorrelation robust standard error follows Student's t distribution in large samples. Monte Carlo simulations show that our tests have satisfactory size and power performance in finite samples. Applying the asymptotic t test to ten major stock indexes in the pre-2008 financial exuberance period, we find that most indexes are only mildly explosive or not explosive at all, which implies that the irrational rise was not as serious as previously thought.
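The t test described above can be sketched for the iid-error case: simulate an AR(1) with a moderately explosive root rho0 = 1 + c/n^alpha, estimate it by OLS with an intercept, and form the usual t ratio for the hypothesized root. The values of c, alpha, and the sample size are illustrative assumptions, and this is not the paper's exact specification (in particular, the HAC correction for weakly dependent errors is omitted):

```python
import numpy as np

rng = np.random.default_rng(42)
n, c, alpha = 400, 2.0, 0.8
rho0 = 1.0 + c / n ** alpha               # moderately explosive null value

# simulate under the null with iid standard normal errors
y = np.empty(n + 1)
y[0] = 0.0
e = rng.normal(size=n)
for t in range(n):
    y[t + 1] = rho0 * y[t] + e[t]

# OLS regression of y_t on (1, y_{t-1})
X = np.column_stack([np.ones(n), y[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
resid = y[1:] - X @ coef
s2 = resid @ resid / (n - 2)
var_coef = s2 * np.linalg.inv(X.T @ X)
rho_hat, se_rho = coef[1], np.sqrt(var_coef[1, 1])

t_stat = (rho_hat - rho0) / se_rho        # compared with N(0, 1) critical values
print(rho_hat, t_stat)
```

Because the regressor grows exponentially under the explosive root, the OLS estimate of rho is extremely precise even at moderate sample sizes, which is what makes a standard normal approximation for the t ratio plausible here.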
Summary
A common concern in the empirical study of auctions is the likely presence of auction-specific factors that are common knowledge among bidders but unobserved by the econometrician. Such unobserved heterogeneity confounds attempts to uncover the underlying structure of demand and information, typically a primary feature of interest in an auction market. Unobserved heterogeneity presents a particular challenge in first-price auctions, where identification arguments rely on the econometrician's ability to reconstruct from observables the conditional probabilities that entered each bidder's equilibrium optimization problem. When bidders condition on unobservables, it is not obvious that this is possible. Here we discuss several approaches to identification developed in recent work on first-price auctions with unobserved heterogeneity. Despite the special challenges of this setting, all of the approaches build on insights developed in other areas of econometrics, including those on control functions, measurement error, and mixture models. Because each strategy relies on different combinations of model restrictions, technical assumptions, and data requirements, their relative attractiveness will vary with the application. However, this varied menu of results suggests both a type of robustness of identifiability and the potential for expanding the frontier with additional work.