Econometrica, Volume 87, Issue 4 (July 2019)
An Equilibrium Model of the African HIV/AIDS Epidemic
Jeremy Greenwood, Philipp Kircher, Cezar Santos, Michèle Tertilt
Twelve percent of the Malawian population is HIV infected. Eighteen percent of sexual encounters are casual. A condom is used a third of the time. To analyze the Malawian epidemic, a choice‐theoretic general equilibrium search model is constructed. In the developed framework, people select between different sexual practices while knowing the inherent risk. The calibrated model is used to study several policy interventions, namely, ART, circumcision, better condoms, and the treatment of other STDs. The efficacy of public policy depends upon the induced behavioral changes and equilibrium effects. The framework complements the insights from epidemiological studies and small‐scale field experiments.
Preferences for Truth-telling
Johannes Abeler, Daniele Nosenzo, Collin Raymond
Private information is at the heart of many economic activities. For decades, economists have assumed that individuals are willing to misreport private information if this maximizes their material payoff. We combine data from 90 experimental studies in economics, psychology, and sociology, and show that, in fact, people lie surprisingly little. We then formalize a wide range of potential explanations for the observed behavior, identify testable predictions that can distinguish between the models, and conduct new experiments to do so. Our empirical evidence suggests that a preference for being seen as honest and a preference for being honest are the main motivations for truth‐telling.
The Macroeconomic Impact of Microeconomic Shocks: Beyond Hulten's Theorem
David Rezza Baqaee, Emmanuel Farhi
We provide a nonlinear characterization of the macroeconomic impact of microeconomic productivity shocks in terms of reduced‐form nonparametric elasticities for efficient economies. We also show how microeconomic parameters are mapped to these reduced‐form general equilibrium elasticities. In this sense, we extend the foundational theorem of Hulten (1978) beyond the first order to capture nonlinearities. Key features ignored by first‐order approximations that play a crucial role are: structural microeconomic elasticities of substitution, network linkages, structural microeconomic returns to scale, and the extent of factor reallocation. In a business‐cycle calibration with sectoral shocks, nonlinearities magnify negative shocks and attenuate positive shocks, resulting in an aggregate output distribution that is asymmetric (negative skewness), fat‐tailed (excess kurtosis), and has a negative mean, even when shocks are symmetric and thin‐tailed. Average output losses due to short‐run sectoral shocks are an order of magnitude larger than the welfare cost of business cycles calculated by Lucas (1987). Nonlinearities can also cause shocks to critical sectors to have disproportionate macroeconomic effects, almost tripling the estimated impact of the 1970s oil shocks on world aggregate output. Finally, in a long‐run growth context, nonlinearities, which underpin Baumol's cost disease via the increase over time in the sales shares of low‐growth bottleneck sectors, account for a 20 percentage point reduction in aggregate TFP growth over the period 1948–2014 in the United States.
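For reference, the first-order benchmark that the paper generalizes is Hulten's (1978) theorem: to a first order, the elasticity of aggregate output with respect to a producer's productivity equals that producer's sales share of GDP (its Domar weight). A standard statement:

```latex
% Hulten (1978), first order: the impact of a productivity shock to
% producer i on aggregate output Y is summarized by the Domar weight
% \lambda_i, i.e. producer i's sales p_i y_i divided by GDP.
\frac{\mathrm{d}\ln Y}{\mathrm{d}\ln A_i} \;=\; \lambda_i \;\equiv\; \frac{p_i y_i}{\mathrm{GDP}}
```

The abstract's point is that beyond this first order, microeconomic elasticities of substitution, network linkages, returns to scale, and factor reallocation all matter.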
Retirement Financing: An Optimal Reform Approach
Roozbeh Hosseini, Ali Shourideh
We study Pareto optimal policy reforms aimed at overhauling retirement financing as an integral part of the tax and transfer system. Our framework for policy analysis is a heterogeneous‐agent overlapping‐generations model that performs well in matching the aggregate and distributional features of the U.S. economy. We present a test of Pareto optimality that identifies the main source of inefficiency in the status quo policies. Our test suggests that lack of asset subsidies late in life is the main source of inefficiency when annuity markets are incomplete. We solve for Pareto optimal policy reforms and show that progressive asset subsidies provide a powerful tool for Pareto optimal reforms. On the other hand, earnings tax reforms do not always yield efficiency gains. We implement our Pareto optimal policy reform in an economy that features demographic change. The reform reduces the present discounted value of net resources consumed by each generation by about 7 to 11 percent in the steady state. These gains amount to a one‐time lump‐sum transfer to the initial generation equal to 10.5 percent of GDP.
Inference in Group Factor Models With an Application to Mixed-Frequency Data
E. Andreou, P. Gagliardini, E. Ghysels, M. Rubin
We derive asymptotic properties of estimators and test statistics to determine—in a grouped data setting—common versus group‐specific factors. Despite the fact that our test statistic for the number of common factors, under the null, involves a parameter at the boundary (related to unit canonical correlations), we derive a parameter‐free asymptotic Gaussian distribution. We show how the group factor setting applies to mixed‐frequency data. As an empirical illustration, we address the question whether Industrial Production (IP) is still the dominant factor driving the U.S. economy using a mixed‐frequency data panel of IP and non‐IP sectors. We find that a single common factor explains 89% of IP output growth and 61% of total GDP growth despite the diminishing role of manufacturing.
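The test described above builds on canonical correlations between group-level factor estimates: a unit canonical correlation signals a factor common to both groups. A minimal simulation sketch of that idea (illustrative names and dimensions, not the authors' estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 500, 50  # time periods, series per group

# Two groups share ONE common factor and each has one specific factor.
g = rng.standard_normal(T)    # common factor
f1 = rng.standard_normal(T)   # group-1 specific factor
f2 = rng.standard_normal(T)   # group-2 specific factor
X1 = (np.outer(g, rng.standard_normal(N))
      + np.outer(f1, rng.standard_normal(N))
      + 0.5 * rng.standard_normal((T, N)))
X2 = (np.outer(g, rng.standard_normal(N))
      + np.outer(f2, rng.standard_normal(N))
      + 0.5 * rng.standard_normal((T, N)))

def pc_factors(X, k):
    """First k principal-component factor estimates (T x k, orthonormal)."""
    X = X - X.mean(axis=0)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

F1, F2 = pc_factors(X1, 2), pc_factors(X2, 2)

# Canonical correlations between the two estimated factor spaces:
# since F1 and F2 have orthonormal columns, these are the singular
# values of F1'F2. A value near 1 flags a common factor.
s = np.linalg.svd(F1.T @ F2, compute_uv=False)
print(np.round(s, 3))
```

With one genuinely common factor, the largest canonical correlation is close to one and the second is small, which is the pattern the paper's test statistic formalizes (with the boundary issue arising exactly at unit canonical correlations).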
Measuring Group Differences in High-Dimensional Choices: Method and Application to Congressional Speech
Matthew Gentzkow, Jesse M. Shapiro, Matt Taddy
We study the problem of measuring group differences in choices when the dimensionality of the choice set is large. We show that standard approaches suffer from a severe finite‐sample bias, and we propose an estimator that applies recent advances in machine learning to address this bias. We apply this method to measure trends in the partisanship of congressional speech from 1873 to 2016, defining partisanship to be the ease with which an observer could infer a congressperson's party from a single utterance. Our estimates imply that partisanship is far greater in recent years than in the past, and that it increased sharply in the early 1990s after remaining low and relatively constant over the preceding century.
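The finite-sample bias referred to above can be seen in a toy plug-in calculation: even when both parties draw phrases from the identical distribution, so true partisanship (the posterior an observer places on the speaker's party) is 0.5, the naive estimate exceeds 0.5 mechanically. A hypothetical simulation, not the authors' corrected estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
V, N = 1000, 5000  # vocabulary size, phrases sampled per party

# Both parties draw from the SAME uniform phrase distribution,
# so true partisanship is exactly 0.5.
p = np.full(V, 1.0 / V)
q_R = rng.multinomial(N, p) / N  # empirical phrase frequencies, party R
q_D = rng.multinomial(N, p) / N  # empirical phrase frequencies, party D

# Plug-in posterior that phrase j was spoken by party R (prior 1/2).
tot = q_R + q_D
mask = tot > 0
rho = q_R[mask] / tot[mask]

# Plug-in partisanship: expected posterior on the speaker's true party.
pi_hat = 0.5 * np.sum(q_R[mask] * rho) + 0.5 * np.sum(q_D[mask] * (1 - rho))
print(round(pi_hat, 3))  # noticeably above the true value 0.5
```

Algebraically the plug-in estimate is never below 0.5 and is strictly above it under any sampling noise, which is the severe finite-sample bias the machine-learning estimator is designed to remove.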
Many decision situations involve two or more of the following divergences from subjective expected utility: imprecision of beliefs (or ambiguity), imprecision of tastes (or multi‐utility), and state dependence of utility. This paper proposes and characterizes a model of uncertainty averse preferences that can simultaneously incorporate all three phenomena. The representation supports a principled separation of (imprecise) beliefs and (potentially state‐dependent, imprecise) tastes. Moreover, the representation permits comparative statics separating the roles of beliefs and tastes, and is modular: it easily delivers special cases involving various combinations of the phenomena, as well as state‐dependent multi‐utility generalizations covering popular ambiguity models.
Equivalence of Stochastic and Deterministic Mechanisms
Yi‐Chun Chen, Wei He, Jiangtao Li, Yeneng Sun
We consider a general social choice environment that has multiple agents, a finite set of alternatives, independent types, and atomless type distribution. We show that for any Bayesian incentive compatible mechanism, there exists an equivalent deterministic mechanism that (1) is Bayesian incentive compatible; (2) delivers the same interim expected allocation probabilities and the same interim expected utilities for all agents; and (3) delivers the same ex ante expected social surplus. This result holds in settings with a rich class of utility functions, multidimensional types, interdependent valuations, and in settings without monetary transfers. To prove our result, we develop a novel methodology of mutual purification, and establish its link with the mechanism design literature.
Strong Duality in Monopoly Pricing
Andreas Kleiner, Alejandro Manelli
The main result in Daskalakis, Deckelbaum, and Tzamos (2017) establishes strong duality in the monopoly problem with an argument based on transportation theory. We provide a short, alternative proof using linear programming.
Confidence Intervals for Projections of Partially Identified Parameters
Hiroaki Kaido, Francesca Molinari, Jörg Stoye
We propose a bootstrap‐based calibrated projection procedure to build confidence intervals for single components and for smooth functions of a partially identified parameter vector in moment (in)equality models. The method controls asymptotic coverage uniformly over a large class of data generating processes. The extreme points of the calibrated projection confidence interval are obtained by extremizing the value of the function of interest subject to a proper relaxation of studentized sample analogs of the moment (in)equality conditions. The degree of relaxation, or critical level, is calibrated so that the function of θ, not θ itself, is uniformly asymptotically covered with prespecified probability. This calibration is based on repeatedly checking feasibility of linear programming problems, rendering it computationally attractive.
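The extremization step described above can be sketched with a toy linear program: the interval endpoints come from minimizing and maximizing the coordinate of interest subject to relaxed inequality constraints. Here the constraint values are made up for illustration, and a fixed relaxation `kappa` stands in for the bootstrap-calibrated critical level:

```python
import numpy as np
from scipy.optimize import linprog

# Toy identified set for theta in R^2: {theta : A @ theta <= b}.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0, 1.5])

kappa = 0.1  # stand-in for the calibrated critical level

def projection_interval(A, b, kappa):
    """Extremize theta_1 subject to the relaxed inequalities."""
    c = np.array([1.0, 0.0])  # objective: the first coordinate
    bounds = [(None, None)] * 2  # theta is unrestricted in sign
    lo = linprog(c, A_ub=A, b_ub=b + kappa, bounds=bounds)
    hi = linprog(-c, A_ub=A, b_ub=b + kappa, bounds=bounds)
    return lo.fun, -hi.fun

lower, upper = projection_interval(A, b, kappa)
print(lower, upper)
```

In the paper the relaxation is calibrated, via bootstrap feasibility checks of such linear programs, so that the projection, rather than the full vector, is covered at the prespecified probability.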