ADVANCES IN ECONOMETRICS

Series Editors: Thomas B. Fomby and R. Carter Hill

Recent Volumes:

Volume 15: Nonstationary Panels, Panel Cointegration, and Dynamic Panels, Edited by Badi Baltagi

Volume 16: Econometric Models in Marketing, Edited by P. H. Franses and A. L. Montgomery

Volume 17: Maximum Likelihood Estimation of Misspecified Models: Twenty Years Later, Edited by Thomas B. Fomby and R. Carter Hill

Volume 18: Spatial and Spatiotemporal Econometrics, Edited by J. P. LeSage and R. Kelley Pace

Volume 19: Applications of Artificial Intelligence in Finance and Economics, Edited by J. M. Binner, G. Kendall and S. H. Chen

Volume 20A: Econometric Analysis of Financial and Economic Time Series, Edited by Dek Terrell and Thomas B. Fomby

Volume 20B: Econometric Analysis of Financial and Economic Time Series, Edited by Thomas B. Fomby and Dek Terrell

Volume 21: Modelling and Evaluating Treatment Effects in Econometrics, Edited by Daniel L. Millimet, Jeffrey A. Smith and Edward J. Vytlacil

Volume 22: Econometrics and Risk Management, Edited by Thomas B. Fomby, Knut Solna and Jean-Pierre Fouque

ADVANCES IN ECONOMETRICS  VOLUME 23

BAYESIAN ECONOMETRICS

EDITED BY

SIDDHARTHA CHIB
Olin Business School, Washington University

WILLIAM GRIFFITHS
Department of Economics, University of Melbourne

GARY KOOP
Department of Economics, University of Strathclyde

DEK TERRELL
Department of Economics, Louisiana State University

United Kingdom – North America – Japan
India – Malaysia – China

JAI Press is an imprint of Emerald Group Publishing Limited
Howard House, Wagon Lane, Bingley BD16 1WA, UK

First edition 2008

Copyright © 2008 Emerald Group Publishing Limited

Reprints and permission service
Contact: booksandseries@emeraldinsight.com

No part of this book may be reproduced, stored in a retrieval system, transmitted in any form or by any means electronic, mechanical, photocopying, recording or otherwise without either the prior written permission of the publisher or a licence permitting restricted copying issued in the UK by The Copyright Licensing Agency and in the USA by The Copyright Clearance Center. No responsibility is accepted for the accuracy of information contained in the text, illustrations or advertisements. The opinions expressed in these chapters are not necessarily those of the Editor or the publisher.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

ISBN: 978-1-84855-308-8
ISSN: 0731-9053 (Series)

Awarded in recognition of Emerald's production department's adherence to quality systems and processes when preparing scholarly journals for print

LIST OF CONTRIBUTORS

Michael K. Andersson – Sveriges Riksbank, Stockholm, Sweden

Veni Arakelian – Department of Economics, University of Crete, Rethymno, Greece

Chun-man Chan – Hong Kong Community College, Kowloon, Hong Kong, China

Cathy W. S. Chen – Department of Statistics, Feng Chia University, Taiwan

Siddhartha Chib – Olin Business School, Washington University, St. Louis, MO

S. T. Boris Choy – Discipline of Operations Management and Econometrics, University of Sydney, NSW, Australia

Michiel de Pooter – Division of International Finance, Financial Markets, Board of Governors of the Federal Reserve System, Washington, DC

Dipak K. Dey – Department of Statistics, University of Connecticut, Storrs, CT

Deborah Gefang – Department of Economics, University of Leicester, Leicester, UK

Richard Gerlach – Discipline of Operations Management and Econometrics, University of Sydney, NSW, Australia

Paolo Giordani – Research Department, Sveriges Riksbank, Stockholm, Sweden

Jennifer Graves – Department of Economics, University of California, Irvine, CA


William Griffiths – Department of Economics, University of Melbourne, Vic., Australia

Ariun Ishdorj – Department of Economics, Iowa State University, Ames, IA

Liana Jacobi – Department of Economics, University of Melbourne, Vic., Australia

Ivan Jeliazkov – Department of Economics, University of California, Irvine, CA

Helen H. Jensen – Department of Economics, Iowa State University, Ames, IA

Sune Karlsson – Swedish Business School, Örebro University, Örebro, Sweden

Robert Kohn – Department of Economics, Australian School of Business, University of New South Wales, Sydney, Australia

Gary Koop – Department of Economics, University of Strathclyde, Glasgow, UK

Dimitris Korobilis – Department of Economics, University of Strathclyde, Glasgow, UK

Subal C. Kumbhakar – Department of Economics, State University of New York, Binghamton, NY

Mark Kutzbach – Department of Economics, University of California, Irvine, CA

Roberto Leon-Gonzalez – National Graduate Institute for Policy Studies (GRIPS), Tokyo, Japan

Brahim Lgui – Département de Sciences Économiques, Université de Montréal, CIREQ, Canada

Arto Luoma – Department of Mathematics and Statistics, University of Tampere, Tampere, Finland


Jani Luoto – School of Business and Economics, University of Jyväskylä, Jyväskylä, Finland

William J. McCausland – Département de Sciences Économiques, Université de Montréal, CIREQ and CIRANO, Montréal, QC, Canada

Nadine McCloud – Department of Economics, The University of the West Indies, Mona, Kingston, Jamaica

Murat K. Munkin – Department of Economics, University of South Florida, Tampa, FL

Christopher J. O'Donnell – School of Economics, University of Queensland, Brisbane, Australia

Francesco Ravazzolo – Norges Bank, Oslo, Norway

Vanessa Rayner – School of Economics, University of Queensland, Brisbane, Australia

Rene Segers – Tinbergen Institute and Econometric Institute, Erasmus University Rotterdam, Rotterdam, The Netherlands

Mike K. P. So – Department of ISOM, Hong Kong University of Science and Technology, Kowloon, Hong Kong

Rodney Strachan – School of Economics, The University of Queensland, Brisbane, Australia

Sylvie Tchumtchoua – Department of Statistics, University of Connecticut, Storrs, CT

Dek Terrell – Department of Economics, Louisiana State University, Baton Rouge, LA

Justin Tobias – Department of Economics, Purdue University, West Lafayette, IN

Pravin K. Trivedi – Department of Economics, Wylie Hall, Indiana University, Bloomington, IN


Efthymios G. Tsionas – Department of Economics, Athens University of Economics and Business, Athens, Greece

Herman K. van Dijk – Tinbergen Institute and Econometric Institute, Erasmus University Rotterdam, Rotterdam, The Netherlands

Wai-yin Wan – School of Mathematics and Statistics, University of Sydney, NSW, Australia

Arnold Zellner – Graduate School of Business, University of Chicago, Chicago, IL

BAYESIAN ECONOMETRICS: AN INTRODUCTION

Siddhartha Chib, William Griffiths, Gary Koop and Dek Terrell

ABSTRACT

Bayesian Econometrics is a volume in the series Advances in Econometrics that illustrates the scope and diversity of modern Bayesian econometric applications, reviews some recent advances in Bayesian econometrics, and highlights many of the characteristics of Bayesian inference and computations. This first paper in the volume is the Editors' introduction, in which we summarize the contributions of each of the papers.

1. INTRODUCTION

In 1996 two volumes of Advances in Econometrics were devoted to Bayesian econometrics: one on computational methods and applications, the other on time-series applications. This was a time when Markov chain Monte Carlo (MCMC) techniques, which have revolutionized applications of Bayesian econometrics, had started to take hold. The adaptability of MCMC to problems previously considered too difficult was generating a revival of interest in the Bayesian paradigm. Now, 12 years later, it is time for another Advances volume on Bayesian econometrics. Use of Bayesian techniques has become widespread across all areas of empirical economics. Previously intractable problems are being solved and more flexible models are being introduced. The purpose of this volume is to illustrate today's scope and diversity of Bayesian econometric applications, to review some of the recent advances, and to highlight various aspects of Bayesian inference and computations.

Bayesian Econometrics
Advances in Econometrics, Volume 23, 3–9
Copyright © 2008 by Emerald Group Publishing Limited
All rights of reproduction in any form reserved
ISSN: 0731-9053/doi:10.1016/S0731-9053(08)23021-5

The book is divided into three parts. In addition to this introduction, Part I contains papers by Arnold Zellner, and by Paolo Giordani and Robert Kohn.

In his paper "Bayesian Econometrics: Past, Present, and Future," Arnold Zellner reviews problems faced by the Federal Reserve System, as described by its former chairman, Alan Greenspan, and links these problems to a summary of past and current Bayesian activity. Some key contributions to the development of Bayesian econometrics are highlighted. Future research directions are discussed with a view to improving current econometric models, methods, and applications of them.

The other paper in Part I is a general one on a computational strategy for improving MCMC. Under the title "Bayesian Inference using Adaptive Sampling," Paolo Giordani and Robert Kohn discuss simulation-based Bayesian inference methods that draw on information from previous samples to build the proposal distributions in a given family of distributions. The article covers approaches along these lines and the intuition behind some of the theory for proving that the procedures work. They also discuss strategies for making adaptive sampling more effective and provide illustrations for variable selection in the linear regression model and for time-series models subject to interventions.
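As a minimal sketch of the general idea (not the authors' algorithm), the following adapts the scale of a random-walk Metropolis proposal using the chain's own acceptance history; the standard normal target, the Robbins-Monro step size, and the 0.44 target acceptance rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(x):
    # Toy target: standard normal log-density (an illustrative assumption)
    return -0.5 * x * x

x, scale = 0.0, 5.0   # deliberately poor initial proposal scale
target_acc = 0.44     # common target acceptance rate for 1-D random walks
samples = []
for t in range(1, 5001):
    prop = x + scale * rng.standard_normal()
    accepted = np.log(rng.random()) < log_post(prop) - log_post(x)
    if accepted:
        x = prop
    # Robbins-Monro style update: nudge the proposal scale toward the target rate
    scale *= np.exp(t ** -0.6 * ((1.0 if accepted else 0.0) - target_acc))
    samples.append(x)
burned = np.array(samples[1000:])
print(round(scale, 2), round(burned.var(), 2))
```

The adapted scale settles near the value that delivers the target acceptance rate, and the post-burn-in draws recover the target's moments.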

2. MICROECONOMETRIC MODELING

Part II of the book, entitled "Microeconometric Modeling," contains applications that use cross-section or panel data. The paper by Murat K. Munkin and Pravin K. Trivedi, "A Bayesian Analysis of the OPES Model with a Nonparametric Component: An Application to Dental Insurance and Dental Care," is a good example of how Bayesian methods are increasingly being used in important empirical work. The empirical focus is on the impact of dental insurance on the use of dental services. Addressing this issue is complicated by the potential endogeneity of insurance uptake and the fact that insurance uptake may depend on explanatory variables in a nonlinear fashion. The authors develop an appropriate model which addresses both these issues and carry out an empirical analysis which finds strong evidence that having dental insurance encourages use of dentists, but also of adverse selection into the insured state.

MCMC simulation techniques are particularly powerful in discrete-data models with latent variable representations. In their paper "Fitting and Comparison of Models for Multivariate Ordinal Outcomes," Ivan Jeliazkov, Jennifer Graves, and Mark Kutzbach review several alternative modeling and identification schemes for ordinal data models and evaluate how each aids or hampers estimation using MCMC. For each parameterization, they consider model comparison via marginal likelihoods and an analysis of the effects of covariates on category probabilities. The methods are applied to examples in educational attainment, voter opinions, and consumers' reliance on alternative sources of medical information.

In "Intra-Household Allocation and Consumption of WIC-Approved Foods: A Bayesian Approach," Ariun Ishdorj, Helen H. Jensen, and Justin Tobias consider the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) that aims to provide food, nutrition education, and other services to at-risk, low-income children and pregnant, breastfeeding, and postpartum women. They assess the extent to which the WIC program improves the nutritional outcomes of WIC families as a whole, including the targeted and nontargeted individuals within the household. This question is considered under the possibility that participation in the program (which is voluntary) is endogenous. They develop an appropriate treatment–response model and conclude that WIC participation does not lead to increased levels of calcium intake from milk.

A second paper that illustrates the use of Bayesian techniques for analyzing treatment–response problems is that by Siddhartha Chib and Liana Jacobi. In their paper "Causal Effects from Panel Data in Randomized Experiments with Partial Compliance," the authors describe how to calculate the causal impacts from a training program when noncompliance exists in the training arm. Two primary models are considered, with one model including a random effects specification. Prior elicitation is carefully done by simulating from a prior predictive density on outcomes, using a hold-out sample. Estimation and model comparison are considered in detail. The methods are employed to assess the impact of a job training program on mental health scores.

Basic equilibrium job search models often yield wage densities that do not accord well with empirical regularities. When extensions to basic models are made and analyzed using kernel-smoothed nonparametric forms, it is difficult to assess these extensions via model comparisons. In "Parametric and Nonparametric Inference in Equilibrium Job Search Models," Gary Koop develops Bayesian parametric and nonparametric methods that are comparable to those in the existing non-Bayesian literature. He then shows how Bayesian methods can be used to compare the different parametric and nonparametric equilibrium search models in a statistically rigorous sense.

In the paper "Do Subsidies Drive Productivity? A Cross-Country Analysis of Nordic Dairy Farms," Nadine McCloud and Subal C. Kumbhakar develop a Bayesian hierarchical model of farm production which allows for the calculation of input productivity, efficiency, and technical change. The key research questions relate to whether and how these are influenced by subsidies. Using a large panel of Nordic dairy farms, they find that subsidies drive productivity through technical efficiency and input elasticities, although the magnitude of these effects differs across countries.

The richness of available data and the scope for building flexible models make marketing a popular area for Bayesian applications. In "Semiparametric Bayesian Estimation of Random Coefficients Discrete Choice Models," Sylvie Tchumtchoua and Dipak K. Dey propose a semiparametric Bayesian framework for the analysis of random coefficients discrete choice models that can be applied to both individual and aggregate data. Heterogeneity is modeled using a Dirichlet process prior which (importantly) varies with consumer characteristics through covariates. The authors employ an MCMC algorithm for fitting their model, and illustrate the methodology using a household-level panel dataset of peanut butter purchases, and supermarket chain-level data for 31 ready-to-eat breakfast cereal brands.

When diffuse priors are used to estimate simultaneous equation models, the resulting posterior density can possess infinite asymptotes at points of local nonidentification. Kleibergen and Zivot (2003) introduced a prior to overcome this problem in the context of a restricted reduced form specification, and investigated the relationship between the resulting Bayesian estimators and their classical counterparts. Arto Luoma and Jani Luoto, in their paper "Bayesian Two-Stage Regression with Parametric Heteroscedasticity," extend the analysis of Kleibergen and Zivot to a simultaneous equation model with unequal error variances. They apply their techniques to a cross-country Cobb–Douglas production function.

3. TIME-SERIES MODELING

Part III of the volume is devoted to models and applications that use time-series data. The first paper in this part is "Bayesian Near-Boundary Analysis in Basic Macroeconomic Time-Series Models" by Michiel D. de Pooter, Francesco Ravazzolo, Rene Segers, and Herman K. van Dijk. The boundary issues considered by these authors are similar to those encountered by Arto Luoma and Jani Luoto in their paper. There are a number of models where the use of particular types of noninformative priors can lead to improper posterior densities, with estimation breaking down at boundary values of parameters. The circumstances under which such problems arise, and how the problems can be solved using regularizing or truncated priors, are examined in detail by de Pooter et al. in the context of dynamic linear regression models, autoregressive and error correction models, instrumental variable models, variance component models, and state space models. Analytical, graphical, and empirical results using U.S. macroeconomic data are presented.

In his paper "Forecasting in Vector Autoregressions with Many Predictors," Dimitris Korobilis introduces Bayesian model selection methods in a VAR setting, focusing on the problem of drawing inferences from a dataset with a very large number of potential predictors. A stochastic search variable selection algorithm is used to implement Bayesian model selection. An empirical application using 124 potential predictors to forecast eight U.S. macroeconomic variables is included to demonstrate the methodology. Results indicate an improvement in forecasting accuracy over model selection based on the Bayesian information criterion.
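Stochastic search variable selection can be sketched in a single-equation setting (a VAR applies the same device equation by equation). The spike/slab variances, the prior inclusion probability, the known error variance, and the simulated data below are all illustrative assumptions in the spirit of George and McCulloch's formulation, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def norm_pdf(x, var):
    return np.exp(-0.5 * x * x / var) / np.sqrt(2 * np.pi * var)

# Simulated data: only the first two of six predictors matter (an assumption for the demo)
n, p = 200, 6
X = rng.standard_normal((n, p))
y = X @ np.array([2.0, -1.5, 0, 0, 0, 0]) + rng.standard_normal(n)

sigma2 = 1.0              # error variance treated as known for simplicity
tau0, tau1 = 0.01, 10.0   # "spike" and "slab" prior variances
pi_incl = 0.5             # prior inclusion probability

gamma = np.ones(p, dtype=int)
incl = np.zeros(p)
draws = 1000
for _ in range(draws):
    # beta | gamma, y is Gaussian: excluded coefficients are shrunk via the spike
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1, tau0))
    V = np.linalg.inv(X.T @ X / sigma2 + D_inv)
    beta = rng.multivariate_normal(V @ X.T @ y / sigma2, V)
    # gamma_j | beta_j compares spike and slab densities at the current draw
    for j in range(p):
        slab = pi_incl * norm_pdf(beta[j], tau1)
        spike = (1 - pi_incl) * norm_pdf(beta[j], tau0)
        gamma[j] = rng.random() < slab / (slab + spike)
    incl += gamma
incl_prob = incl / draws
print(incl_prob.round(2))  # high inclusion frequency for the two true predictors
```

The Gibbs chain visits subsets of predictors in proportion to their posterior support, so the averaged indicators estimate posterior inclusion probabilities.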

In "Bayesian Inference in a Cointegrating Panel Data Model," Gary Koop, Robert Leon-Gonzalez, and Rodney Strachan focus on cointegration in the context of a cointegrating panel data model. Their approach allows both short-run dynamics and the cointegrating rank to vary across cross-sectional units. In addition to an uninformative prior, they propose an informative prior with "soft homogeneity" restrictions. This informative prior can be used to include information from economic theory that cross-sectional units are likely to share the same cointegrating rank, without forcing that assumption on the data. Empirical applications using simulated data and a long-run model for bilateral exchange rates are used to demonstrate the methodology.

Cointegration is also considered by Deborah Gefang, who develops tests of purchasing power parity (PPP) within an exponential smooth transition vector error correction model (ESVECM) framework. The Bayesian approach offers a substantial methodological advantage in this application because the Gibbs sampling scheme is not affected by the multi-mode problem created by nuisance parameters. Results based on Bayesian model averaging and Bayesian model selection find evidence that PPP holds between the United States and each of the remaining G7 countries.

"Bayesian Forecast Combination for VAR Models" by Michael K. Andersson and Sune Karlsson addresses the issue of how to forecast a variable (or variables) of interest (e.g., GDP) when there is uncertainty about the dimension of the VAR and uncertainty about which set of explanatory variables should be used. This uncertainty leads to a huge set of models. The authors do model averaging over the resulting high-dimensional model space using predictive likelihoods as weights. For forecast horizons greater than one, the predictive likelihoods will not have analytical forms, and the authors develop a simulation method for estimating them. An empirical analysis involving U.S. GDP shows the benefits of their approach.
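Combination weights built from predictive likelihoods can be computed as below; the three log predictive likelihoods are hypothetical numbers, and subtracting the maximum before exponentiating is a standard numerical-stability device rather than anything specific to the paper.

```python
import numpy as np

# Hypothetical log predictive likelihoods of three competing VARs on a hold-out period
log_pred = np.array([-134.2, -131.7, -140.5])

# Weights proportional to the predictive likelihoods; subtract the max
# before exponentiating to avoid underflow
w = np.exp(log_pred - log_pred.max())
w /= w.sum()
print(w.round(3))  # the second model dominates the combination
```

The resulting weights are then applied to each model's forecast to form the combined forecast.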

In "Bayesian Inference on Time-Varying Proportions," William J. McCausland and Brahim Lgui derive a highly efficient algorithm for simulating the states in state space models where the dependent variables are proportions. The authors argue in favor of a model which is parameterized such that the measurement equation has the proportions (conditional on the states) following a Dirichlet distribution, but the state equation is a standard linear Gaussian one. The authors develop a Metropolis–Hastings algorithm which draws states as a block from a multivariate Gaussian proposal distribution. Extensive empirical evidence indicates that their approach works well and, in particular, is very efficient.

Christopher J. O'Donnell and Vanessa Rayner use Bayesian methodology to impose inequality restrictions on ARCH and GARCH models in their paper "Imposing Stationarity Constraints on the Parameters of ARCH and GARCH Models." Bayesian model averaging is used to resolve uncertainty with regard to model selection. The authors apply the methodology to data from the London Metals Exchange and find that results are generally insensitive to the imposition of inequality restrictions.
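One generic way to impose such an inequality restriction in a simulation setting (a standard device, not necessarily the authors' procedure) is simply to discard draws that violate it; the unrestricted GARCH(1,1) "draws" below are uniforms purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for unrestricted posterior draws of GARCH(1,1) parameters (alpha, beta)
draws = rng.uniform(0.0, 1.0, size=(10_000, 2))

# Impose the covariance-stationarity constraint alpha + beta < 1 by rejection
keep = draws[:, 0] + draws[:, 1] < 1.0
constrained = draws[keep]
frac = constrained.shape[0] / draws.shape[0]
print(round(frac, 3))  # about half of these uniform draws survive
```

The retained draws are a sample from the restricted distribution, so posterior summaries computed from them automatically respect the constraint.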

In "Bayesian Model Selection for Heteroskedastic Models," Cathy W. S. Chen, Richard Gerlach, and Mike K. P. So discuss Bayesian model selection for a wide variety of financial volatility models that exhibit asymmetries (e.g., threshold GARCH models). Model selection problems are complicated by the fact that there are many contending models and marginal likelihood calculation can be difficult. They discuss this problem in an empirical application involving daily data from three Asian stock markets and calculate the empirical support for their competing models.

Using a scale mixture of uniform densities representation of the Student-t density, S. T. Boris Choy, Wai-yin Wan, and Chun-man Chan provide a Bayesian analysis of a Student-t stochastic volatility model in "Bayesian Student-t Stochastic Volatility Models via Scale Mixtures." They develop a Gibbs sampler for their model and show how their approach can be extended to the important class of Student-t stochastic volatility models with leverage. The different models are fit to returns on exchange rates of the Australian dollar against 10 currencies.
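The general idea of scale-mixture representations of the Student-t can be illustrated with the better-known scale-mixture-of-normals variant (the paper itself uses uniforms): mixing a normal over a gamma-distributed precision yields exact t draws. The degrees of freedom and sample size below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(3)

nu, n = 10.0, 400_000
# lambda ~ Gamma(nu/2, rate = nu/2); numpy's gamma takes a scale parameter, so scale = 2/nu
lam = rng.gamma(nu / 2.0, 2.0 / nu, size=n)
t = rng.standard_normal(n) / np.sqrt(lam)  # z / sqrt(lambda) is Student-t with nu d.o.f.
print(round(t.var(), 3))  # should be near nu / (nu - 2) = 1.25
```

Conditioning on the mixing variables turns the t likelihood into a Gaussian one, which is what makes Gibbs sampling in such models tractable.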

In "Bayesian Analysis of the Consumption CAPM," Veni Arakelian and Efthymios G. Tsionas show that Labadie's (1989) solution to the CAPM can be applied to obtain a closed-form solution and to provide a traditional econometric interpretation. They then apply Bayesian inference to both simulated data and the Mehra and Prescott (1985) dataset. Results generally conform to theory, but also reveal asymmetric marginal densities for key parameters. The asymmetry suggests that techniques such as generalized method of moments, which rely on asymptotic approximations, may be unreliable.

REFERENCES

Kleibergen, F., & Zivot, E. (2003). Bayesian and classical approaches to instrumental variable regression. Journal of Econometrics, 114, 29–72.

Labadie, P. (1989). Stochastic inflation and the equity premium. Journal of Monetary Economics, 24, 195–205.

Mehra, R., & Prescott, E. C. (1985). The equity premium: A puzzle. Journal of Monetary Economics, 15, 145–162.

BAYESIAN ECONOMETRICS: PAST, PRESENT, AND FUTURE

Arnold Zellner

ABSTRACT

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

1. INTRODUCTION

In the first two sentences of her paper, "Bayesian Econometrics, The First Twenty Years," Qin (1996) wrote, "Bayesian econometrics has been a controversial area in the development of econometric methodology. Although the Bayesian approach has been constantly dismissed by many mainstream econometricians for its subjectivism, Bayesian methods have been adopted widely in current econometric research" (p. 500). This was written more than 10 years ago. Now more mainstream econometricians and many others have adopted the Bayesian approach and are using it to solve a broad range of econometric problems, in line with my forecast in Zellner (1974): "Further, it must be recognized that the B approach is in a stage of rapid development with work going ahead on many new problems and applications. While this is recognized, it does not seem overly risky to conclude that the B approach, which already has had some impact on econometric work, will have a much more powerful influence in the next few years" (p. 54).

Bayesian Econometrics
Advances in Econometrics, Volume 23, 11–60
Copyright © 2008 by Emerald Group Publishing Limited
All rights of reproduction in any form reserved
ISSN: 0731-9053/doi:10.1016/S0731-9053(08)23001-X

See also Zellner (1981, 1988b, 1991, 2006) for more on the past, present, and future of Bayesian econometrics, in which it is emphasized that all econometricians use and misuse prior information, subjectively, objectively, or otherwise. And it has been pointed out that Bayesian econometricians learn using an explicit model, Bayes' theorem, that allows prior information to be employed in a formal and reproducible manner, whereas non-Bayesian econometricians learn in an informal, subjective manner. For empirical evidence on the rapid growth of Bayesian publications over the years in economics and other fields, which will be discussed below, see Poirier (1989, 1992, 2004); see Poirier (1991) for an interesting set of Bayesian empirical papers dealing with problems in economics and finance.

In the early 1990s, both the International Society for Bayesian Analysis (http://www.bayesian.org) and the Section on Bayesian Statistical Science of the American Statistical Association (http://www.amstat.org) were formed and have been very active and successful in encouraging the growth of Bayesian theoretical and applied research and publications. Similarly, the NBER-NSF Seminar on Bayesian Inference in Econometrics and Statistics (SBIES), which commenced operation in 1970, has been effective for many years in sponsoring research meetings, publishing a number of Bayesian books, and actively supporting the creation of ISBA and SBSS in the early 1990s. In Berry, Chaloner, and Geweke (1996), some history of the SBIES and a large number of Bayesian research papers are presented. Also, under the current leadership of Sid Chib, very productive meetings of this seminar in 2004 and 2005 have been held that were organized by him and John Geweke. In August 2006, the European–Japanese Bayesian Workshop held a meeting in Vienna organized by Wolfgang Polasek that had a very interesting program. In 2005, the Indian Bayesian Society and the Indian Bayesian Chapter of ISBA had an international Bayesian meeting at Varanasi, with many of the papers presented having appeared in a conference volume. In September 2006, a Bayesian research meeting was held at the Royal Bank of Sweden, organized by Mattias Villani, that attracted leading Bayesian econometricians from all over the world to present reports on their current work on Bayesian econometric methodology. And now, this Advances in Econometrics volume features additional valuable Bayesian econometric research. And last, but not least, the International Society for Bayesian Analysis has commenced publication of an online Bayesian journal called Bayesian Analysis; see http://www.bayesian.org for more information about this journal, with R. Kass the founding editor, and listings of articles for several years that are downloadable. These and many more Bayesian activities that have taken place over the years attest to the growth and vitality of Bayesian analysis in many sciences, industries, and governments worldwide.

1.1. An Example of Bayesian Monetary Policy-Making

As an example of extremely important work involving the use of Bayesian methodology and analysis, Alan Greenspan, former Chairman of the U.S. Federal Reserve System, presented an invited paper, "Risk and Uncertainty in Monetary Policy," at the 2004 Meeting of the American Economic Association that was published in the American Economic Review in 2004, along with very knowledgeable discussion by Martin Feldstein, Harvard Professor of Economics and President of the National Bureau of Economic Research, Mervyn King of the Bank of England, and Professor Janet L. Yellen of the Haas School of Business, University of California, Berkeley. The paper is notable in that it presents a comprehensive description of the ways in which he approached and solved monetary policy problems " . . . from the perspective of someone who has been in the policy trenches" (p. 33).

Greenspan's account should be of interest to Bayesian econometricians and many others since he states, "In essence, the risk management approach to policymaking is an application of Bayesian decision-making" (p. 37). In addition, he writes, "Our problem is not, as is sometimes alleged, the complexity of our policy-making process, but the far greater complexity of a world economy whose underlying linkages appear to be continuously evolving. Our response to that continuous evolution has been disciplined by the Bayesian type of decision-making in which we have been engaged" (p. 39).

Feldstein (2004), after providing an excellent review of Greenspan's successful policy-making in the past, wrote, "Chairman Greenspan emphasized that dealing with uncertainty is the essence of making monetary policy (see also Feldstein, 2002). The key to what he called the risk-management approach to monetary policy is the Bayesian theory of decision-making" (p. 42). After providing a brief, knowledgeable description of Bayesian decision theory, Feldstein provides the following example to illustrate a case of asymmetric loss in connection with a person deciding whether to carry an umbrella when the probability of rain is not high. "If he carries the umbrella and it does not rain, he is mildly inconvenienced. But if he does not carry the umbrella and it rains, he will suffer getting wet. A good Bayesian finds himself carrying an umbrella on many days when it does not rain. The policy actions of the past year were very much in this spirit. The Fed cut the interest rate to 1 percent to prevent the low-probability outcome of spiraling deflation because it regarded that outcome as potentially very damaging while the alternative possible outcome of a rise of the inflation rate from 1.5 percent to 2.5 percent was deemed less damaging and more easily reversed" (p. 42).
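Feldstein's umbrella example is simply an expected-loss comparison under asymmetric losses; the probability and loss values below are illustrative assumptions, not numbers from his discussion.

```python
# Asymmetric-loss decision in the spirit of Feldstein's umbrella example
# (all numbers are illustrative assumptions)
p_rain = 0.3
loss_carry_no_rain = 1.0   # mild inconvenience of carrying needlessly
loss_wet = 10.0            # getting soaked without an umbrella

exp_loss_carry = (1 - p_rain) * loss_carry_no_rain
exp_loss_leave = p_rain * loss_wet
decision = "carry" if exp_loss_carry < exp_loss_leave else "leave"
print(decision)  # carrying minimizes expected loss even though rain is unlikely
```

Because the loss from getting wet dwarfs the inconvenience of carrying, the expected-loss-minimizing Bayesian carries the umbrella at rain probabilities well below one-half, which is exactly Feldstein's point.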

Mervyn King of the Bank of England commented knowingly about model

quality and policy-making, ‘‘Greenspan suggests that the risk-management

approach is an application of Bayesian decision-making when there is

uncertainty about the true model of the economy. Policy that is optimal in

one particular model of the economy may not be ‘robust’ across a class of

other models. In fact, it may lead to a very bad outcome should an

alternative model turn out to be true . . . Of course, although such an

approach is sensible, it is still vulnerable to policymakers giving excessive

weight to misleading models of the economy. . . . But, in the end, there is no

escaping the need to make judgments about which models are more plausible

than others’’ (pp. 42–43). These are indeed very thoughtful remarks about

problems of model uncertainty in making policy but do not recognize that

just as with Feldstein’s umbrella example above, a Bayesian analysis can

utilize posterior probabilities associated with alternative models that reﬂect

the quality of past performance. Such posterior probabilities have been
shown to be useful in producing combined forecasts and will probably be
helpful in dealing with model uncertainty in policy-making.
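The combined-forecast idea can be sketched by applying Bayes’ theorem to a set of models. In the illustration below the three models, their marginal likelihoods, and their point forecasts are all hypothetical; the weighting rule, posterior-probability-weighted averaging, is standard Bayesian model averaging.

```python
# Sketch of a posterior-probability-weighted (Bayesian model-averaged)
# forecast. All numbers are hypothetical.

def posterior_probs(priors, marginal_likelihoods):
    """Posterior model probabilities: p(M_i | y) is proportional to
    p(y | M_i) * p(M_i)."""
    joint = [p * m for p, m in zip(priors, marginal_likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

def combined_forecast(weights, forecasts):
    """Average the models' forecasts using their posterior probabilities."""
    return sum(w * f for w, f in zip(weights, forecasts))

priors = [1 / 3, 1 / 3, 1 / 3]   # equal prior odds on three models
marg_lik = [0.2, 0.7, 0.1]       # p(y | M_i): past predictive performance
forecasts = [1.5, 2.5, 4.0]      # each model's forecast, e.g. inflation (%)

weights = posterior_probs(priors, marg_lik)
print([round(w, 2) for w in weights])                  # [0.2, 0.7, 0.1]
print(round(combined_forecast(weights, forecasts), 2)) # 2.45
```

The model that predicted the past best dominates the combination, which is how posterior probabilities let past performance discipline a policy forecast.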

1.2. Greenspan’s Policy-Making Problems

Below, I list and label important problems that Greenspan mentioned in

connection with his successful policy-making over the years that reveal his

deep understanding of both obvious and very subtle problems associated

with model-building, economic analyses, forecasting, and policy-making.

1. Structural changes: For example, ‘‘ . . . increased political support for

stable prices, globalization which unleashed powerful new forces of

competition, and an acceleration of productivity which at least for a time

held down cost pressures’’ (p. 33). ‘‘I believe that we at the Fed, to our

credit, did gradually come to recognize the structural economic changes

that we were living through and accordingly altered our understanding

of the key parameters of the economic system and our policy stance . . . .
But as we lived through it, there was much uncertainty about the
evolving structure of the economy and about the inﬂuence of monetary
policy’’ (p. 33).

2. Forecasting: ‘‘In recognition of the lag in monetary policy’s impact on

economic activity, a preemptive response to the potential for building

inﬂationary pressures was made an important feature of policy. As a

consequence, this approach elevated forecasting to an even more

prominent place in policy deliberations’’ (p. 33).

3. Unintended consequences: ‘‘Perhaps the greatest irony of the past decade

is that the gradually unfolding success against inﬂation may well have

contributed to the stock price bubble of the latter part of the

1990s . . . The sharp rise in stock prices and their subsequent fall were,

thus, an especial challenge to the Federal Reserve’’ (p. 35).

‘‘The notion that a well-timed incremental tightening could have been

calibrated to prevent the late 1990s bubble while preserving economic

stability is almost surely an illusion. Instead of trying to contain a

putative bubble by drastic actions with largely unpredictable consequences, we chose . . . to focus on policies to mitigate the fallout when it

occurs and, hopefully, ease the transition to the next expansion’’ (p. 36).

4. Uncertainty: ‘‘The Federal Reserve’s experiences over the past two

decades make it clear that uncertainty is not just a pervasive feature of

the monetary landscape; it is the deﬁning characteristic of that

landscape. The term ‘‘uncertainty’’ is meant here to encompass both

‘Knightian uncertainty,’ in which the probability distribution of

outcomes is unknown, and ‘risk,’ in which uncertainty of outcomes is

delimited by a known probability distribution. In practice, one is never

quite sure what type of uncertainty one is dealing with in real time, and it

may be best to think of a continuum ranging from well-deﬁned risks to

the truly unknown’’ (pp. 36–37).

5. Risk management: ‘‘As a consequence, the conduct of monetary policy in

the United States has come to involve, at its core, crucial elements of risk

management. This conceptual framework emphasizes understanding as

much as possible the many sources of risk and uncertainty that

policymakers face, quantifying those risks, when possible, and assessing

costs associated with each of the risks. In essence, the risk-management

approach to monetary policymaking is an application of Bayesian

decision-making’’ (p. 37).

6. Objectives: ‘‘This [risk management] framework also entails devising, in

light of those risks, a strategy for policy directed at maximizing the

probabilities of achieving over time our goals of price stability and the
maximum sustainable economic growth that we associate with it’’ (p. 37).

7. Expert opinion: ‘‘In designing strategies to meet our policy objectives, we

have drawn on the work of analysts, both inside and outside the Fed,

who over the past half century have devoted much effort to improving

our understanding of the economy and its monetary transmission

mechanism’’ (p. 37).

8. Model uncertainty: ‘‘A critical result [of efforts to improve our

understanding of the economy and its monetary transmission mechanism] has been the identiﬁcation of a relatively small set of key

relationships that, taken together, provide a useful approximation of

our economy’s dynamics. Such an approximation underlies the statistical

models that we at the Federal Reserve employ to assess the likely

inﬂuence of our policy decisions.

However, despite extensive efforts to capture and quantify what we

perceive as the key macroeconomic relationships, our knowledge about

many of the important linkages is far from complete and, in all likelihood

will always remain so. Every model, no matter how detailed or how well

designed, conceptually and empirically, is a vastly simpliﬁed representation of the world that we experience with all its intricacies on a day-to-day basis’’ (p. 37).

9. Loss structures: ‘‘Given our inevitably incomplete knowledge about key

structural aspects of an ever-changing economy and the sometimes

asymmetric costs or beneﬁts of particular outcomes, a central bank

needs to consider not only the most likely future path for the economy,

but also the distribution of possible outcomes about that path. The

decision-makers then need to reach a judgment about the probabilities,

costs and beneﬁts of the various possible outcomes under alternative

choices for policy’’ (p. 37).

10. Robustness of policy: ‘‘In general, different policies will exhibit different

degrees of robustness with respect to the true underlying structure of the

economy’’ (p. 37).

11. Cost–beneﬁt analysis: ‘‘As this episode illustrates, policy practitioners
operating under a risk-management paradigm may, at times, be led to
undertake actions intended to provide insurance against [low probability]
especially adverse outcomes . . . . The product of a low-probability event
and a potentially severe outcome was judged a more

serious threat to economic performance than the higher inﬂation that

might ensue in the more probable scenario’’ (p. 37).


12. Knightian uncertainty: ‘‘When confronted with uncertainty, especially

Knightian uncertainty, human beings invariably attempt to disengage

from medium- to long-term commitments in favor of safety and

liquidity. Because economies, of necessity, are net long (that is, have net

real assets) attempts to ﬂee these assets causes prices of equity assets to

fall, in some cases dramatically . . . The immediate response on the part

of the central bank to such ﬁnancial implosions must be to inject large

quantities of liquidity . . . ’’ (p. 38).

13. Parameters (ﬁxed- and time-varying): ‘‘The economic world in which we

function is best described by a structure whose parameters are

continuously changing. . . . We often ﬁt simple models [with ﬁxed

parameters] only because we cannot estimate a continuously changing

set of parameters without vastly more observations than are currently

available to us’’ (p. 38).

14. Multiple risks: ‘‘In pursuing a risk-management approach to policy, we

must confront the fact that only a limited number of risks can be

quantiﬁed with any conﬁdence . . . . Policy makers often have to act, or

choose not to act, even though we may not fully understand the full

range of possible outcomes, let alone each possible outcome’s likelihood. As a result, risk management often involves signiﬁcant judgment

as we evaluate the risks of different events and the probability that our

actions will alter those risks’’ (p. 38).

15. Policy rules: ‘‘For such judgment [mentioned above], policymakers have

needed to reach beyond models to broader, though less mathematically

precise, hypotheses about how the world works. For example, inferences

about how market participants and, hence, the economy might respond

to a monetary policy initiative may need to be drawn from evidence

about past behavior during a period only roughly comparable to the

current situation.

Some critics have argued that such an approach to policy is too

undisciplined – judgmental, seemingly discretionary, and difﬁcult to

explain. The Federal Reserve, they conclude, should attempt to be more

formal in its operations by tying its actions solely, or on the weaker

paradigm, largely, to the prescriptions of a simple policy rule. Indeed,

rules that relate the setting of the federal funds rate to the deviations of

output and inﬂation from their respective targets, in some conﬁgurations,

do seem to capture the broad contours of what we did over the past

decade and a half. And the prescriptions of formal rules can, in fact,

serve as helpful adjuncts to policy, as many of the proponents of these

rules have suggested. But at crucial points, like those of our recent policy
history (the stock market crash of 1987, the crises of 1997–1998, and the
events that followed September, 2001), simple rules will be inadequate as
either descriptions or prescriptions for policy. Moreover, such rules
suffer from much of the same ﬁxed-coefﬁcient difﬁculties we have with
our large-scale models’’ (pp. 38–39).

16. Forecasting: ‘‘While all, no doubt, would prefer that it were otherwise,

there is no way to dismiss what has to be obvious to every monetary

policymaker. The success of monetary policy depends importantly on

the quality of forecasting. The ability to gauge risks implies some

judgment about how current economic imbalances will ultimately play

out . . . . Thus, both econometric and qualitative models need to be

continually tested’’ (p. 39).

17. Monetary policy: ‘‘In practice, most central banks, at least those not

bound by an exchange-rate peg, behave in roughly the same way. They

seek price stability as their long term goal and, accounting for the lag in

monetary policy, calibrate the setting of the policy rate accordingly. . . .

All banks ease when economic conditions ease and tighten when

economic conditions tighten, even if in differing degrees, regardless of

whether they are guided by formal or informal inﬂation targets’’ (p. 39).

18. Uncontrolled outcomes and targets: ‘‘Most prominent is the appropriate

role of asset prices in policy. In addition to the narrower issue of

product price stability, asset prices will remain high on the research

agenda of central banks for years to come. . . . There is little dispute

that the prices of stocks, bonds, homes, real estate, and exchange rates

affect GDP. But most central banks have chosen, at least to date, to

view asset prices not as targets of policy, but as economic variables to be

considered through the prism of the policy’s ultimate objective’’ (p. 40).

19. Performance rating: ‘‘We were fortunate . . . to have worked in a

particularly favorable structural and political environment. But we trust

that monetary policy has meaningfully contributed to the impressive

performance of our economy in recent decades’’ (p. 40). Further

evaluation of current monetary policies dealing with the 2007–2008

credit crisis is an important issue.

1.3. Greenspan’s Problems and Econometric Research

It is of interest to relate Greenspan’s problem areas to current and past

Bayesian econometric research. In econometric research, along with other


scientiﬁc research, three main areas of activity have been recognized,

namely, deduction, induction, and reduction, see Jeffreys (1957, 1939 [1998])

and Zellner (1985, pp. 3–10 and 1996, Chapter 1) for discussions of these

topics and references to the huge literature on the deﬁnitions and other

aspects of these research areas. Brieﬂy, deduction involves use of logic and

mathematics to prove propositions given certain assumptions. Induction

involves development and use of measurement, description, estimation,

testing, prediction, and decision-making procedures, while reduction

involves creating new models and methods that are helpful in explaining

the past, predicting as yet unobserved outcomes at various places and/or

times and in solving private and public decision problems.

While much more can be and has been said about deduction, induction,

and reduction, most will agree about the difﬁculty of producing good new or

improved models that work well in explanation, prediction, and decision-making. However, as we improve our understanding of these three areas and

their interrelations in past and current work and engage in more empirical

predictive and other testing of alternative models and methods, testing that

is much needed in evaluation of alternative macroeconomic models, as

emphasized by Christ (1951, 1975), Fair (1992), and many others, more

rapid progress will undoubtedly result.

A categorization of Greenspan’s problems by their nature is shown in

Table 1.

It is seen that many of Greenspan’s problems have a deductive or

theoretical aspect to them but, as recognized in the literature, deduction

alone is inadequate for scientiﬁc work for a variety of reasons, perhaps best

summarized by the old adage, ‘‘Logical proof does not imply complete

certainty of outcomes,’’ as widely appreciated in the philosophical literature

and elsewhere. Perhaps, the most striking aspect of Table 1 is the large

number of entries in category III, reduction. Economic theorists, econometricians, and others have to get busy producing new models and methods that

are effective in helping to solve former Chairman Greenspan’s and now

Chairman Bernanke’s problems. See Hadamard (1945) for the results of a

Table 1. Tabulation of Greenspan’s Problems Listed Above.

Categories        Problem Numbers
(I) Deduction     3, 4, 6, 9, 10, 11, 12, 13, 14, 16, 17, 19
(II) Induction    2, 3, 6, 7, 8, 9, 10, 11, 13, 14, 15, 16, 17, 18, 19
(III) Reduction   1, 4, 8, 12, 13, 15, 16, 18


survey of mathematicians that provides information on how major breakthroughs in mathematics occurred and tips on how to create new theories in

mathematics that may also be helpful in reductive econometric work as

discussed in Zellner (1985, pp. 8–10). Also, in Zellner and Palm (2004) some

methods for creating new econometric models and checking old econometric

models and applications of them by a number of researchers are presented

that may be helpful in the production of new econometric models that

perform well in explanation, prediction, and policy-making. More will be

said about this reductive problem area below.

1.4. Overview of Paper

With this by way of an introduction to some current problems facing us,
in Sections 2 and 3 we shall review some early and recent work in Bayesian
econometrics and relate it to some of the problems mentioned by Chairman
Greenspan. In Section 4 we consider possible future developments in
Bayesian econometrics.

2. THE PAST

2.1. Early Bayesian Econometrics

As is the case with many others who commenced study of econometrics in

the 1950s, in my graduate econometrics courses at the University of

California at Berkeley there was no mention of Bayesian topics except in a

game theory course that I took with David Blackwell (who many years later

introduced an elementary Bayesian statistics course at Berkeley using Berry’s

(1996) text). Also, there was no mention of Bayesian analysis in Tintner’s

(1952) popular text or in most Cowles Commission publications. Although,

in Klein’s Textbook of Econometrics (1953, p. 62) some discussion of

Bayesian decision theory along with a reservation about prior distributions

appeared that he apparently abandoned later in an invited paper, ‘‘Whither

Econometrics?’’ published in JASA in which Klein (1971) wrote, ‘‘Bayesian

methods attempt to treat a priori information in a systematic way. As a pure

and passive forecaster of econometric methodology I can see a great deal of

future research effort being channeled in that direction. Systematic ways of

introducing a priori information are to be desired’’ (p. 420). Also Theil’s

(1978) econometrics text included a chapter titled, ‘‘Bayesian Inference and


Rational Random Behavior’’ in which he explained Bayes’ theorem and

provided some interesting applications of it. However, he expressed strong

reservations about improper prior distributions and also wrote, ‘‘The

Bayesian approach is itself a matter of considerable controversy. This is not

surprising, given that the approach takes a fundamentally different view of

the nature of the parameters by treating them as random variables’’ (p. 254).

There is no question but that Klein’s forecast regarding future

econometric methodology, presented above, has been quite accurate. Much

past and current Bayesian research is indeed focused on how to formulate

and use prior distributions and models that incorporate ‘‘a priori

information’’ in analyses of a wide range of estimation, prediction, and

control problems with applications in many ﬁelds using ﬁxed and random

parameter models. See the early papers by Drèze (1962), Rothenberg (1963),

and Zellner (1965) presented at the ﬁrst World Congress of the Econometric

Society in Rome, 1965 for some Bayesian results for analyzing the important

simultaneous equations model. In my paper, I presented some numerical

integration results, obtained using ‘‘old-fashioned’’ numerical integration

methods that were of great interest to Malinvaud, whose well-known 1964

(translated from French into English in 1966) econometrics text, along with

many others, made no mention of Bayes and Bayesian methods, nor of the

early Bayesian papers that Qin (1996) cites: ‘‘The early 1960s saw pioneering

Bayesian applications in econometrics. These included published works by

Fisher (1962), Hildreth (1963), Tiao and Zellner (1964, 1965) and Zellner

and Tiao (1964) and unpublished works by Drèze (1962) and Rothenberg

(1963)’’ (pp. 503–504). See also Chetty (1968) for an early Bayesian analysis

of macroeconomic models introduced by Haavelmo.
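The ‘‘old-fashioned’’ numerical integration referred to above amounted to evaluating an unnormalized posterior on a grid and applying a quadrature rule. The sketch below does this for a made-up one-parameter problem (normal likelihood with known variance, normal prior on the mean); the data and prior settings are hypothetical, chosen so the grid answer can be checked against the conjugate result.

```python
import math

# Grid-based ("old-fashioned") numerical integration of a one-parameter
# posterior: normal likelihood with known sigma, normal prior on the mean.

def unnorm_posterior(theta, data, prior_mean=0.0, prior_sd=2.0, sigma=1.0):
    """Unnormalized posterior: likelihood times prior."""
    log_lik = sum(-0.5 * ((y - theta) / sigma) ** 2 for y in data)
    log_prior = -0.5 * ((theta - prior_mean) / prior_sd) ** 2
    return math.exp(log_lik + log_prior)

def trapezoid(xs, ys):
    """Trapezoidal rule on an ordered grid."""
    return sum(0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
               for i in range(len(xs) - 1))

data = [1.2, 0.8, 1.0]                         # hypothetical observations
grid = [-5.0 + 0.01 * i for i in range(1001)]  # theta grid on [-5, 5]
dens = [unnorm_posterior(t, data) for t in grid]

norm_const = trapezoid(grid, dens)
post_mean = trapezoid(grid, [t * d for t, d in zip(grid, dens)]) / norm_const
print(round(post_mean, 3))   # matches the conjugate answer 3/3.25 = 0.923
```

For this conjugate setup the exact posterior mean is (3 × 1.0)/(3 + 0.25) = 3/3.25 ≈ 0.923, so the grid result can be verified directly; in the 1960s such grids, rather than posterior simulation, were the workhorse.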

In spite of these and a number of other theoretical and applied Bayesian

publications that appeared in the 1960s and early 1970s, in the 1974,

completely revised, 2nd edition of Klein’s Textbook of Econometrics, he

wrote:

Bayes’ theorem gives a logical method of making probability inferences if the a priori

probabilities are known. They seldom are known and this is the objection to the use of

this theorem for most problems of statistical inference. A major contribution of decision

function theory [that he ably describes in this chapter of his book] is to show the relation

of various inferences to Bayes’ type solutions. The beauty of the theory is that it includes

hypothesis testing and estimation methods as special cases of a more general approach to

inference. (p. 64)

It is clear that Klein, along with Drèze, Leamer, and some other econometricians, had a deep understanding of the decision-theoretic approach to

Bayesian statistical inference that Ramsey, Savage, Friedman, Raiffa,


British Library Cataloguing in Publication Data

A catalogue record for this book is available from the British Library

ISBN: 978-1-84855-308-8

ISSN: 0731-9053 (Series)


LIST OF CONTRIBUTORS

Michael K. Andersson

Sveriges Riksbank, Stockholm, Sweden

Veni Arakelian

Department of Economics, University of

Crete, Rethymno, Greece

Chun-man Chan

Hong Kong Community College,

Kowloon, Hong Kong, China

Cathy W. S. Chen

Department of Statistics, Feng Chia

University, Taiwan

Siddhartha Chib

Olin Business School, Washington

University, St. Louis, MO

S. T. Boris Choy

Discipline of Operations Management

and Econometrics, University of Sydney,

NSW, Australia

Michiel de Pooter

Division of International Finance,

Financial Markets, Board of Governors

of the Federal Reserve System,

Washington, DC

Dipak K. Dey

Department of Statistics, University of

Connecticut, Storrs, CT

Deborah Gefang

Department of Economics, University of

Leicester, Leicester, UK

Richard Gerlach

Discipline of Operations Management

and Econometrics, University of Sydney,

NSW, Australia

Paolo Giordani

Research Department, Sveriges

Riksbank, Stockholm, Sweden

Jennifer Graves

Department of Economics, University of

California, Irvine, CA


William Grifﬁths

Department of Economics, University of

Melbourne, Vic., Australia

Ariun Ishdorj

Department of Economics, Iowa State

University, Ames, IA

Liana Jacobi

Department of Economics, University of

Melbourne, Vic., Australia

Ivan Jeliazkov

Department of Economics, University of

California, Irvine, CA

Helen H. Jensen

Department of Economics, Iowa State

University, Ames, IA

Sune Karlsson

Swedish Business School, Örebro
University, Örebro, Sweden

Robert Kohn

Department of Economics,

Australian School of Business,

University of New South Wales,

Sydney, Australia

Gary Koop

Department of Economics, University of

Strathclyde, Glasgow, UK

Dimitris Korobilis

Department of Economics, University of

Strathclyde, Glasgow, UK

Subal C. Kumbhakar

Department of Economics, State

University of New York, Binghamton,

NY

Mark Kutzbach

Department of Economics, University of

California, Irvine, CA

Roberto Leon-Gonzalez

National Graduate Institute for Policy

Studies (GRIPS), Tokyo, Japan

Brahim Lgui

Département de Sciences Économiques,
Université de Montréal, CIREQ,
Canada

Arto Luoma

Department of Mathematics and

Statistics, University of Tampere,

Tampere, Finland


Jani Luoto

School of Business and Economics,
University of Jyväskylä, Jyväskylä,
Finland

William J. McCausland

Département de Sciences Économiques,
Université de Montréal, CIREQ and
CIRANO, Montréal, QC, Canada

Nadine McCloud

Department of Economics, The

University of the West Indies, Mona,

Kingston, Jamaica

Murat K. Munkin

Department of Economics, University of

South Florida, Tampa, FL

Christopher J. O’Donnell

School of Economics, University of

Queensland, Brisbane, Australia

Francesco Ravazzolo

Norges Bank, Oslo, Norway

Vanessa Rayner

School of Economics, University of

Queensland, Brisbane, Australia

Rene Segers

Tinbergen Institute and Econometric

Institute, Erasmus University

Rotterdam, Rotterdam, The

Netherlands

Mike K. P. So

Department of ISOM, Hong Kong

University of Science and Technology,

Kowloon, Hong Kong

Rodney Strachan

School of Economics, The University of

Queensland, Brisbane, Australia

Sylvie Tchumtchoua

Department of Statistics, University of

Connecticut, Storrs, CT

Dek Terrell

Department of Economics, Louisiana

State University, Baton Rouge, LA

Justin Tobias

Department of Economics, Purdue

University, West Lafayette, IN

Pravin K. Trivedi

Department of Economics, Wylie Hall,

Indiana University, Bloomington, IN


Efthymios G. Tsionas

Department of Economics, Athens

University of Economics and Business,

Athens, Greece

Herman K. van Dijk

Tinbergen Institute and Econometric

Institute, Erasmus University

Rotterdam, Rotterdam, The

Netherlands

Wai-yin Wan

School of Mathematics and Statistics,

University of Sydney, NSW, Australia

Arnold Zellner

Graduate School of Business, University

of Chicago, Chicago, IL

BAYESIAN ECONOMETRICS: AN

INTRODUCTION

Siddhartha Chib, William Grifﬁths, Gary Koop and

Dek Terrell

ABSTRACT

Bayesian Econometrics is a volume in the series Advances in Econometrics

that illustrates the scope and diversity of modern Bayesian econometric

applications, reviews some recent advances in Bayesian econometrics, and

highlights many of the characteristics of Bayesian inference and

computations. This ﬁrst paper in the volume is the Editors’ introduction

in which we summarize the contributions of each of the papers.

1. INTRODUCTION

In 1996 two volumes of Advances in Econometrics were devoted to Bayesian

econometrics. One was on computational methods and applications and the

other on time-series applications. This was a time when Markov chain Monte

Carlo (MCMC) techniques, which have revolutionized applications of

Bayesian econometrics, had started to take hold. The adaptability of MCMC

to problems previously considered too difﬁcult was generating a revival of

interest in the Bayesian paradigm. Now, 12 years later, it is time for another

Advances volume on Bayesian econometrics. Use of Bayesian techniques has

Bayesian Econometrics

Advances in Econometrics, Volume 23, 3–9

Copyright © 2008 by Emerald Group Publishing Limited

All rights of reproduction in any form reserved

ISSN: 0731-9053/doi:10.1016/S0731-9053(08)23021-5


become widespread across all areas of empirical economics. Previously

intractable problems are being solved and more ﬂexible models are being

introduced. The purpose of this volume is to illustrate today’s scope and

diversity of Bayesian econometric applications, to review some of the recent

advances, and to highlight various aspects of Bayesian inference and

computations.

The book is divided into three parts. In addition to this introduction, Part I

contains papers by Arnold Zellner, and by Paolo Giordani and Robert Kohn.

In his paper ‘‘Bayesian Econometrics: Past, Present, and Future,’’ Arnold

Zellner reviews problems faced by the Federal Reserve System, as described

by its former chairman, Alan Greenspan, and links these problems to a

summary of past and current Bayesian activity. Some key contributions to the

development of Bayesian econometrics are highlighted. Future research

directions are discussed with a view to improving current econometric

models, methods, and applications of them.

The other paper in Part I is a general one on a computational strategy for

improving MCMC. Under the title ‘‘Bayesian Inference using Adaptive

Sampling,’’ Paolo Giordani and Robert Kohn discuss simulation-based

Bayesian inference methods that draw on information from previous samples

to build the proposal distributions in a given family of distributions. The

article covers approaches along these lines and the intuition behind some of

the theory for proving that the procedures work. They also discuss strategies

for making adaptive sampling more effective and provide illustrations for

variable selection in the linear regression model and for time-series models

subject to interventions.
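The adaptive idea described above, using information from earlier draws to shape the proposal, can be sketched with a random-walk Metropolis sampler whose proposal scale is tuned from the running acceptance rate. The target density (a standard normal) and the tuning rule below are illustrative choices for the sketch, not the specific procedures of Giordani and Kohn.

```python
import math
import random

# Minimal adaptive random-walk Metropolis sketch: the proposal scale is
# nudged, with diminishing adaptation, toward a target acceptance rate.

def log_target(x):
    """Log density of the target, a standard normal, up to a constant."""
    return -0.5 * x * x

def adaptive_rw_metropolis(n_iter=20000, target_accept=0.44, seed=1):
    rng = random.Random(seed)
    x, scale, accepted, draws = 0.0, 1.0, 0, []
    for i in range(1, n_iter + 1):
        prop = x + rng.gauss(0.0, scale)
        delta = log_target(prop) - log_target(x)
        if rng.random() < math.exp(min(0.0, delta)):
            x, accepted = prop, accepted + 1
        draws.append(x)
        # Diminishing adaptation: move the scale so the running acceptance
        # rate approaches the target; the 1/sqrt(i) factor damps changes.
        rate = accepted / i
        scale = max(1e-3, scale * math.exp((rate - target_accept) / math.sqrt(i)))
    return draws, scale

draws, final_scale = adaptive_rw_metropolis()
burned = draws[5000:]                      # discard burn-in
mean = sum(burned) / len(burned)
var = sum((d - mean) ** 2 for d in burned) / len(burned)
print(round(mean, 2), round(var, 2))       # near 0 and 1 for this target
```

The sampled mean and variance land near the target’s 0 and 1, illustrating how adaptation removes the need to hand-tune the proposal.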

2. MICROECONOMETRIC MODELING

Part II of the book, entitled ‘‘Microeconometric Modeling’’ contains

applications that use cross-section or panel data. The paper by Murat K.

Munkin and Pravin K. Trivedi, ‘‘A Bayesian Analysis of the OPES Model

with a Nonparametric Component: An Application to Dental Insurance and

Dental Care,’’ is a good example of how Bayesian methods are increasingly

being used in important empirical work. The empirical focus is on the impact

of dental insurance on the use of dental services. Addressing this issue is

complicated by the potential endogeneity of insurance uptake and the fact

that insurance uptake may depend on explanatory variables in a nonlinear

fashion. The authors develop an appropriate model which addresses both

these issues and carry out an empirical analysis which ﬁnds strong evidence


that having dental insurance encourages use of dentists, but also evidence
of adverse selection into the insured state.

MCMC simulation techniques are particularly powerful in discrete-data

models with latent variable representations. In their paper ‘‘Fitting and

Comparison of Models for Multivariate Ordinal Outcomes,’’ Ivan Jeliazkov,

Jennifer Graves, and Mark Kutzbach review several alternative modeling

and identiﬁcation schemes for ordinal data models and evaluate how each

aids or hampers estimation using MCMC. Model comparison via marginal

likelihoods and an analysis of the effects of covariates on category probabilities is considered for each parameterization. The methods are applied to

examples in educational attainment, voter opinions, and consumers’ reliance

on alternative sources of medical information.

In ‘‘Intra-Household Allocation and Consumption of WIC-Approved

Foods: A Bayesian Approach,’’ Ariun Ishdorj, Helen H. Jensen, and Justin

Tobias consider the Special Supplemental Nutrition Program for Women,

Infants, and Children (WIC) that aims to provide food, nutrition education,

and other services to at-risk, low-income children and pregnant, breastfeeding, and postpartum women. They assess the extent to which the WIC

program improves the nutritional outcomes of WIC families as a whole,

including the targeted and nontargeted individuals within the household.

This question is considered under the possibility that participation in the

program (which is voluntary) is endogenous. They develop an appropriate

treatment–response model and conclude that WIC participation does not

lead to increased levels of calcium intake from milk.

A second paper that illustrates the use of Bayesian techniques for analyzing treatment–response problems is that by Siddhartha Chib and Liana Jacobi. In their paper "Causal Effects from Panel Data in Randomized Experiments with Partial Compliance," the authors describe how to calculate the causal impacts from a training program when noncompliance exists in the training arm. Two primary models are considered, with one model including a random effects specification. Prior elicitation is carefully done by simulating from a prior predictive density on outcomes, using a hold-out sample. Estimation and model comparison are considered in detail. The methods are employed to assess the impact of a job training program on mental health scores.

Basic equilibrium job search models often yield wage densities that do not accord well with empirical regularities. When extensions to basic models are made and analyzed using kernel-smoothed nonparametric forms, it is difficult to assess these extensions via model comparisons. In "Parametric and Nonparametric Inference in Equilibrium Job Search Models," Gary Koop develops Bayesian parametric and nonparametric methods that are comparable to those in the existing non-Bayesian literature. He then shows how Bayesian methods can be used to compare the different parametric and nonparametric equilibrium search models in a statistically rigorous sense.

In the paper "Do Subsidies Drive Productivity? A Cross-Country Analysis of Nordic Dairy Farms," Nadine McCloud and Subal C. Kumbhakar develop a Bayesian hierarchical model of farm production which allows for the calculation of input productivity, efficiency, and technical change. The key research questions relate to whether and how these are influenced by subsidies. Using a large panel of Nordic dairy farms, they find that subsidies drive productivity through technical efficiency and input elasticities, although the magnitude of these effects differs across countries.

The richness of available data and the scope for building flexible models make marketing a popular area for Bayesian applications. In "Semiparametric Bayesian Estimation of Random Coefficients Discrete Choice Models," Sylvie Tchumtchoua and Dipak K. Dey propose a semiparametric Bayesian framework for the analysis of random coefficients discrete choice models that can be applied to both individual and aggregate data. Heterogeneity is modeled using a Dirichlet process prior which (importantly) varies with consumer characteristics through covariates. The authors employ an MCMC algorithm for fitting their model, and illustrate the methodology using a household-level panel dataset of peanut butter purchases, and supermarket chain-level data for 31 ready-to-eat breakfast cereal brands.

When diffuse priors are used to estimate simultaneous equation models, the resulting posterior density can possess infinite asymptotes at points of local nonidentification. Kleibergen and Zivot (2003) introduced a prior to overcome this problem in the context of a restricted reduced form specification, and investigated the relationship between the resulting Bayesian estimators and their classical counterparts. Arto Luoma and Jani Luoto, in their paper "Bayesian Two-Stage Regression with Parametric Heteroscedasticity," extend the analysis of Kleibergen and Zivot to a simultaneous equation model with unequal error variances. They apply their techniques to a cross-country Cobb–Douglas production function.

3. TIME-SERIES MODELING

Part III of the volume is devoted to models and applications that use time-series data. The first paper in this part is "Bayesian Near-Boundary Analysis in Basic Macroeconomic Time-Series Models" by Michiel D. de Pooter, Francesco Ravazzolo, Rene Segers, and Herman K. van Dijk. The boundary issues considered by these authors are similar to those encountered by Arto Luoma and Jani Luoto in their paper. There are a number of models where the use of particular types of noninformative priors can lead to improper posterior densities, with estimation breaking down at boundary values of parameters. The circumstances under which such problems arise, and how the problems can be solved using regularizing or truncated priors, are examined in detail by de Pooter et al. in the context of dynamic linear regression models, autoregressive and error correction models, instrumental variable models, variance component models, and state space models. Analytical, graphical, and empirical results using U.S. macroeconomic data are presented.

In his paper "Forecasting in Vector Autoregressions with Many Predictors," Dimitris Korobilis introduces Bayesian model selection methods in a VAR setting, focusing on the problem of drawing inferences from a dataset with a very large number of potential predictors. A stochastic search variable selection algorithm is used to implement Bayesian model selection. An empirical application using 124 potential predictors to forecast eight U.S. macroeconomic variables is included to demonstrate the methodology. Results indicate an improvement in forecasting accuracy over model selection based on the Bayesian information criterion.
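Stochastic search variable selection works by placing a spike-and-slab mixture prior on each coefficient and, within the Gibbs sampler, updating a binary inclusion indicator given the current coefficient draw. The sketch below illustrates that indicator update in the spirit of George and McCulloch's formulation; the prior scales and coefficient value are hypothetical, and this is not Korobilis's actual algorithm.

```python
import math

# Spike-and-slab indicator update for one coefficient (hypothetical numbers).
def normal_pdf(x, sd):
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

tau0, tau1 = 0.01, 1.0   # spike (concentrated near zero) and slab std. deviations
prior_incl = 0.5         # prior probability that the predictor is included
beta = 0.4               # current Gibbs draw of the coefficient

# Conditional posterior probability that the inclusion indicator equals 1:
# the spike density treats beta as effectively zero, the slab says it matters.
num = prior_incl * normal_pdf(beta, tau1)
p_incl = num / (num + (1 - prior_incl) * normal_pdf(beta, tau0))
print(round(p_incl, 4))
```

A coefficient draw far outside the spike's scale, as here, yields an inclusion probability near one; draws near zero would favor exclusion.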

In "Bayesian Inference in a Cointegrating Panel Data Model," Gary Koop, Robert Leon-Gonzalez, and Rodney Strachan focus on cointegration in the context of a cointegrating panel data model. Their approach allows both short-run dynamics and the cointegrating rank to vary across cross-sectional units. In addition to an uninformative prior, they propose an informative prior with "soft homogeneity" restrictions. This informative prior can be used to include information from economic theory that cross-sectional units are likely to share the same cointegrating rank, without forcing that assumption on the data. Empirical applications using simulated data and a long-run model for bilateral exchange rates are used to demonstrate the methodology.

Cointegration is also considered by Deborah Gefang, who develops tests of purchasing power parity (PPP) within an exponential smooth transition vector error correction model (ESVECM) framework. The Bayesian approach offers a substantial methodological advantage in this application because the Gibbs sampling scheme is not affected by the multi-mode problem created by nuisance parameters. Results based on Bayesian model averaging and Bayesian model selection find evidence that PPP holds between the United States and each of the remaining G7 countries.


"Bayesian Forecast Combination for VAR Models" by Michael K. Andersson and Sune Karlsson addresses the issue of how to forecast a variable (or variables) of interest (e.g., GDP) when there is uncertainty about the dimension of the VAR and uncertainty about which set of explanatory variables should be used. This uncertainty leads to a huge set of models. The authors perform model averaging over the resulting high-dimensional model space using predictive likelihoods as weights. For forecast horizons greater than one, the predictive likelihoods will not have analytical forms, and the authors develop a simulation method for estimating them. An empirical analysis involving U.S. GDP shows the benefits of their approach.
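The combination step itself is simple once the predictive likelihoods are in hand: normalize them into weights and average the model forecasts. A minimal sketch with purely hypothetical numbers (three candidate models and their predictive likelihoods), not the authors' empirical setup:

```python
import numpy as np

pred_lik = np.array([0.8, 0.5, 0.1])    # predictive likelihood of each model (hypothetical)
forecasts = np.array([2.1, 1.8, 3.0])   # each model's point forecast of GDP growth (hypothetical)

weights = pred_lik / pred_lik.sum()     # model weights under equal prior odds
combined = float(weights @ forecasts)   # model-averaged forecast
print(round(combined, 3))               # prints 2.057
```

Models that predicted past data well dominate the average, while poorly performing models are downweighted rather than discarded outright.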

In "Bayesian Inference on Time-Varying Proportions," William J. McCausland and Brahim Lgui derive a highly efficient algorithm for simulating the states in state space models where the dependent variables are proportions. The authors argue in favor of a model which is parameterized such that the measurement equation has the proportions (conditional on the states) following a Dirichlet distribution, but the state equation is a standard linear Gaussian one. The authors develop a Metropolis–Hastings algorithm which draws states as a block from a multivariate Gaussian proposal distribution. Extensive empirical evidence indicates that their approach works well and, in particular, is very efficient.

Christopher J. O'Donnell and Vanessa Rayner use Bayesian methodology to impose inequality restrictions on ARCH and GARCH models in their paper "Imposing Stationarity Constraints on the Parameters of ARCH and GARCH Models." Bayesian model averaging is used to resolve uncertainty with regard to model selection. The authors apply the methodology to data from the London Metal Exchange and find that results are generally insensitive to the imposition of inequality restrictions.

In "Bayesian Model Selection for Heteroskedastic Models," Cathy W. S. Chen, Richard Gerlach, and Mike K. P. So discuss Bayesian model selection for a wide variety of financial volatility models that exhibit asymmetries (e.g., threshold GARCH models). Model selection problems are complicated by the fact that there are many contending models and marginal likelihood calculation can be difficult. They discuss this problem in an empirical application involving daily data from three Asian stock markets and calculate the empirical support for their competing models.

Using a scale mixture of uniform densities representation of the Student-t density, S. T. Boris Choy, Wai-yin Wan, and Chun-man Chan provide a Bayesian analysis of a Student-t stochastic volatility model in "Bayesian Student-t Stochastic Volatility Models via Scale Mixtures." They develop a Gibbs sampler for their model and show how their approach can be extended to the important class of Student-t stochastic volatility models with leverage. The different models are fit to returns on exchange rates of the Australian dollar against 10 currencies.
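The general idea behind such representations is that a heavy-tailed density can be generated by mixing a simpler density over a random scale, which is what makes Gibbs sampling tractable. The sketch below uses the more familiar scale mixture of normals (a normal with gamma-distributed precision has a Student-t marginal), not the scale mixture of uniforms the authors actually employ; the degrees of freedom and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
nu, n = 5.0, 200_000  # degrees of freedom and sample size (arbitrary)

# Draw lambda ~ Gamma(nu/2, rate nu/2), then x | lambda ~ N(0, 1/lambda).
lam = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
x = rng.normal(0.0, 1.0 / np.sqrt(lam))

# Marginally, x is Student-t with nu degrees of freedom,
# so its sample variance should be close to nu / (nu - 2).
print(x.var(), nu / (nu - 2.0))
```

Conditional on the latent scales, the model is Gaussian, so standard conjugate updates apply at each Gibbs step; the scales themselves are then refreshed from their own conditional distribution.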

In "Bayesian Analysis of the Consumption CAPM," Veni Arakelian and Efthymios G. Tsionas show that Labadie's (1989) solution to the CAPM can be applied to obtain a closed-form solution and to provide a traditional econometric interpretation. They then apply Bayesian inference to both simulated data and the Mehra and Prescott (1985) dataset. Results generally conform to theory, but also reveal asymmetric marginal densities for key parameters. The asymmetry suggests that techniques such as generalized method of moments, which rely on asymptotic approximations, may be unreliable.

REFERENCES

Kleibergen, F., & Zivot, E. (2003). Bayesian and classical approaches to instrumental variable regression. Journal of Econometrics, 114, 29–72.
Labadie, P. (1989). Stochastic inflation and the equity premium. Journal of Monetary Economics, 24, 195–205.
Mehra, R., & Prescott, E. C. (1985). The equity premium: A puzzle. Journal of Monetary Economics, 15, 145–162.

BAYESIAN ECONOMETRICS: PAST, PRESENT, AND FUTURE

Arnold Zellner

ABSTRACT

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

1. INTRODUCTION

In the first two sentences of her paper, "Bayesian Econometrics, The First Twenty Years," Qin (1996) wrote, "Bayesian econometrics has been a controversial area in the development of econometric methodology. Although the Bayesian approach has been constantly dismissed by many mainstream econometricians for its subjectivism, Bayesian methods have been adopted widely in current econometric research" (p. 500). This was written more than 10 years ago. Now more mainstream econometricians and many others have adopted the Bayesian approach and are using it to solve a broad range of econometric problems, in line with my forecast in Zellner (1974): "Further, it must be recognized that the B approach is in a stage of rapid development with work going ahead on many new problems and applications. While this is recognized, it does not seem overly risky to conclude that the B approach, which already has had some impact on econometric work, will have a much more powerful influence in the next few years" (p. 54).

Bayesian Econometrics
Advances in Econometrics, Volume 23, 11–60
Copyright © 2008 by Emerald Group Publishing Limited
All rights of reproduction in any form reserved
ISSN: 0731-9053/doi:10.1016/S0731-9053(08)23001-X

See also Zellner (1981, 1988b, 1991, 2006) for more on the past, present, and future of Bayesian econometrics, in which it is emphasized that all econometricians use and misuse prior information, subjectively, objectively, or otherwise. It has also been pointed out that Bayesian econometricians learn using an explicit model, Bayes' Theorem, that allows prior information to be employed in a formal and reproducible manner, whereas non-Bayesian econometricians learn in an informal, subjective manner. For empirical evidence on the rapid growth of Bayesian publications over the years in economics and other fields that will be discussed below, see Poirier (1989, 1992, 2004); see Poirier (1991) for an interesting set of Bayesian empirical papers dealing with problems in economics and finance.

In the early 1990s, both the International Society for Bayesian Analysis (http://www.bayesian.org) and the Section on Bayesian Statistical Science of the American Statistical Association (http://www.amstat.org) were formed and have been very active and successful in encouraging the growth of Bayesian theoretical and applied research and publications. Similarly, the NBER-NSF Seminar on Bayesian Inference in Econometrics and Statistics (SBIES), which commenced operation in 1970, has been effective for many years in sponsoring research meetings, publishing a number of Bayesian books, and actively supporting the creation of ISBA and SBSS in the early 1990s. In Berry, Chaloner, and Geweke (1996), some history of the SBIES and a large number of Bayesian research papers are presented. Also, under the current leadership of Sid Chib, very productive meetings of this seminar were held in 2004 and 2005, organized by him and John Geweke. In August 2006, the European–Japanese Bayesian Workshop held a meeting in Vienna, organized by Wolfgang Polasek, that had a very interesting program. In 2005, the Indian Bayesian Society and the Indian Bayesian Chapter of ISBA had an international Bayesian meeting at Varanasi, with many of the papers presented there appearing in a conference volume. In September 2006, a Bayesian research meeting was held at the Royal Bank of Sweden, organized by Mattias Villani, that attracted leading Bayesian econometricians from all over the world to present reports on their current work on Bayesian econometric methodology. And now, this Advances in Econometrics volume features additional valuable Bayesian econometric research. And last, but not least, the International Society for Bayesian Analysis has commenced publication of an online Bayesian journal called Bayesian Analysis; see http://www.bayesian.org for more information about this journal, with R. Kass the founding editor, and for listings of articles for several years that are downloadable. These and many more Bayesian activities that have taken place over the years attest to the growth and vitality of Bayesian analysis in many sciences, industries, and governments worldwide.

1.1. An Example of Bayesian Monetary Policy-Making

As an example of extremely important work involving the use of Bayesian methodology and analysis, Alan Greenspan, former Chairman of the U.S. Federal Reserve System, presented an invited paper, "Risk and Uncertainty in Monetary Policy," at the 2004 Meeting of the American Economic Association; it was published in the American Economic Review in 2004 along with very knowledgeable discussion by Martin Feldstein, Harvard Professor of Economics and President of the National Bureau of Economic Research, Mervyn King of the Bank of England, and Professor Janet L. Yellen of the Haas School of Business, University of California, Berkeley. The paper is notable in that it presents a comprehensive description of the ways in which he approached and solved monetary policy problems " . . . from the perspective of someone who has been in the policy trenches" (p. 33).

Greenspan's account should be of interest to Bayesian econometricians and many others since he states, "In essence, the risk management approach to policymaking is an application of Bayesian decision-making" (p. 37). In addition, he writes, "Our problem is not, as is sometimes alleged, the complexity of our policy-making process, but the far greater complexity of a world economy whose underlying linkages appear to be continuously evolving. Our response to that continuous evolution has been disciplined by the Bayesian type of decision-making in which we have been engaged" (p. 39).

Feldstein (2004), after providing an excellent review of Greenspan's successful policy-making in the past, wrote, "Chairman Greenspan emphasized that dealing with uncertainty is the essence of making monetary policy (see also Feldstein, 2002). The key to what he called the risk-management approach to monetary policy is the Bayesian theory of decision-making" (p. 42). After providing a brief, knowledgeable description of Bayesian decision theory, Feldstein gives the following example to illustrate a case of asymmetric loss in connection with a person deciding whether to carry an umbrella when the probability of rain is not high. "If he carries the umbrella and it does not rain, he is mildly inconvenienced. But if he does not carry the umbrella and it rains, he will suffer getting wet. A good Bayesian finds himself carrying an umbrella on many days when it does not rain. The policy actions of the past year were very much in this spirit. The Fed cut the interest rate to 1 percent to prevent the low-probability outcome of spiraling deflation because it regarded that outcome as potentially very damaging while the alternative possible outcome of a rise of the inflation rate from 1.5 percent to 2.5 percent was deemed less damaging and more easily reversed" (p. 42).
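Feldstein's umbrella story is simply expected-loss minimization under an asymmetric loss function. A minimal sketch with hypothetical loss values (the 1 and 10 are illustrative, not from the paper):

```python
# Asymmetric-loss decision: carry an umbrella or not, given a rain probability.
p_rain = 0.3
loss = {
    ("carry", "rain"): 1.0,   # mild inconvenience of carrying it
    ("carry", "dry"): 1.0,
    ("leave", "rain"): 10.0,  # getting wet is much worse
    ("leave", "dry"): 0.0,
}

def expected_loss(action, p):
    return p * loss[(action, "rain")] + (1.0 - p) * loss[(action, "dry")]

best = min(("carry", "leave"), key=lambda a: expected_loss(a, p_rain))
print(best)  # prints "carry"
```

With these losses, carrying is optimal whenever the rain probability exceeds 1/10, which is exactly why a good Bayesian carries an umbrella on many dry days; the Fed's 1 percent rate cut insured against spiraling deflation in the same way.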

Mervyn King of the Bank of England commented knowingly about model quality and policy-making: "Greenspan suggests that the risk-management approach is an application of Bayesian decision-making when there is uncertainty about the true model of the economy. Policy that is optimal in one particular model of the economy may not be 'robust' across a class of other models. In fact, it may lead to a very bad outcome should an alternative model turn out to be true . . . Of course, although such an approach is sensible, it is still vulnerable to policymakers giving excessive weight to misleading models of the economy. . . . But, in the end, there is no escaping the need to make judgments about which models are more plausible than others" (pp. 42–43). These are indeed very thoughtful remarks about problems of model uncertainty in making policy, but they do not recognize that, just as with Feldstein's umbrella example above, a Bayesian analysis can utilize posterior probabilities associated with alternative models that reflect the quality of past performance; such probabilities have been shown to be useful in producing combined forecasts and will probably be helpful in dealing with model uncertainty in policy-making.

1.2. Greenspan’s Policy-Making Problems

Below, I list and label important problems that Greenspan mentioned in connection with his successful policy-making over the years that reveal his deep understanding of both obvious and very subtle problems associated with model-building, economic analyses, forecasting, and policy-making.

1. Structural changes: For example, " . . . increased political support for stable prices, globalization which unleashed powerful new forces of competition, and an acceleration of productivity which at least for a time held down cost pressures" (p. 33). "I believe that we at the Fed, to our credit, did gradually come to recognize the structural economic changes that we were living through and accordingly altered our understanding of the key parameters of the economic system and our policy stance . . . . But as we lived through it, there was much uncertainty about the evolving structure of the economy and about the influence of monetary policy" (p. 33).

2. Forecasting: "In recognition of the lag in monetary policy's impact on economic activity, a preemptive response to the potential for building inflationary pressures was made an important feature of policy. As a consequence, this approach elevated forecasting to an even more prominent place in policy deliberations" (p. 33).

3. Unintended consequences: "Perhaps the greatest irony of the past decade is that the gradually unfolding success against inflation may well have contributed to the stock price bubble of the latter part of the 1990s . . . The sharp rise in stock prices and their subsequent fall were, thus, an especial challenge to the Federal Reserve" (p. 35). "The notion that a well-timed incremental tightening could have been calibrated to prevent the late 1990s bubble while preserving economic stability is almost surely an illusion. Instead of trying to contain a putative bubble by drastic actions with largely unpredictable consequences, we chose . . . to focus on policies to mitigate the fallout when it occurs and, hopefully, ease the transition to the next expansion" (p. 36).

4. Uncertainty: "The Federal Reserve's experiences over the past two decades make it clear that uncertainty is not just a pervasive feature of the monetary landscape; it is the defining characteristic of that landscape. The term 'uncertainty' is meant here to encompass both 'Knightian uncertainty,' in which the probability distribution of outcomes is unknown, and 'risk,' in which uncertainty of outcomes is delimited by a known probability distribution. In practice, one is never quite sure what type of uncertainty one is dealing with in real time, and it may be best to think of a continuum ranging from well-defined risks to the truly unknown" (pp. 36–37).

5. Risk management: "As a consequence, the conduct of monetary policy in the United States has come to involve, at its core, crucial elements of risk management. This conceptual framework emphasizes understanding as much as possible the many sources of risk and uncertainty that policymakers face, quantifying those risks, when possible, and assessing costs associated with each of the risks. In essence, the risk-management approach to monetary policymaking is an application of Bayesian decision-making" (p. 37).

6. Objectives: "This [risk management] framework also entails devising, in light of those risks, a strategy for policy directed at maximizing the probabilities of achieving over time our goals of price stability and the maximum sustainable economic growth that we associate with it" (p. 37).

7. Expert opinion: "In designing strategies to meet our policy objectives, we have drawn on the work of analysts, both inside and outside the Fed, who over the past half century have devoted much effort to improving our understanding of the economy and its monetary transmission mechanism" (p. 37).

8. Model uncertainty: "A critical result [of efforts to improve our understanding of the economy and its monetary transmission mechanism] has been the identification of a relatively small set of key relationships that, taken together, provide a useful approximation of our economy's dynamics. Such an approximation underlies the statistical models that we at the Federal Reserve employ to assess the likely influence of our policy decisions. However, despite extensive efforts to capture and quantify what we perceive as the key macroeconomic relationships, our knowledge about many of the important linkages is far from complete and, in all likelihood, will always remain so. Every model, no matter how detailed or how well designed, conceptually and empirically, is a vastly simplified representation of the world that we experience with all its intricacies on a day-to-day basis" (p. 37).

9. Loss structures: "Given our inevitably incomplete knowledge about key structural aspects of an ever-changing economy and the sometimes asymmetric costs or benefits of particular outcomes, a central bank needs to consider not only the most likely future path for the economy, but also the distribution of possible outcomes about that path. The decision-makers then need to reach a judgment about the probabilities, costs and benefits of the various possible outcomes under alternative choices for policy" (p. 37).

10. Robustness of policy: "In general, different policies will exhibit different degrees of robustness with respect to the true underlying structure of the economy" (p. 37).

11. Cost–benefit analysis: "As this episode illustrates, policy practitioners operating under a risk-management paradigm may, at times, be led to undertake actions intended to provide insurance against [low-probability] especially adverse outcomes . . . . The product of a low-probability event and a potentially severe outcome was judged a more serious threat to economic performance than the higher inflation that might ensue in the more probable scenario" (p. 37).


12. Knightian uncertainty: "When confronted with uncertainty, especially Knightian uncertainty, human beings invariably attempt to disengage from medium- to long-term commitments in favor of safety and liquidity. Because economies, of necessity, are net long (that is, have net real assets) attempts to flee these assets causes prices of equity assets to fall, in some cases dramatically . . . The immediate response on the part of the central bank to such financial implosions must be to inject large quantities of liquidity . . . " (p. 38).

13. Parameters (fixed- and time-varying): "The economic world in which we function is best described by a structure whose parameters are continuously changing. . . . We often fit simple models [with fixed parameters] only because we cannot estimate a continuously changing set of parameters without vastly more observations than are currently available to us" (p. 38).

14. Multiple risks: "In pursuing a risk-management approach to policy, we must confront the fact that only a limited number of risks can be quantified with any confidence . . . . Policy makers often have to act, or choose not to act, even though we may not fully understand the full range of possible outcomes, let alone each possible outcome's likelihood. As a result, risk management often involves significant judgment as we evaluate the risks of different events and the probability that our actions will alter those risks" (p. 38).

15. Policy rules: "For such judgment [mentioned above], policymakers have needed to reach beyond models to broader, though less mathematically precise, hypotheses about how the world works. For example, inferences about how market participants and, hence, the economy might respond to a monetary policy initiative may need to be drawn from evidence about past behavior during a period only roughly comparable to the current situation. Some critics have argued that such an approach to policy is too undisciplined – judgmental, seemingly discretionary, and difficult to explain. The Federal Reserve, they conclude, should attempt to be more formal in its operations by tying its actions, solely, on the weaker paradigm, largely, to the prescriptions of a simple policy rule. Indeed, rules that relate the setting of the federal funds rate to the deviations of output and inflation from their respective targets, in some configurations, do seem to capture the broad contours of what we did over the past decade and a half. And the prescriptions of formal rules can, in fact, serve as helpful adjuncts to policy, as many of the proponents of these rules have suggested. But at crucial points, like those of our recent policy history (the stock market crash of 1987, the crises of 1997–1998, and the events that followed September, 2001), simple rules will be inadequate as either descriptions or prescriptions for policy. Moreover, such rules suffer from much of the same fixed-coefficient difficulties we have with our large-scale models" (pp. 38–39).

16. Forecasting: "While all, no doubt, would prefer that it were otherwise, there is no way to dismiss what has to be obvious to every monetary policymaker. The success of monetary policy depends importantly on the quality of forecasting. The ability to gauge risks implies some judgment about how current economic imbalances will ultimately play out . . . . Thus, both econometric and qualitative models need to be continually tested" (p. 39).

17. Monetary policy: "In practice, most central banks, at least those not bound by an exchange-rate peg, behave in roughly the same way. They seek price stability as their long term goal and, accounting for the lag in monetary policy, calibrate the setting of the policy rate accordingly. . . . All banks ease when economic conditions ease and tighten when economic conditions tighten, even if in differing degrees, regardless of whether they are guided by formal or informal inflation targets" (p. 39).

18. Uncontrolled outcomes and targets: "Most prominent is the appropriate role of asset prices in policy. In addition to the narrower issue of product price stability, asset prices will remain high on the research agenda of central banks for years to come. . . . There is little dispute that the prices of stocks, bonds, homes, real estate, and exchange rates affect GDP. But most central banks have chosen, at least to date, to view asset prices not as targets of policy, but as economic variables to be considered through the prism of the policy's ultimate objective" (p. 40).

19. Performance rating: "We were fortunate . . . to have worked in a particularly favorable structural and political environment. But we trust that monetary policy has meaningfully contributed to the impressive performance of our economy in recent decades" (p. 40). Further evaluation of current monetary policies dealing with the 2007–2008 credit crisis is an important issue.

1.3. Greenspan’s Problems and Econometric Research

It is of interest to relate Greenspan's problem areas to current and past Bayesian econometric research. In econometric research, along with other scientific research, three main areas of activity have been recognized, namely, deduction, induction, and reduction; see Jeffreys (1957, 1939 [1998]) and Zellner (1985, pp. 3–10 and 1996, Chapter 1) for discussions of these topics and references to the huge literature on the definitions and other aspects of these research areas. Briefly, deduction involves use of logic and mathematics to prove propositions given certain assumptions. Induction involves development and use of measurement, description, estimation, testing, prediction, and decision-making procedures, while reduction involves creating new models and methods that are helpful in explaining the past, predicting as yet unobserved outcomes at various places and/or times, and solving private and public decision problems.

While much more can be and has been said about deduction, induction, and reduction, most will agree about the difficulty of producing good new or improved models that work well in explanation, prediction, and decision-making. However, as we improve our understanding of these three areas and their interrelations in past and current work, and engage in more empirical predictive and other testing of alternative models and methods, testing that is much needed in the evaluation of alternative macroeconomic models, as emphasized by Christ (1951, 1975), Fair (1992), and many others, more rapid progress will undoubtedly result.

A categorization of Greenspan’s problems by their nature is shown in

Table 1.

It is seen that many of Greenspan’s problems have a deductive or

theoretical aspect to them but, as recognized in the literature, deduction

alone is inadequate for scientiﬁc work for a variety of reasons, perhaps best

summarized by the old adage, ‘‘Logical proof does not imply complete

certainty of outcomes,’’ as widely appreciated in the philosophical literature

and elsewhere. Perhaps, the most striking aspect of Table 1 is the large

number of entries in category III, reduction. Economic theorists, econometricians, and others have to get busy producing new models and methods that

are effective in helping to solve former Chairman Greenspan’s and now

Chairman Bernanke’s problems.

Table 1.  Tabulation of Greenspan’s Problems Listed Above.

  Categories        Problem Numbers
  (I) Deduction     3, 4, 6, 9, 10, 11, 12, 13, 14, 16, 17, 19
  (II) Induction    2, 3, 6, 7, 8, 9, 10, 11, 13, 14, 15, 16, 17, 18, 19
  (III) Reduction   1, 4, 8, 12, 13, 15, 16, 18

ARNOLD ZELLNER

See Hadamard (1945) for the results of a survey of mathematicians that
provides information on how major breakthroughs in mathematics occurred
and tips on how to create new theories in

mathematics that may also be helpful in reductive econometric work as

discussed in Zellner (1985, pp. 8–10). Also, in Zellner and Palm (2004) some

methods for creating new econometric models and checking old econometric

models and applications of them by a number of researchers are presented

that may be helpful in the production of new econometric models that

perform well in explanation, prediction, and policy-making. More will be

said about this reductive problem area below.

1.4. Overview of Paper

With this by way of an introduction to some current problems facing us,
in Sections 2 and 3 we review some early and recent work in Bayesian
econometrics and relate it to some of the problems mentioned by Chairman
Greenspan, and in Section 4 we consider possible future developments in
Bayesian econometrics.

2. THE PAST

2.1. Early Bayesian Econometrics

As is the case with many others who commenced study of econometrics in

the 1950s, in my graduate econometrics courses at the University of

California at Berkeley there was no mention of Bayesian topics except in a

game theory course that I took with David Blackwell (who many years later

introduced an elementary Bayesian statistics course at Berkeley using Berry’s

(1996) text). Also, there was no mention of Bayesian analysis in Tintner’s

(1952) popular text or in most Cowles Commission publications. Klein’s
Textbook of Econometrics (1953, p. 62), however, did include some discussion
of Bayesian decision theory, along with a reservation about prior
distributions that he apparently abandoned later in an invited paper, ‘‘Whither

Econometrics?’’ published in JASA in which Klein (1971) wrote, ‘‘Bayesian

methods attempt to treat a priori information in a systematic way. As a pure

and passive forecaster of econometric methodology I can see a great deal of

future research effort being channeled in that direction. Systematic ways of

introducing a priori information are to be desired’’ (p. 420). Also Theil’s

(1978) econometrics text included a chapter titled, ‘‘Bayesian Inference and


Rational Random Behavior’’ in which he explained Bayes’ theorem and

provided some interesting applications of it. However, he expressed strong

reservations about improper prior distributions and also wrote, ‘‘The

Bayesian approach is itself a matter of considerable controversy. This is not

surprising, given that the approach takes a fundamentally different view of

the nature of the parameters by treating them as random variables’’ (p. 254).

There is no question but that Klein’s forecast regarding future

econometric methodology, presented above, has been quite accurate. Much

past and current Bayesian research is indeed focused on how to formulate

and use prior distributions and models that incorporate ‘‘a priori

information’’ in analyses of a wide range of estimation, prediction, and

control problems with applications in many ﬁelds using ﬁxed and random

parameter models. See the early papers by Dre`ze (1962), Rothenberg (1963),

and Zellner (1965) presented at the ﬁrst World Congress of the Econometric

Society in Rome, 1965 for some Bayesian results for analyzing the important

simultaneous equations model. In my paper, I presented some numerical

integration results, obtained using ‘‘old-fashioned’’ numerical integration

methods that were of great interest to Malinvaud whose well-known 1964

(translated from French into English in 1966) econometrics text, along with

many others, made no mention of Bayes and Bayesian methods, nor of the

early Bayesian papers that Qin (1996) cites: ‘‘The early 1960s saw pioneering

Bayesian applications in econometrics. These included published works by

Fisher (1962), Hildreth (1963), Tiao and Zellner (1964, 1965) and Zellner

and Tiao (1964) and unpublished works by Dre`ze (1962) and Rothenberg

(1963)’’ (pp. 503–504). See also Chetty (1968) for an early Bayesian analysis

of macroeconomic models introduced by Haavelmo.
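The ‘‘old-fashioned’’ numerical integration mentioned above can be sketched briefly. The example below is illustrative only (it is not Zellner's 1965 computation, and the data, flat prior, and normal model are hypothetical): posterior quantities are computed by deterministic quadrature on a parameter grid, the pre-MCMC approach to Bayesian integrals.

```python
import numpy as np

# Hedged illustration: posterior moments of a normal-model mean obtained by
# deterministic quadrature on a parameter grid, the "old-fashioned" numerical
# integration style that preceded MCMC.
data = np.array([1.2, 0.7, 1.9, 1.4, 0.9])   # hypothetical observations
sigma = 1.0                                   # scale assumed known

grid = np.linspace(-5.0, 5.0, 2001)           # grid of theta values
w = grid[1] - grid[0]                         # grid spacing for Riemann sums
prior = np.ones_like(grid)                    # flat prior on theta

# Log-likelihood at each grid point under the normal model
loglike = -0.5 * ((data[:, None] - grid[None, :]) ** 2).sum(axis=0) / sigma**2
post = prior * np.exp(loglike - loglike.max())  # unnormalized posterior
post /= post.sum() * w                          # normalize numerically

post_mean = (grid * post).sum() * w             # posterior mean by quadrature
post_var = ((grid - post_mean) ** 2 * post).sum() * w
```

With a flat prior and known scale, the quadrature posterior mean should match the sample mean and the posterior variance should be close to sigma**2/n, which gives a quick numerical check on the grid range and spacing.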

In spite of these and a number of other theoretical and applied Bayesian

publications that appeared in the 1960s and early 1970s, in the completely
revised 1974 second edition of Klein’s Textbook of Econometrics, he
wrote:

Bayes’ theorem gives a logical method of making probability inferences if the a priori

probabilities are known. They seldom are known and this is the objection to the use of

this theorem for most problems of statistical inference. A major contribution of decision

function theory [that he ably describes in this chapter of his book] is to show the relation

of various inferences to Bayes’ type solutions. The beauty of the theory is that it includes

hypothesis testing and estimation methods as special cases of a more general approach to

inference. (p. 64)
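The ‘‘Bayes’ type solutions’’ in Klein’s passage rest on Bayes’ theorem in its standard parametric form:

```latex
% Posterior is proportional to likelihood times prior
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)}
\;\propto\; p(y \mid \theta)\, p(\theta)
```

Here p(theta) is the prior distribution (the ‘‘a priori probabilities’’ of Klein’s objection), p(y | theta) the likelihood, and p(theta | y) the posterior; the difficulty Klein describes concerns how p(theta) is to be specified when it is not known.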

It is clear that Klein, along with Dre`ze, Leamer, and some other econometricians, had a deep understanding of the decision theoretic approach to

Bayesian statistical inference that Ramsey, Savage, Friedman, Raiffa,
