Hansen's Research Reflections: Modeling to Make the Most of Data | Opportunity at the Intersection of Computation and Economics

A major proposal from 计量经济圈 (Econometrics Circle) appears at the end of this article:

Lars Peter Hansen, a co-recipient of the 2013 Nobel Prize in Economics, is a leading expert in dynamic economics and financial asset pricing; his research attention is now focused mainly on "uncertainty." These two research reflections bear directly on the rapidly developing field of econometrics, and 计量经济圈, as a subscription account devoted to the subject, is obliged to share them.

Part One: Modeling to Make the Most of Data
Economic modeling is often driven by discovery or development of new data sets. For many substantive applications, the data does not simply “speak for itself,” making it important to build structural models to interpret the evidence in meaningful ways.

For example, the availability of data from national income and product accounts was an early influence on formal macroeconomic modeling. As other evidence on the economic behavior of individuals and firms became available, builders of dynamic economic models incorporated microeconomic foundations in part to bring to bear a broader array of evidence on macroeconomic policy challenges. Similarly, econometricians built and applied new methods for panel data analysis to understand better the empirical implications of microeconomic data with statistical rigor.

As large cross-sections of financial market returns became easily accessible to researchers, asset pricing theorists built models that featured economically interpretable risk-return tradeoffs to give an economic interpretation to the observable patterns in financial market data. In all of these cases, data availability provoked new modeling efforts, and these efforts were crucial in bringing new evidence to bear on important policy-relevant economic questions.

Rapid growth in computing power and the expansion of electronic marketplaces and information sharing into all corners of life have vastly expanded the data available to individuals, enterprises, and policy makers. Important computational advances in data science have opened the door to analyses of massive new data sets, which potentially offer new insights into a variety of questions important in economic analysis. The richness of this new data provides flexible ways to make predictions about individual and market behavior, for example in the assessment of credit risk when taking out loans and in the implications for consumer goods markets, including housing.

These are topics explored at previous institute events, such as the Macro Financial Modeling Conference held on January 27–29, 2016. One example is the work of Matthew Gentzkow and Jesse Shapiro on the use of text as data, and a related conference on that topic. Another example is the construction and use of new measures of policy uncertainty developed by Scott R. Baker, Nicholas Bloom, and Steve Davis.

Just as statisticians have sought to provide rigor to the inferential methods used in data analysis, econometricians now face new challenges in enriching these modeling efforts beyond straightforward data description and prediction. While the data may be incredibly rich along some dimensions, many policy-relevant questions require the ability to transport this richness into other hypothetical settings, as is often required when we wish to know the likely responses to new policies or to changes in the underlying economic environment. This more subtle but substantively important form of prediction requires both economic and statistical modeling to fully exploit the richness of the data and the power of computational methods.

The door is wide open for important new advances in economic modeling well suited to truly learn from the new data. By organizing conferences like the September 23–24, 2016 event, the Becker Friedman Institute is nurturing a crucial next step: determining how best to integrate formal economic analysis to address key policy questions. We seek to foster communication among a variety of scholars from computer science, statistics, and economics in addressing new research challenges. This conference, organized by Stéphane Bonhomme, John Lafferty, and Thibaut Lamadon, will encourage the synergistic research efforts of computational statistics and econometrics.

— Lars Peter Hansen, Becker Friedman Institute Director

Part Two: Opportunity at the Intersection of Computation and Economics
Following the tradition of interdisciplinary collaboration will yield exciting research
In the 1940s and 1950s, the Cowles Commission, then at the University of Chicago, brought economic scholars together with eminent statisticians and applied mathematicians who pioneered exciting new lines of research in mathematically oriented economic theory and econometrics. Their intellectual probes, nurtured by cross-disciplinary interactions, had a profound impact on economic research over the following decades.

Today, as our computational power continues to expand, it opens the door to new and exciting approaches to research. Following in the tradition of the Cowles Commission, in the next few months the Becker Friedman Institute will be exploring how computation can nurture new approaches to economic research by bringing together computational experts and economists to engage in productive exchanges of ideas along two different fronts.

One area we are exploring is how computing power enhances the development of economic theory. For example, economists often use rationality hypotheses when building models. It is understood that this approach is at best an approximation of individuals' behavior and decision-making. This has led many researchers to explore alternative notions of bounded rationality in complex economic environments in which the approximation of full rationality is harder to defend. Among other things, models with bounded rationality impose limitations on the computational effort required for full optimization.

Meanwhile, advances in information technology have led to the emergence of new markets with new forms of exchange. Computational advances offer approaches that can approximate behavioral interactions in these new types of markets. Our 2015–16 Research Fellows, Ben Brooks and Mohammad Akbarpour, have organized a conference in August on the Frontiers of Economic Theory and Computer Science that will bring together economists and computer scientists to explore promising new research directions in this exciting area of endeavor.

On a related front, data science has brought together computational and statistical expertise to study so-called "machine learning" approaches to the analysis of large-scale data accumulating from everyday transactions in every area of our lives. The institute is probing the question of how to use such approaches in conjunction with economic models that allow us to study important policy questions. Comparing alternative policy options often requires that we engage in the analysis of counterfactuals. This requires that we extrapolate what we have learned from rich data to realms where data are sparser, using what we call structural economic models. In this vein, the analysis of new and rich data will lead to new economic models designed to address important policy questions. A conference I am organizing with my colleagues Stéphane Bonhomme, John Lafferty, and Thibaut Lamadon will bring together econometricians and statisticians to probe new opportunities for advancement in this exciting synergistic area of research.

While two conferences alone cannot hope to match the impact of the almost two decades of influential work that emerged from the Cowles Commission, they will help to encourage some exciting directions for synergistic research and innovation at the intersection of computation, statistics, and economic analysis.

— Lars Peter Hansen

Note: sourced from http://larspeterhansen.org/

An Important Proposal:

Yesterday we shared the video series Thirty Lectures on Time Series Analysis (《时间序列分析三十讲》), and we would like to explain why we gave members two options: (1) pay a fee of 1 yuan, or (2) exchange materials of your own. Our real preference was the second option: letting the econometrics materials held by members circulate within the group, so that everyone can access them at a very low cost in the future.

Unexpectedly, however, most members chose the first option. This keeps the group from pooling its resources internally, though we believe that private resources still need some value attached before people are willing to share them. Members who can provide econometrics resources, or who need them, can reach us through the contact information under the "问题交流" (Q&A) menu on the 计量经济圈 homepage, so that we can design a mechanism satisfactory to both sides to promote the flow of resources.

Reposted from blog.51cto.com/15057855/2683278