BIS Working Papers No 980

What does machine learning say about the drivers of inflation?

by Emanuel Kohlscheen

Monetary and Economic Department

November 2021 (this version February 2022)

JEL classification: E27, E30, E31, E37, E52, F41.

Keywords: expectations, forecast, inflation, machine learning, oil price, output gap, Phillips curve.

BIS Working Papers are written by members of the Monetary and Economic Department of the Bank for International Settlements, and from time to time by other economists, and are published by the Bank. The papers are on subjects of topical interest and are technical in character. The views expressed in them are those of their authors and not necessarily the views of the BIS.

This publication is available on the BIS website (www.bis.org).

© Bank for International Settlements 2021. All rights reserved. Brief excerpts may be reproduced or translated provided the source is stated.

ISSN 1020-0959 (print)
ISSN 1682-7678 (online)

What does machine learning say about the drivers of inflation?

E. Kohlscheen (1, 2)

Abstract

This paper examines the drivers of CPI inflation through the lens of a simple, but computationally intensive machine learning technique. More specifically, it predicts inflation across 20 advanced countries between 2000 and 2021, relying on 1,000 regression trees that are constructed based on six key macroeconomic variables. This agnostic, purely data-driven method delivers (relatively) good outcome prediction performance. Out-of-sample root mean square errors (RMSE) systematically beat even the in-sample benchmark econometric models, with a 28% RMSE reduction relative to a naïve AR(1) model and an 8% RMSE reduction relative to OLS. Overall, the results highlight the role of expectations for inflation outcomes in advanced economies, even though their importance appears to have declined somewhat during the last 10 years.
1 Bank for International Settlements. Centralbahnplatz 2, 4051 Basel, Switzerland. E-mail address: emanuel.kohlscheen@bis.org.
2 I am grateful to Deniz Igan and Daniel Rees for providing useful comments. The views expressed in this paper are those of the author and do not necessarily reflect those of the Bank for International Settlements.

1. Introduction

What are the key drivers of inflation? And what role do expectations play in the inflation process? These have been long-standing questions in macroeconomics, particularly given their high relevance to economic policy making. Indeed, the paper that is often credited with having started the rational expectations revolution (Muth (1961)) was concerned with exactly these questions. The current study attempts to shed some fresh light on these core macroeconomic questions. It does so through the lens of a flexible, non-parametric, data-driven method. Specifically, it applies the well-established random forest approach (Breiman et al (1984), Breiman (2001)) to disentangle the drivers of inflation since 2000 across 20 advanced economies. Beyond comparing explanatory performance with traditional econometric benchmarks, as far as possible it tries to interpret the economic reasons behind the (relative) success of the technique in explaining recent consumer price inflation. Overall, the analysis attests to the relatively strong performance of the random forest model in predicting contemporaneous and future headline and core CPI inflation, even when only a small standard set of macroeconomic indicators is used. In fact, the out-of-sample root mean square error (RMSE) of the machine learning (ML) model beats even the in-sample performance of standard OLS using the same set of explanatory variables ("features"), which are firmly grounded in economic theory.
This suggests that non-linearities play an important role in explaining inflation. Overall, expectations emerge as the most important predictor of CPI inflation, followed by past inflation. That said, the importance of expectations has declined during the last 10 years. During this period, the partial effects that are teased out from the random forest model point to a flattening of the effects of expectations when these are above 2%. Throughout, exchange rate variations are found to add relatively little value in predicting inflation outcomes.

Relation to the literature. The paper builds on a growing literature that applies machine learning (ML) to economics. Kleinberg et al (2015) discuss the advantages and caveats of applying ML techniques to economic prediction problems. They argue that ML provides a disciplined non-parametric way to predict economic outcomes. Mullainathan and Spiess (2017) offer an example of how regression trees can be used to better predict house prices. They conclude their review by stating that "machine learning provides a powerful tool to hear, more clearly than ever, what the data have to say". As such, it can be a useful complement to more traditional model-based methods.3

Chakraborty and Joseph (2017) compare the inflation prediction performance of 10 econometric and machine learning models for the United Kingdom. They find that, post-GFC, random forests provide the best prediction performance among stand-alone models in the testing sample.4 More recently, Medeiros et al (2021) compare different ML methods' performance in predicting inflation in the United States. These authors also conclude that random forests dominate all other methods.

3 Earlier, Varian (2014) provided an example of regression trees for predicting mortgage approvals.
These findings confirm more general ones by Fernandez-Delgado et al (2014), who compared the performance of 179 classifier models across 121 datasets and found (the relatively simple) random forest to be the top performer among all options. Coulombe (2021) provides an early attempt to combine the random forest methodology with a macroeconomic model.

The current paper contributes to deepening our understanding of the drivers of inflation, relying on a purely data-driven method. First, the results reassert the importance of expectations in the price formation process. This emerges from the relative inflation predictor importances and from eliciting the partial effects of expectations on inflation outcomes. In this sense, it strengthens the case for current inflation targeting frameworks. Second, it highlights the importance of non-linearities, for instance for the effect of the output gap on inflation, as well as for how expectations translate into price pressures. Third, it finds that oil price movements and global PPI inflation are also important drivers of CPI inflation in advanced economies, indicating a global dimension of inflation, particularly after 2010. In this respect, the paper is also related to the large literature that examines the role of global drivers of inflation.5 Interestingly, a recent paper by Kamber and Wong (2020) finds that while global factors do play a substantial role in explaining the inflation gap, they generally do not explain the inflation trend in advanced economies, suggesting that policies play a key role.6

Outline. The article proceeds as follows. Section 2 explains the methodology and the dataset that was used to predict current and future inflation. Section 3 presents the baseline results. Section 4 explores how the drivers of inflation have been changing over time. Section 5 examines the performance of random forests in forecasting inflation 6 and 12 months ahead. Section 6 presents several robustness checks.
The paper concludes by suggesting avenues for further research.

4 Page 56, Table 8.
5 See e.g. Borio and Filardo (2007), Monacelli and Sala (2009), Ciccarelli and Mojon (2010), Neely and Rapach (2011), Mumtaz and Surico (2012), Gillitzer and McCarthy (2019) and Forbes (2019), among others.
6 Note that, as in Jasova et al (2019) and Forbes (2019), in this paper policies are captured indirectly through their effects on expectations, past inflation, output gaps and exchange rates.

2. Modelling Headline Inflation with Regression Trees

2.1. Data and feature selection

Given that most central banks calibrate their monetary policy based on targets for headline CPI inflation, this study assesses the drivers of this indicator for a broad set of advanced economies. More specifically, it compares the performance of standard regression techniques with that of a widely established machine learning technique (random forests) in predicting contemporaneous and future quarterly seasonally adjusted CPI inflation in 20 advanced countries. The countries included in the study are all those with a population of at least one million and current GDP per capita above $25,000. They are Austria, Belgium, Canada, Czechia, Denmark, Estonia, Finland, France, Germany, Ireland, Italy, the Netherlands, Norway, Portugal, Slovenia, Spain, Sweden, Switzerland, the United Kingdom and the United States. The time span of the analysed data goes from 2000 to mid-2021. As potential explanatory variables, key factors that have been found to be of importance in the theoretical and applied literature are included. Broadly speaking, the list of factors is fairly similar to that used in recent papers by Forbes (2019) and Jasova et al (2019), among others.
They include six variables: lagged inflation, to capture the persistence of the inflation process; 12-month ahead inflation expectations, as surveyed on a monthly basis by Consensus Economics from a representative group of banks;7 the output gap, which was computed after applying a one-sided HP filter to the real GDP series;8 the cumulative percentage variation of the oil price (Brent) over a year; the similar variation of the BIS nominal effective exchange rate for each country; and the average PPI inflation measure in the three major economies (the United States, the euro area and China). The latter captures factory gate inflation, and is likely to reflect the impact of input costs. Together with oil prices, it also captures an international dimension of CPI inflation.9 10 Note that by limiting the set of explanatory variables ("features") to those that are firmly grounded in the literature, the advice of Chakraborty and Joseph (2017) to limit complexity from the outset is followed.

The average annualized headline inflation in the panel is 1.87%, with a standard deviation of 3.20%. For reference, average core inflation is 1.59% (standard deviation of 1.61%). 12-month ahead headline inflation expectations average 1.80%, with a standard deviation of 0.89%.11 Pooled OLS based on the six explanatory variables is able to explain 39% of the variation in contemporaneous quarter headline inflation outcomes (see Table A1 in the Appendix).

7 Carroll (2003) finds that the dynamics of expectations are well captured by a model in which households' expectations derive from news reports about professional forecasters' expectations.
8 Contrary to the two-sided version, the one-sided version of the filter does not rely on information after the quarter in question.
9 Note that labour market slack is included indirectly through the output gap.
10 Data are sourced from the BIS, Consensus Economics, the OECD and Bloomberg.
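The one-sided HP filter used for the output gap (footnote 8) can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the filter is re-run on an expanding window and only the final trend value is kept, so no information after each quarter enters the gap. The smoothing parameter λ = 1600 (standard for quarterly data) and the 12-quarter burn-in are assumptions.

```python
import numpy as np

def hp_trend(y, lamb=1600.0):
    """Two-sided HP trend: solve (I + lamb * D'D) tau = y, where D is
    the (T-2) x T second-difference matrix of the penalty term."""
    T = len(y)
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(T) + lamb * (D.T @ D), y)

def one_sided_hp_gap(gdp, lamb=1600.0, min_obs=12):
    """One-sided variant: re-estimate the trend on an expanding window
    and keep only its last value, so only past data enter each gap."""
    log_y = np.log(np.asarray(gdp, dtype=float))
    gap = np.full(len(log_y), np.nan)
    for t in range(min_obs, len(log_y) + 1):
        trend = hp_trend(log_y[:t], lamb)
        gap[t - 1] = 100.0 * (log_y[t - 1] - trend[-1])  # gap in per cent
    return gap
```

On a series growing at a perfectly constant rate the log series is linear, the HP trend reproduces it exactly, and the computed gap is zero, which is a quick sanity check for the implementation.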
All six variables correlate strongly with inflation, with robust t-statistics that vary between 2.0 (for the output gap) and 42.1 (for inflation expectations). The F-statistic of the model is 523.0 (p-value < 0.001).

2.2. Growing the Random Forests

CPI inflation is predicted by means of regression trees and random forests, as outlined in Breiman et al (1984) and Breiman (2001). The main advantage of these methods is that they are able to accommodate non-linearities, as well as to capture potentially complex interactions between the explanatory variables. Essentially, inflation prediction is treated as a classification problem.12 The regression tree algorithm mechanically grows trees based on successive splits of the panel according to an explanatory variable (the feature) and an associated threshold level that minimizes the mean of squared residuals after the split. This is done again and again for each new node until a pre-defined tree depth is reached (the "stopping criterion"). The algorithm then takes the average of the target variable over all observations in a given final node as a prediction, and compares it with the actual values of the variable of interest for each observation. To introduce randomness, a large ensemble of trees is built, based on randomly selected subsamples. For training each tree, the algorithm keeps one third of the sample out-of-bag for later testing and grows regression trees based on the remaining observations (the "training sample").
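The split rule just described — scan each feature and candidate threshold, and keep the pair that minimizes the sum of squared residuals across the two resulting child nodes — can be sketched as follows. The function name and the brute-force scan over observed values are illustrative choices, not the paper's implementation:

```python
import numpy as np

def best_split(X, y):
    """Return the (feature, threshold, SSE) triple that minimizes the
    total sum of squared residuals of the two child nodes, as in CART
    regression trees. X is (n, k); y is the target vector."""
    best_j, best_thr, best_sse = None, None, np.inf
    for j in range(X.shape[1]):
        # Candidate thresholds: every observed value except the largest
        # (splitting at the largest would leave the right node empty).
        for thr in np.unique(X[:, j])[:-1]:
            left, right = y[X[:, j] <= thr], y[X[:, j] > thr]
            sse = ((left - left.mean()) ** 2).sum() \
                + ((right - right.mean()) ** 2).sum()
            if sse < best_sse:
                best_j, best_thr, best_sse = j, thr, sse
    return best_j, best_thr, best_sse
```

Growing a tree then amounts to applying this rule recursively to each child node until the stopping criterion (here, the minimum parent-node size p) is reached, with each terminal node predicting the mean of its observations.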
Instead of relying on only one tree, outcome predictions are then based on the average values of the outcome variable over the terminal nodes of a large number of trees, that is, the random forest.13 The main advantage of using a large number of trees is that the variance of the predictions declines and overfitting is minimized (Friedman et al (2009), Chakraborty and Joseph (2017)).14 Figure 1 shows that, for the current application, the overall MSE declines very rapidly as the number of trees grows. Most gains in accuracy are already evident after the threshold of 10 trees is passed.

11 These were computed based on a properly weighted average of current and next year forecasts, that is, with geometric weights that are based on the number of the next 12 months that fall within the current year and the following year.
12 Note that the analysis first looks at contemporaneous prediction, i.e. model fit in the econometric sense. Later we analyse actual forecasts.
13 This is known as bagging.
14 Mentch and Zhou (2020) show that "the additional randomness injected into individual trees serves as a form of implicit regularization, making random forests an ideal model in low signal to noise (SNR) settings."

2.3. Measuring Performance

Table 1 shows model RMSEs for inflation, as well as RMSE ratios between the regression tree method and traditional econometric benchmarks, for forests of 100 and 1,000 trees. It does so for different pre-specified tree depths. The parameter that controls the depth of the grown trees (pruning) is the minimum number of observations per parent node (denoted p). A lower minimum parent size of a splitting node means that deeper trees are grown. Growing trees too deep generally implies overfitting, which may come at the cost of reduced flexibility and out-of-sample performance. Shallower trees can be more robust for use with new incoming data.
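A minimal sketch of this setup — an ensemble of 1,000 trees, with depth controlled via the minimum parent-node size p, and roughly one third of observations left out-of-bag per tree by the standard bootstrap — using scikit-learn. The data below are synthetic stand-ins for the six macroeconomic features, not the paper's panel, and all coefficients are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic panel: six stand-in features, a partly non-linear target.
rng = np.random.default_rng(0)
n = 1500
X = rng.normal(size=(n, 6))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1]
     + 0.2 * np.maximum(X[:, 2], 0.0)   # a deliberate non-linearity
     + 0.1 * rng.normal(size=n))        # noise

forest = RandomForestRegressor(
    n_estimators=1000,      # the ensemble of 1,000 trees
    min_samples_split=10,   # the pruning parameter p
    oob_score=True,         # evaluate on each tree's out-of-bag third
    random_state=0,
).fit(X, y)

print(round(forest.oob_score_, 2))                # out-of-bag R^2
print(int(forest.feature_importances_.argmax()))  # dominant feature
```

The `feature_importances_` attribute corresponds to the predictor importances discussed later in the paper, and the out-of-bag score plays the role of the out-of-sample RMSE comparisons in Table 1.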
The asymptotic limit to gains in predictive performance that is apparent from Figure 1 is again reflected in the fact that RMSEs are only marginally lower when an ensemble of 1,000 trees is used instead of 100 trees (0.9% for p=10, out of sample). For the inflation panel in question, varying
