
Partisan Bias in Fed Inflation Forecasts?

Following on from my previous post about US Federal Reserve inflation forecast errors, I decided to put together a descriptive graph to see if there might be a partisan bias to these forecast errors. Also, given all of the work in political economy on political business cycles, I wanted to see if forecast errors changed around elections. (See the previous post for what I mean by Fed inflation forecast error.)

So, I have two questions:

  1. Have Fed inflation forecast errors been different during Democratic and Republican presidencies?
  2. Are Fed inflation forecast errors different for election periods and non-election periods?

To answer these questions, I simply made a graph using the same inflation forecast error data as before, but arranged in terms of quarters before a US presidential election (quarters with elections are coded 0). I then coloured the inflation errors by the sitting president's party. Finally, I used R's ggplot2 loess function to summarise and compare the errors made during Democratic and Republican presidencies.

Question 1: The Fed did tend to overestimate inflation during Democratic presidencies and underestimate it during Republican presidencies (an Error/Actual score of 0 means that the forecasters perfectly predicted actual inflation). Admittedly we have a pretty small sample of Democratic presidencies (only Carter and Clinton), but it is striking how all of the big underestimates were during Republican presidencies and almost all of the big overestimates were when Democrats had power.

Maybe Federal Reserve staff anticipate, to an incorrect degree, that Democratic presidents will pursue expansionary policies, and vice versa.

Question 2: It is not as clear that forecasts systematically differ between election periods and non-election periods, though the spread of the errors across parties does shrink very close to the election. I wonder why this might be.

More to come . . .

The R code to reproduce the plot is:
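(The original code chunk did not survive in this copy of the post. What follows is a minimal sketch of how such a plot could be built with ggplot2, assuming a hypothetical data frame `fcast` with columns `qtr_to_election`, `error_prop`, and `party`; none of these names appear in the original, and the actual code may have differed.)

```r
# Hypothetical reconstruction -- the original code block was lost.
# Assumed data frame `fcast` with columns:
#   qtr_to_election: quarters before the next presidential election (0 = election quarter)
#   error_prop:      forecast Error/Actual (0 = a perfect forecast)
#   party:           sitting president's party ("Democrat" or "Republican")
library(ggplot2)

ggplot(fcast, aes(x = qtr_to_election, y = error_prop, colour = party)) +
  geom_point() +
  # loess smoother fit separately for each party, with uncertainty ribbons
  geom_smooth(method = "loess", se = TRUE) +
  # count down to the election, which sits at 0 on the right
  scale_x_reverse() +
  scale_colour_manual(values = c("Democrat" = "blue", "Republican" = "red")) +
  # reference line at 0: forecasters perfectly predicted actual inflation
  geom_hline(yintercept = 0, linetype = "dotted") +
  labs(x = "Quarters to Election", y = "Error/Actual")
```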


