
European Parliament Candidates Have a Unique Opportunity to Advocate for Banking Union Transparency and Accountability

This is reposted from the original on the Hertie School of Governance European Elections blog.

Discussion of the issues at stake in the European Parliament elections has skirted the substance for quite some time now. Karlheinz Reif and Hermann Schmitt famously described European elections as "second-order elections", in that they are secondary to national elections. A few weeks ago on this blog, Andrea Römmele and Yann Lorenz argued that the current election cycle has been characterised by personality politics between candidates vying for the Commission presidency rather than by substantive issues.

However, the election campaigns could be an important opportunity for the public to express their views on and even learn more about one of the defining changes to the European Union since the introduction of the Euro: the European Banking Union.

Much of the framework for the Banking Union has been established in the past year after intense debate between the EU institutions. A key component of the Union is that in November 2014, the European Central Bank (ECB) will become the primary regulator for about 130 of the Euro area's largest banks and will have the power to become the main supervisor of any other bank, should it deem this necessary to ensure "high standards".

A perennial complaint made against the EU is that it lacks transparency and accountability. While there are many causes of this (not least of which is poor media coverage of EU policy-making), the ECB's activities in the Banking Union are certainly less than transparent under the rules currently set out. As Prof. Mark Hallerberg and I document in a recent Bruegel Policy Note, financial regulatory transparency in Europe, and especially in the Banking Union, is severely lacking. Unlike in another large banking union, the United States, where detailed supervisory data is released every quarter, the ECB does not plan to regularly release any data on the individual banks it supervises.

This makes it difficult for citizens, especially informed watchdog groups, to independently evaluate the ECB’s supervisory effectiveness before it is too late, i.e. before there is another crisis.

The European Parliament has been somewhat successful in improving the transparency and accountability (paywall) of the ECB's future supervisory activities. Unlike in the original proposal, the Parliament now has the power to scrutinise the ECB's supervisory activities. It will nonetheless be constrained by strict confidentiality rules in its ability to freely access information and to publish the information it does find.

In our paper, we also show that a lack of supervisory transparency is not exclusive to EU-level supervisors: the member state regulators, who will still directly oversee most banks, are in general similarly opaque. We found that only 11 of the 28 member states (five of them in the Eurozone) regularly release any supervisory data. Member state reporting of even basic aggregate supervisory data to the European Banking Authority is also very inconsistent.

European Parliamentarians could use the increased attention they receive during the election period to improve public awareness of the important role they have played in making new EU institutions more transparent and accountable. Perhaps, after the election, they could even draw on the popular support built during the campaign to secure stronger oversight capabilities and to improve financial supervisory transparency in the European Banking Union.
