Dropbox & R Data

I'm always looking for ways to download data from the internet into R. Though I prefer to host and access plain-text data sets (CSV is my personal favourite) from GitHub (see my short paper on the topic), sometimes it's convenient to get data stored on Dropbox.

There has been a change in the way Dropbox URLs work, and I just added some functionality to the repmis R package. So I thought I'd write a quick post on how to download data directly from Dropbox into R.

The download method depends on whether or not your plain-text data is in a Dropbox Public folder.

Dropbox Public Folder

Dropbox is trying to do away with its public folders; new users need to actively create a Public folder. Regardless, sometimes you may want to download data from one. It used to be that files in Public folders were accessible through non-secure (http) URLs. These were easy to download into R: just use the read.table command with the URL as the file name. Dropbox recently changed Public links to secure (https) URLs, which cannot be accessed with read.table.

Instead you can use the source_data command from repmis:

FinURL <- ""  # the secure (https) URL of the file in your Public folder

# Download data
FinRegulatorData <- repmis::source_data(FinURL,
                             sep = ",",
                             header = TRUE)
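If you would rather not pull in an extra package, a secure Public link can usually be turned into a direct-download URL by hand: Dropbox serves the raw file when the preview host is swapped for dl.dropboxusercontent.com (and any ?dl=0 suffix from the share dialogue is dropped). A minimal sketch, assuming a standard https://www.dropbox.com/s/... share link; the helper name dropbox_direct is mine, not part of any package:

```r
# Rewrite a Dropbox share link so it points at the raw file.
# Assumes the standard https://www.dropbox.com/s/<key>/<file> form.
dropbox_direct <- function(url) {
  # Swap the preview host for the direct-download host
  direct <- sub("www.dropbox.com", "dl.dropboxusercontent.com", url, fixed = TRUE)
  # Drop any ?dl=0 suffix left over from the share dialogue
  sub("?dl=0", "", direct, fixed = TRUE)
}

dropbox_direct("https://www.dropbox.com/s/exh4iobbm2p5p1v/fin_research_note.csv?dl=0")
# "https://dl.dropboxusercontent.com/s/exh4iobbm2p5p1v/fin_research_note.csv"
```

The rewritten URL can then be passed to read.csv() in an https-aware R session.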

Non-Public Dropbox Folders

Getting data from a non-Public folder into R was trickier. When you click a Dropbox file's Share Link button you are given a secure URL, but not one for the file itself. The Dropbox webpage it points to is filled with lots of other Dropbox information. I used to think that accessing a plain-text data file embedded in one of these pages would require some tricky web scraping. Luckily, today I ran across this blog post by Kay Cichini.

With some modifications I was easily able to create a function that downloads data from non-Public Dropbox folders. The result is the source_DropboxData command in the most recent version of repmis (v0.2.4). All you need to know is the name of the file you want to download and its Dropbox key. You can find both in the URL of the webpage that appears when you click Share Link. Here is an example:

https://www.dropbox.com/s/exh4iobbm2p5p1v/fin_research_note.csv

The file name is at the very end (fin_research_note.csv) and the key is the string of letters and numbers in the middle (exh4iobbm2p5p1v). Now we have all of the information we need for source_DropboxData:
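Those two pieces can also be pulled out of a share link programmatically. A small sketch using only base R string functions; the helper name parse_dropbox_link is mine, and the URL is reconstructed from the file name and key quoted above:

```r
# Extract the Dropbox key and file name from a share-link URL.
# Assumes the https://www.dropbox.com/s/<key>/<file> layout.
parse_dropbox_link <- function(url) {
  parts <- strsplit(url, "/", fixed = TRUE)[[1]]
  n <- length(parts)
  list(key = parts[n - 1], file = parts[n])
}

parse_dropbox_link("https://www.dropbox.com/s/exh4iobbm2p5p1v/fin_research_note.csv")
# $key is "exh4iobbm2p5p1v"; $file is "fin_research_note.csv"
```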

FinDataFull <- repmis::source_DropboxData("fin_research_note.csv",
                                  key = "exh4iobbm2p5p1v",
                                  sep = ",",
                                  header = TRUE)
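Under the hood, the approach from Kay Cichini's post amounts to building a direct-download URL from the key and file name, then fetching that plain-text file over SSL. A rough, simplified sketch (the real repmis function handles more options; the helper name dropbox_data_url is mine):

```r
# Build the direct-download URL that serves the raw file,
# given the file name and Dropbox key from the share link.
dropbox_data_url <- function(file, key) {
  paste0("https://dl.dropbox.com/s/", key, "/", file)
}

url <- dropbox_data_url("fin_research_note.csv", "exh4iobbm2p5p1v")
# read.csv(url) would then fetch the data in an https-aware R session
```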


K. Ram said…
You should also try authenticating with Dropbox directly using my R package, rDrop. Feedback welcome.
Unknown said…
Yeah, I only just now saw your rDrop package (I added a shout out to it in the repmis README). I'm definitely going to start using it.

Correct me if I'm wrong, but can you use rDrop to access data if you don't have the user's credentials?
Bob Muenchen said…
Thanks for the helpful post! If you find that rDrop does the same two examples (i.e. without authentication), it would make a great followup article.

Unknown said…
This seems to no longer be supported, unfortunately.
