Super easy heatmaps of postcodes

Whatever I want to do, there are always intrepid explorers who’ve been there and blogged it, so the satisfaction of my long-held desire to get to know more about how Nottinghamshire Healthcare’s services are spread geographically has been wonderfully expedited by these amazing blog posts.

Special thanks, of course, go to David Kahle and Hadley Wickham, progenitors of the mighty ggmap package, and also to the fine folk at GeoNames, who freely distribute postcode data from around the world in .csv format.

With the thanks out of the way, there’s almost no work for me to do at all, and I’ve produced this lovely heatmap with absolutely minimal coding. I can’t tell you what it represents, I’m afraid, because I haven’t cleared the data for release, and actually it doesn’t represent anything particularly interesting at the moment. I need to do some preparation of the data, but I naturally did this bit first because it’s more fun.


library(ggmap)

myUni = mydata[!duplicated(mydata$ClientID), ] # dataframe with one row per individual

mywhere = merge(myUni, mycodes, by.x = "ClientHomePostcode",
                by.y = "Postcode", all = FALSE) # merge with the postcode lat/long data

### Plot!

map.center = geocode("Nottingham, UK") # centre the map on Nottingham

myMap = qmap(c(lon = map.center$lon, lat = map.center$lat),
             source = "google", zoom = 10) # download the base map from Google

myMap + stat_bin2d(bins = 80, aes(x = Long, y = Lat), alpha = .6, data = mywhere) +
  scale_fill_gradient(low = "blue", high = "red") # plot with a bit of transparency

Note finally that you can use Google’s Maps API to get latitudes and longitudes straight from postcode data (via the geocode() function), but you are limited to 2,500 queries per day. I had many more postcodes than that, so I needed to download the postcode data instead.
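For completeness, here’s a minimal sketch of how the mycodes dataframe used in the merge above might be built from one of the GeoNames postcode dumps. The file name and column layout are assumptions on my part, so check them against the file you actually download; the merge only needs the postcode, latitude and longitude.

# Sketch only: "GB_full.txt" and the column names are assumptions --
# check them against the GeoNames file you download
mycodes = read.table("GB_full.txt", sep = "\t", header = FALSE, quote = "",
                     comment.char = "", stringsAsFactors = FALSE,
                     col.names = c("Country", "Postcode", "Place",
                                   "Admin1", "Code1", "Admin2", "Code2",
                                   "Admin3", "Code3", "Lat", "Long", "Accuracy"))

mycodes = mycodes[, c("Postcode", "Lat", "Long")] # all the merge above needs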

Easy tables for publication

I’m writing a journal article today, and it’s the familiar problem of how to get all of the beautiful tables and graphs out of LaTeX and into Word (using R, of course).

It’s surprisingly easy to get a table from R into Word (or OpenOffice, or…). Just export the table to a text file, delimited by tabs:

write.table(mytable1, "Summary.txt", sep = "\t")

Then simply copy the contents of the text file into the document, select “Convert text to table”, make sure tab is set as the delimiter (which it probably already is), and voilà.
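If the row names or quote marks get in the way when you paste (they usually do for me), the standard write.table() arguments will drop them; mytable1 is just whatever table you’ve already built in R:

write.table(mytable1, "Summary.txt", sep = "\t",
            quote = FALSE, row.names = FALSE) # no row names or quotes, cleaner paste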

You can be even sneakier than that, though, without too much effort.

I’ve been pasting together the familiar “mean (sd)” format used in tables quite easily like so:

varNames = c("H.Tot", "C.tot", "R.tot", "HCRtot")

sapply(varNames, function(m)
  rbind(paste(round(tapply(mydata[, m], mydata[, "Dir"], mean, na.rm = TRUE), 2),
              " (",
              round(tapply(mydata[, m], mydata[, "Dir"], sd, na.rm = TRUE), 2),
              ")", sep = ""))) # one "mean (sd)" string per level of Dir, per variable

Somehow coming up with generic solutions for transferring tables is a lot less depressing than sitting typing them out or manually tidying up SPSS output.

Apologies for the poor code formatting, incidentally; I can’t seem to work out how to format it correctly on the blog. My version in RStudio is a lot more readable.

EDIT:

I’m glad I did this, because I’ve just been told that I included a variable I shouldn’t have, so I have to redo all the tables! Looks like I was right about generic solutions and typing!