A lot of data analysts will find themselves doing repetitive manual tasks on a data set every day/week/month in Excel, then copying and pasting their updated pivot tables and charts into Word or PowerPoint reports for their stakeholders. If this sounds like your job description, you may want to consider switching to a programming language like R. Writing scripts will allow you to automate the majority of these processes, from importing your data all the way through to emailing your boss the final report.
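As a minimal sketch of what that automation can look like, the snippet below reproduces a typical pivot-table step with dplyr. The `sales` data frame and its columns are hypothetical, standing in for a CSV you would normally import with something like `readr::read_csv()`:

```r
library(dplyr)

# Hypothetical sample data, standing in for an imported CSV
sales <- data.frame(
  region  = c("North", "South", "North", "South"),
  revenue = c(100, 200, 150, 250)
)

# The pivot-table step: total revenue per region
summary_tbl <- sales %>%
  group_by(region) %>%
  summarise(total_revenue = sum(revenue))
```

From here, the same script can write a chart to disk with `ggplot2::ggsave()` and, with a package such as blastula, even send the finished report by email.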
Background
Last June I did a blog post about building dot-density maps in R using UK Census data. It has proven to be a fairly popular post, most likely due to the maps looking like something you’re more likely to see in the Tate Modern… Not only do these maps look beautiful, but there is a strong argument that they do a better job of representing data than the more common choropleth approach of filling geographical regions with one colour based on one variable.
I Know What You Vizzed Last Summer
tl;dr click the image to launch the app
I guess I’m of that school of thought: I don’t mind my mobile tracking me. As long as I don’t go breaking the law, or tweeting an ill-advised truth about a politician, it’s unlikely that anyone will be typing the Google Location of my front room into a cruise missile control unit. But I confess a stirring of nerves when I decided to map my own Google Location data using R’s Leaflet package.
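The basic Leaflet recipe is short. A hedged sketch, assuming a hypothetical data frame of latitude/longitude points parsed from a location-history export (the coordinates and column names here are invented for illustration):

```r
library(leaflet)

# Hypothetical location points, standing in for a parsed
# Google Takeout location-history export
points <- data.frame(
  lat = c(51.5007, 51.5055),
  lng = c(-0.1246, -0.0754)
)

# Build the interactive map: a base tile layer plus one
# circle marker per recorded location
m <- leaflet(points) %>%
  addTiles() %>%
  addCircleMarkers(lng = ~lng, lat = ~lat, radius = 4)
```

Printing `m` in an interactive session opens the map in the viewer; in a Shiny app the same object is handed to `renderLeaflet()`.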
Synopsis
Market Research is great at compiling the right data, but not so good at making it easy to use. This isn’t about “storytelling”. It’s about the data itself, and clarity on what delivering it actually means: getting the data out of silos like tabs and SPSS files, where it cannot adequately be mined, and out into the big wide world as Tidy Data.
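To make the Tidy Data point concrete, here is a minimal sketch using tidyr. The `tab` crosstab is hypothetical, standing in for the kind of wide table a survey tab deck produces; `pivot_longer()` reshapes it so that every row is one observation:

```r
library(tidyr)

# Hypothetical crosstab, as it might come out of a survey tab:
# one row per answer, one column per demographic break
tab <- data.frame(
  answer = c("Yes", "No"),
  male   = c(40, 60),
  female = c(55, 45)
)

# Reshape to tidy long format: one row per answer/group/value
tidy_tab <- pivot_longer(tab, cols = c(male, female),
                         names_to = "group", values_to = "pct")
```

In this long form the data can be filtered, joined, and plotted directly, rather than being locked into the layout of the original tab.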
A Dive Into Some Global Flooding Data
I always like to keep a lookout for interesting open data sets. One great resource for such things is Jeremy Singer-Vine’s Data is Plural weekly newsletter, which brings together a collection of “useful, curious datasets” for us all to enjoy and wrangle with. One that cropped up last week was The Dartmouth Flood Observatory’s Global Archive of Large Flood Events.