Category Archives: OpenLayers

The Great British Bike to Work

Cross-posted from the DataShine blog.


Here’s a little visualisation created with the DataShine platform. It’s the DataShine Commute map, adapted to show only cycle flows, but all of them at once – so you don’t need to click on a location to see the flow lines. I’ve also added colour to show direction. Flows in both directions will “cancel out” the colour, so you’ll see grey.

London sees a characteristic flow into the centre, while other cities, like Oxford, Cambridge, York and Hull, see flows throughout the city. Some cities are notable for their student flows, typically to campus from the nearby town, such as Lancaster and Norwich. The map doesn’t show intra-zone (i.e. short distance) flows, or ones where there are fewer than 25 cyclists (13 in Scotland, as the zone populations there are half those in England/Wales) travelling between each origin/destination zone pair – approximately 0.15% of the combined population.

Visit the Great British Bike to Work Map.


named


named is a little website that I have recently co-written as part of an ongoing ESRC-funded project on UK surnames that we are conducting here at UCL Department of Geography. I put together the website and adapted for the UK some code, created by researchers in the School of Computing, Informatics & Decision Systems Engineering at Arizona State University (ASU) in the USA, that generates heatmaps showing regions where a surname is unusually popular.

The website is deliberately designed to be simple to use and “stripped down” – all you do is enter your surname and the website maps where in the UK there is an unusually high number of people with that surname living. There is also an option to enter an additional surname (for example, a maiden name for yourself or your partner, or the name of a friend) – and, by combining heatmaps of both names, we try and draw out where we think you might have met each other, or grown up together.

The Research

Of most interest to us is how well the technique works with pairs of surnames. It is already well known (for example, J A Cheshire, P A Longley (2012) Identifying Spatial Concentrations of Surnames, International Journal of GIS 26(2) pp. 309-325) that most traditional UK surname distributions remain surprisingly unchanged over many years – internal migration in the UK is much lower than is commonly perceived. One of the research questions in the underlying project is to see whether this extends to marriages and other pairings too. So we encourage you to use this mode and help us understand and evaluate paired surname distributions and patterns.

The site is also a useful information-gathering tool – we are only in the early stages of evaluating the validity and accuracy of the method – we know it works well for certain regional UK names that are neither too common nor too rare, at least. We ask for optional quick feedback following a search, so we can evaluate whether the result feels right for you. So far, with the website having been operational for around a week, nearly 10% of users are giving feedback, and around half of those suggest that it is a good result for them. If it doesn’t highlight where you live now, it might be showing your ancestral home or another region that you have a historical link to. Or it may be showing complete rubbish – but let us know either way!


Try it out for yourself – visit here and see what it says for your surname. The site should be quite quick: a search takes up to 10 seconds for a name that has not been searched for before, but is much faster when retrieving results for names that have already been looked up.

How it Works

The system creates a probabilistic kernel density estimate (KDE), based on surname distributions (by postcode) from an old electoral roll. It finds the relative frequency/density of the surname compared with the general population in the area. So, in most cases, it will highlight an area in the countryside – a sparse population, but maybe with a cluster of people with that surname. As such, it will only rarely highlight London and the other major cities of the UK, except for exceptionally urban-centric surnames, typically of foreign origin. The method is not perfect – the “bandwidth” is fixed, which means that neighbouring cities and other population fluctuations can cause false-positive results. However, we have seen enough “good” results that we think the technique has some validity, given the structure of the UK’s surnames.
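
For the curious, here is a minimal sketch of the relative-frequency KDE idea in Javascript – the data structures and function names are hypothetical illustrations, not the actual named/ASU code:

// Postcode centroids: [{x, y, nameCount, totalCount}, ...], coordinates in metres.
function gaussianKernel(distKm, bandwidthKm) {
  var u = distKm / bandwidthKm;
  return Math.exp(-0.5 * u * u);
}

// Relative density of the surname at grid point (cx, cy): a kernel-smoothed
// count of name-holders divided by a smoothed count of the whole population.
function relativeDensity(points, cx, cy, bandwidthKm) {
  var nameSum = 0, popSum = 0;
  points.forEach(function (p) {
    var dKm = Math.sqrt((p.x - cx) * (p.x - cx) + (p.y - cy) * (p.y - cy)) / 1000;
    var w = gaussianKernel(dKm, bandwidthKm); // fixed bandwidth, as noted above
    nameSum += w * p.nameCount;
    popSum += w * p.totalCount;
  });
  return popSum > 0 ? nameSum / popSum : 0;
}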


Design

From a design perspective, I wanted to build a website that looks different from the usual “full screen slippy maps” that I have designed for a lot of my research projects. Maps are normally rectangular, so I played with some CSS and a nice JQuery visual effects library to create a circular map instead, which appears to be on the back of an information disc.

Data Quality and Privacy

The map is deliberately small and low on detail, because a more detailed map would imply a higher level of precision for the underlying names data than can actually be justified. The underlying dataset has issues but is considered sufficient for this purpose, as long as the spatial resolution is low. Additionally, for rare names, where a result may be driven by only a small number of people with that name (particularly in rural places), we don’t want to be flagging individual villages or houses. The data’s just not good enough for that for many names (it may well be good for some), and a high-resolution map might imply we are mapping exact data over someone’s house, raising possible privacy issues – we are not, although by coincidence the result may still happen to line up with a very local feature.

It should give an indication of the general area where your name is unusually popular relative to the local population there (N.B. not quite the same as where your name is popular in absolute terms), but I would be wary of the quality of the result if it identifies a particular small town or exact location.

[A little update, as one user worried that it was just showing a population heatmap. This would only happen for names which have a relatively higher population in the denser areas of the UK. Typically, older common foreign-origin names will most likely show this, as migrants to the UK have traditionally settled in cities first. The only name I’ve seen it for so far (I haven’t tested many) is Zhang, which is a very common surname. Compare Zhang (left) with an overall population heatmap (using the same buffer and KDE generation as the rest of the maps):

[Maps: the Zhang surname heatmap (left) and the overall population heatmap.]

Some newer foreign origin names show an even more pronounced urban tendency, such as Begum and Mohammed.]

More…

Try named now, or if you are interested in surnames across the world, see the older WorldNames website, and for comparisons between 1881 and 1998 distributions in the UK, see GB Names.

If named shows “No Data” and you have entered a real surname, this may be because there are only very few of you in the UK – and in this case, I show the “No Data” graphic to protect your privacy. Otherwise I’d be mapping your house – or at least, your local neighbourhood.


Tube Line Closure Map


[Updated] The Tube Line Closure Map accesses Transport for London’s REST API for line disruption information (both live and planned) and uses the information there to animate a geographical vector map of the network, showing closed sections as flashing dotted lines, with solid lines for unaffected parts. The idea is similar to TfL’s official disruption map; however, the official one just colours in the disrupted links while greying out the working lines (or vice versa), which I think is less intuitive. My solution preserves the familiar line colours for both working and closed sections.

My inspiration was the New York City MTA’s Weekender disruptions map, because this also blinks things to alert the viewer to problems – in its case, it blinks stations which are specially closed. Conversely, the MTA’s Weekender map is actually a Beck-style (or, more accurately, Vignelli) schematic, whereas the regular MTA map is pseudo-geographical. I’ve gone the other way, my idea being that a geographical map, rather than an abstract schematic, allows people to see walking routes and other alternatives if their regular line is closed.

Technical details: I extended my OpenStreetMap-based network map, breaking it up so that every link between stations is treated separately; this allows the links to be referenced using the official station codes. Sequences of codes are supplied by the TfL API to indicate closed sections, and by comparing these sequences with the link codes, I can create a map that dynamically changes its look with the supplied data. The disruption data is pulled in via JQuery AJAX, and OpenLayers 3 is used to restyle the lines appropriately.
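
The matching step might look something like this – a sketch only, with hypothetical station codes and property names rather than the site’s actual code:

// A closed section arrives from the API as an ordered sequence of station
// codes, e.g. ["codeA", "codeB", "codeC"]. Each pair of adjacent codes
// identifies one inter-station link on the map.
function closedLinkKeys(closedSequence) {
  var keys = {};
  for (var i = 0; i < closedSequence.length - 1; i++) {
    // Store both directions so the lookup works whichever way a link is keyed.
    keys[closedSequence[i] + '-' + closedSequence[i + 1]] = true;
    keys[closedSequence[i + 1] + '-' + closedSequence[i]] = true;
  }
  return keys;
}

// Flag each link feature; an OpenLayers 3 style function can then draw
// flagged links as flashing dotted lines, and everything else as solid.
function flagClosedLinks(linkFeatures, closedKeys) {
  linkFeatures.forEach(function (f) {
    f.set('closed', !!closedKeys[f.get('linkKey')]); // ol.Feature get/set
  });
}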

Unfortunately TfL’s feed doesn’t include station closure information – or rather, it does, but it is either not granular enough (i.e. it’s not on a line-by-line basis) or incorrect (Tufnell Park is shown only as “Part Closed” in the API, whereas it is actually fully closed for the next few months) – so initially I was only showing line closures, not station closures. (I am now showing these, by doing a free-text search in the description field for “is closed” and “be closed” – see the snippet below.) One other interesting benefit of the map is that it allows me to see that there are quite a lot of mistakes in TfL’s own feed – generally, the map shows sections as open that TfL is reporting as closed. There are also a few quirks, e.g. the Waterloo & City Line is always shown as disrupted on Sundays (it has no Sunday service anyway) whereas the “Rominster” Line in the far eastern part of the network, which also has no Sunday service, is always shown as available. [Update – another quirk is that the Goblin Line closure is not included, so I’ve had to add that in manually.]
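
The free-text fallback is simple enough to sketch (the field name here is an assumption based on the description above):

// Treat a station as fully closed if TfL's free-text description says so.
function stationFullyClosed(disruption) {
  var d = (disruption.description || '').toLowerCase();
  return d.indexOf('is closed') !== -1 || d.indexOf('be closed') !== -1;
}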

Try it out

General Election Maps for 2015


When I first moved to UCL CASA back in 2010, the first online map I created from scratch was one showing swings in the general election that year. So it seemed fitting to update the old code with the data from the 2015 general election, which took place last week. You can see the resulting maps here – use the dropdowns to switch between headline swing, winner, second places, turnout % variations, majorities, political colour, individual party votes and X-to-Y swings.


My style of Javascript coding back in 2010 was – not great. I didn’t use JQuery or even AJAX, choosing instead to dump the results of the database query straight into the Javascript as the page was loaded in, using PHP. I was also using OpenLayers 2, which required some rather elaborate and unintuitive coding to get the colours/shapes working. My custom background map was also rather ugly looking. You can see what the map looked like in this old blog post. I did a partial tidy-up in 2013 (rounded corners, yay!) but kept the grey background and slightly overbearing UI.

Now, in 2015, I’ve taken the chance to use the attractive HERE Maps background map, with some opacity and tinting, and tidied up the UI so it takes up much less of the screen. However, I decided to leave the code as OpenLayers 2 and not AJAX-ify the data load, as it does work pretty well “as is”. The constituency boundaries are now overlaid as a simplified GeoJSON (OL 2 doesn’t handle TopoJSON). For my time map, I was using OL 3 and TopoJSON. Ideally I would combine the two…

Link to the interactive maps.


Election Time!


I’ve created an Election 2015 Time Map which maps the estimated declaration times that the Press Association have published. It follows on from a similar map of the Scottish independence referendum.

Each constituency is represented by a circle positioned roughly at its centre (using a longest-interior-vertex centroid determined in QGIS). The area of the circle represents the size of the electorate, with the Isle of Wight being noticeably larger, and the Western Isles and Orkney/Shetland constituencies smaller, than average. The main colours show the expected declaration time (red for around midnight, shading to green for the slow-to-declare constituencies late in the morning), while the edge colour shows the 2010 winning party. Mouse over a constituency circle for more data. Grey lines show the constituency boundaries, created from ONS data (for Great Britain) and by aggregating NISRA small area and lookup data (for Northern Ireland). You can download the resulting TopoJSON file, which is simplified using MapShaper. The data is Crown Copyright ONS/NISRA.
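
Sizing circles by electorate means scaling the radius by the square root of the value, since it is the area that should be proportional. A sketch, with illustrative numbers only (the map’s actual scaling constants may differ):

// Area ∝ electorate, so radius ∝ sqrt(electorate).
function circleRadiusPx(electorate, avgElectorate, baseRadiusPx) {
  return baseRadiusPx * Math.sqrt(electorate / avgElectorate);
}

// e.g. a constituency with twice the average electorate gets a circle
// sqrt(2) ≈ 1.41 times as wide, and so twice the area.
var radius = circleRadiusPx(110000, 70000, 8); // ≈ 10px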

As the election approaches, and after the results come in, I hope to modify and update the map with other constituency-level data, such as the result itself.

GeoComputation: A Practical Primer

GeoComputation: A Practical Primer, edited by Profs Chris Brunsdon and Alex Singleton, has just been published by SAGE.

The book acts both as a reference guide to the field and as a guide to help you get to know aspects of it. Each chapter includes a worked example with step-by-step instructions.

Each chapter has a different author, and includes topics such as spatial data visualisation with R, agent-based modelling, kernel density estimation, spatial interaction models and the Python Spatial Analysis library, PySAL. With 18 chapters, the book runs to over 300 pages and so has the appropriate depth to cover a diverse, active and fast-evolving field.

I wrote a chapter in the book, on open source GIS. I focused particularly on QGIS, as well as mentioning PostGIS, Leaflet, OpenLayers (2) and other parts of the modern open source “geostack”. My worked example describes how to build a map, in QGIS, of London’s railway “not-spots” – places which are further than a mile from a railway station, using open data map files, mainly from the Ordnance Survey. With the guide, you can create a map like the one below:

[Map: London’s railway “not-spots”, built in QGIS.]

That little spot on its own in central-ish London, by the way, is part of Burgess Park, near Peckham.

The book has only just been published, and I was able to slip in brand new screenshots (and slightly updated instructions) just before publication, as QGIS 2.6 came out late last year. So the book is right up to date, and now is a great time to get your copy!

It’s available now in paperback on Amazon: Geocomputation: A Practical Primer.

The first part of my chapter:

[Image: the opening page of the chapter.]

OpenLayers 3 and DataShine


OpenLayers is a powerful web mapping API that many of my websites use to display full-page “slippy” maps. DataShine: Census has been upgraded to use OpenLayers 3. Previously it was powered by OpenLayers 2, so it doesn’t sound like a major change, but OL3 is a major rewrite and as such it was quite an effort to migrate to it. I’ve run into issues with OL3 before, many of which have since been resolved by the library authors or myself. I was a bit grumbly in that earlier blogpost for which I apologise! Now that I have fought through, the clouds have lifted.

Here are some notes on the upgrade including details on a couple of major new features afforded by the update.

New Features

Drag-and-drop shapes

One of the nicest new features of OL3 is drag-and-dropping of KMLs, GeoJSONs and other geo-data files onto the map (simple example). This adds the features and pans/zooms the map to the appropriate area. It is likely most useful for showing political/administrative boundaries, allowing for easier visual comparisons. For example, download and drag this file onto DataShine to see the GLA boundary appear. New buttons at the bottom allow for removal or opacity variation of the overlaid files. If the added features include a “name” tag, this appears on the key on the left as you “mouse over” them. I modified the simple example to keep track of files added in this way, in an ol.layer.Group that is initially empty when added to the map during initialisation.
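
The wiring looks roughly like this – a sketch based on the standard OL3 drag-and-drop interaction, with overlayGroup standing in for the initially empty ol.layer.Group (the names are mine, not the production DataShine code):

var dragAndDrop = new ol.interaction.DragAndDrop({
  formatConstructors: [ol.format.GeoJSON, ol.format.KML, ol.format.TopoJSON]
});

dragAndDrop.on('addfeatures', function (event) {
  // Wrap the dropped features in a new vector layer and track it in the group.
  var source = new ol.source.Vector({ features: event.features });
  overlayGroup.getLayers().push(new ol.layer.Vector({ source: source }));
  // Pan/zoom to the dropped data (fitExtent was renamed fit() in later OL3 releases).
  olMap.getView().fitExtent(source.getExtent(), olMap.getSize());
});

olMap.addInteraction(dragAndDrop);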

Nice printing

Another key feature of OL3 that I was keen to make use of is much better looking printing of the map. With the updated library, this required only a few tweaks to CSS. Choosing the “background colours” option when printing is recommended. Printing also hides a couple of the panels you see on the website.

Nice zooming

OL3 also has much smoother zooming, and nicer looking controls. Try moving the slider on the bottom right up and down, to see the smooth zooming effect. The scale control also changes smoothly. Finally, data attributes and credits are now contained in an expandable control on the bottom left.

A bonus update, unrelated to OL3, is that I’ve recreated the placename labels with the same font as the DataShine UI, Cabin Condensed. The previous font I was using was a bit ugly.

Major reworkings to move from OL2 to OL3

UTF Grids

With OpenLayers 3.1, which was released in December 2014, a major missing feature was added back in – support for UTF Grid tiles of metadata. I use this to display the census information about the current area as you “mouse over” it. The new implementation wasn’t quite the same as the old one, though, and I’ve had to use a few tricks to get it working. First of all, the ol.source.TileUTFGrid that your UTF ol.layer.Tile uses expects a TileJSON file. This was a new format that I hadn’t come across before. It also, as far as I can tell, insists on requesting the file with a JSONP callback. The TileJSON file then contains another URL to the UTF Grid file, which OL3 also calls requiring a JSONP callback. I implemented both of these with PHP files that return the appropriate data (with appropriate filetype and compression headers), programmatically building “files” based on various parameters I’m sending through. The display procedure is also a little different, with a new ol.source.TileUTFGrid.forDataAtCoordinateAndResolution function needing to be utilised.

In my map initialisation function:

layerUTFData = new ol.layer.Tile({});

var handleUTFData = function(coordinate)
{
  var viewResolution = olMap.getView().getResolution();
  layerUTFData.getSource().forDataAtCoordinateAndResolution(coordinate, viewResolution, showUTFData);
}

$(olMap.getViewport()).on('mousemove', function(evt) {
  var coordinate = olMap.getEventCoordinate(evt.originalEvent);
  handleUTFData(coordinate);
});

In my layer change function:

layerUTFData.setSource(new ol.source.TileUTFGrid({
  url: "http://datashine.org.uk/utf_tilejsonwrapper.php?json_name=" + jsonName
}));

(where jsonName is how I’ve encoded the current census data being shown.)

Elsewhere:

var callback = function(data) { [show the data in the UI] }

In utf_tilejsonwrapper.php:

<?php
header('Content-Type: application/json');
$callback = $_GET['callback'];
$json_name = $_GET['json_name'];
echo $callback . "(";
echo "
{ 'grids' : ['http://datashine.org.uk/utf_tilefilewrapper.php?x={x}&y={y}&z={z}&json_name=$json_name'],
'tilejson' : '2.1.0', 'scheme' : 'xyz', 'tiles' : [''], 'version' : '1.0.0' }";
echo ')';
?>

(tilejson and tiles are the two mandatory parts of a TileJSON file.)

In utf_tilefilewrapper.php:

<?php
header('Content-Type: application/json');
$callback = $_GET['callback'];
$z = $_GET['z'];
$y = $_GET['y'];
$x = $_GET['x'];
$json_name = $_GET['json_name'];
echo $callback . "(";
echo file_get_contents("http://[URL to my UTF files or creator service]/$json_name/$z/$x/$y.json");
echo ')';
?>

Permalinks

The other change that required careful coding was recreating the permalink functionality of OL2. The OL3 developers have stated that they consider permalinks to be the responsibility of the application (e.g. DataShine) rather than the mapping API, and, to a large extent, I agree. However, OL2 created permalinks in a particular way, and it is useful for the OL3 ones to share that format, so that external custom links to DataShine continue to work correctly. To do this, I had to mimic the old “layers”, “zoom”, “lat” and “lon” parameters that OL2’s permalink updated, and again work in my custom “table”, “col” and “ramp” ones.

Various event listeners need to be added, and functions written, for when the URL needs to be updated. Note that the “zoom ended” event has changed its name/location – unlike moveend (end of a pan), which sits on your ol.Map, the old “zoomend” is now called change:resolution and sits on olMap.getView(). Incidentally, the appropriate mouseover event now lives on an OL3-created HTML element – olMap.getViewport() – and is mousemove.
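
Hooked up, that looks something like this (updatePermalink and handleMouseMove are hypothetical names for the functions concerned, the former being the hash-building function shown further down):

olMap.on('moveend', updatePermalink);                      // end of a pan
olMap.getView().on('change:resolution', updatePermalink);  // the old "zoomend"
$(olMap.getViewport()).on('mousemove', handleMouseMove);   // OL3-created element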

Using the permalink parameters (args):

if (args['layers']) {
  var layers = args['layers'];
  if (layers.substring(1, 2) == "F") {
    layerBuildMask.setVisible(false);
  }
  [etc...]
}
[& similarly for the other args]

On map initialisation:

args = []; //Created this global variable elsewhere.
var hash = window.location.hash;
if (hash.length > 0) {
  var elements = hash.split('&');
  elements[0] = elements[0].substring(1); /* Remove the # */
  for (var i = 0; i < elements.length; i++) {
    var pair = elements[i].split('=');
    args[pair[0]] = pair[1];
  }
}

Whenever something happens that means the URL needs an update, call a function that includes this:

var layerString = "B"; //My old "base layer"
layerBuildMask.getVisible() ? layerString += "T" : layerString += "F";
[etc...]
layerString += "T"; //The UTF data layer.
[...]
var centre = ol.proj.transform(olMap.getView().getCenter(), "EPSG:3857", "EPSG:4326");
window.location.hash = "table=" + tableval + "&col=" + colval + "&ramp=" + colourRamp + "&layers=" + layerString + "&zoom=" + olMap.getView().getZoom() + "&lon=" + centre[0].toFixed(4) + "&lat=" + centre[1].toFixed(4);
}

Issues Remaining

There remains a big performance drop-off in panning when using DataShine on mobile phones and other small-screen devices. I have put in a workaround "viewport" meta-tag in the HTML which halves the UI size, and this makes panning work on an iPhone 4/4S, viewed horizontally, but as soon as the display is a bit bigger (e.g. iPhone 5 viewed horizontally) performance drops off a cliff. It's not a gradual thing, but a sudden decrease in update-speed as you pan around, from a few per second, to one every few seconds.

Additional Notes

OpenLayers 3 is compatible with Proj4js version 2 only. Using this newer version requires a slightly different syntax when adding special projections. I use Proj4js to handle the Ordnance Survey GB projection (aka EPSG:27700), which is used for the postcode search, as I use a file derived from the Ordnance Survey's Code-Point Open product.
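
For reference, the Proj4js v2 registration looks like this – a sketch using the standard published EPSG:27700 definition string; with proj4 loaded globally, OL3 picks the projection up automatically:

proj4.defs('EPSG:27700',
  '+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 ' +
  '+x_0=400000 +y_0=-100000 +ellps=airy ' +
  '+towgs84=446.448,-125.157,542.06,0.1502,0.247,0.8421,-20.4894 ' +
  '+units=m +no_defs');

// An easting/northing pair from the Code-Point Open-derived file,
// reprojected to the map's Web Mercator coordinates:
var mapCoord = ol.proj.transform([530000, 180000], 'EPSG:27700', 'EPSG:3857');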

I had no problems with my existing JQuery/JQueryUI-based code, which powers much of the non-map part of the website, when doing the upgrade.

Remember to link in the new ol.css stylesheet, or controls will not display correctly. This was not needed for OL2.

OL3 is getting there. The biggest issue remains the sparsity of documentation available online – so I hope the above notes are helpful in the interim.


[Image: GeoJSON-format datafiles for tube lines and stations (both in blue), added onto a DataShine map of commuters (% by tube) in south London.]

OpenLayers 3


As a learning exercise, I have been trying to “migrate” my recent #indyref map from OpenLayers 2.13.1 to the very new version 3.0.0 of the popular mapping API. It seemed a good time to learn it, because the OpenLayers website now shows v3 as the default version for people to download and use. Much of my output in the last few years has been maps based on OpenLayers, so I have considerable interest in the new version. There are some production sites using OpenLayers 3 already – for example, the official Swiss map.

I use the term “migrate” in inverted commas because, really, OpenLayers 3 is pretty much a rewrite, with an altered object model, and accordingly it requires coding a new map from scratch rather than just changing a few lines. It has so far taken me four times as long to do the conversion as it did to create the original map, although that is an inevitable consequence of learning as I go along.

I’ll update this blogpost as I discover workarounds.

Shortcomings in v3 that I have come across so far:

  • No Permalink control. This is unfortunate, particularly as “anchor” style permalinks, which update as you move around the map, are very useful for visualisations like DataShine where people share specific views and places, and I can inject extra parameters in. The site linked above suggests this is a feature that should not be in the core mapping library, but instead an additional library can query/construct necessary parameters. Perhaps, but I think layer/zoom/lat/lon parameters are such a key part of a map (as opposed to other interactive content) that they still deserve to be treated specially.
  • The online documentation, particularly the apidoc, is very sparse in places. As mentioned above, there is also some mismatch between the functionality suggested in the online tutorials and what is actually available – another example being the use of “font” instead of “fontSize” and “fontStyle” for styles. This will improve, I am sure, and there is at least one book available on OpenLayers 3, but it’s still a little frustrating at this stage.
  • Label centering on the circle vectors is not as good as with OL 2. This is possibly due to antialiasing of the circle itself. You can see the labels “jump” slightly when comparing the two versions – see links below.
  • Much, much slower on my iPhone 4 (and also on a friend’s Android phone). This is not what I was expecting! This is the “killer” problem for me which means I’ve kept my map on OL 2 for now. Wrapping my vector layer in an Image layer is supposed to speed things up, but causes the layer not to display on my iPhone. Disabling the potentially expensive mousemove listener did not make a difference. Adding a viewport meta tag with width=device-width speeded things up a lot so that it was almost as fast as OL 2 (without the meta tag) but then I would need to rewrite my own UI for mobile – something I don’t need to do with the OL 2 version!
  • No support (yet) for UTFGrids. These are a form of vector tiles, for metadata rather than geographic features, which I use on the DataShine project.

Things which I like about the new version:

  • Smooth vector resizing/repositioning when zooming in/out on a computer. (N.B. This is only when using a Vector layer and a Vector source, rather than Image layer with an ImageVector source that itself uses a Vector source.)
  • Attribution is handled better, it looks nicer.
  • No need to have a 100% width/height on the map div any more.
  • Resolution-specific styling. I’ve used this to hide the labels when zoomed out beyond a certain amount.
  • Can finally specify (in a straightforward fashion) a minimum zoom level – see the view sketch after this list.
  • Point coordinates and extents/bounds are specified in a much simpler way.
  • On a more general note, the new syntax is more complete and feels less “hacky”. The developers have taken the opportunity to do it “right” and remove inconsistencies, misplaced functionality and other quirks from the old version. For example, separating out visual UI controls and interaction management controls into two separate classes.
  • Drag-and-drop addition of KML/GeoJSON vector features. Example (use this file as a test).
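
To illustrate a couple of the points above (plain-array centre coordinates and the minimum zoom level), a view specification now looks something like this – the values here are illustrative only:

var view = new ol.View({
  center: ol.proj.transform([-0.12, 51.51], 'EPSG:4326', 'EPSG:3857'),
  zoom: 10,
  minZoom: 6 // the straightforwardly specified minimum zoom level
});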

Some gotchas, which got me for a bit, but I was able to solve:

  • You need to link in a new ol.css stylesheet, not just the Javascript library, in order to get the default controls to display and position correctly.
  • Attribution information is attached to a source object now, not directly to the layer. A layer contains a source.
  • Attribute-based vector styling is a lot more complicated to specify. You create a function and pass it in as the layer’s style attribute; the function has to return the style wrapped in an array (see the sketch after this list) – this may be the closure syntax in Javascript that I have not come across before.
  • Hover/mouseover events are not handled directly by OpenLayers any more – but click events are, so the two event types require quite different setups.
  • Minor differences between the debug and regular versions of the library. The example I noticed is that the debug version allows ol.control.ScaleLineUnits.METRIC to be specified as an attribute for the ScaleLine control, but the non-debug version needs to use an explicit string “metric”.
  • No opacity control on individual styles – only on layers. Opacity on a style has to be specified as part of the colour, in RGBA format (where A is the alpha, i.e. opacity, you want), rather than as a separate attribute – this is contrary to the tutorials on the website – so I can’t simply set the circles’ fill to 80% opacity and the text to 100% in one place. Layer opacity can continue to be specified as a separate attribute.
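
Putting the last few points together, an attribute- and resolution-aware style function might look like this – a sketch with made-up attribute names and thresholds, not my production code:

var circleStyleFunction = function (feature, resolution) {
  // Attribute-based sizing; the style must come back wrapped in an array.
  return [new ol.style.Style({
    image: new ol.style.Circle({
      radius: feature.get('count') / 50,
      // Alpha baked into the colour (RGBA), not a separate opacity attribute.
      fill: new ol.style.Fill({ color: 'rgba(200, 50, 50, 0.8)' })
    }),
    // Resolution-specific styling: hide labels when zoomed out too far.
    text: resolution < 500 ? new ol.style.Text({
      text: feature.get('name'),
      fill: new ol.style.Fill({ color: '#000000' })
    }) : undefined
  })];
};

var circleLayer = new ol.layer.Vector({
  source: circleSource, // assumed to exist already
  style: circleStyleFunction
});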

My OpenLayers 3 version of the #indyref map is here – compare it with the OpenLayers 2 one. Note that, since first writing this blogpost, I’ve updated the OpenLayers 2 version to change its cartography further.

DataShine Travel to Work Flows


Today, the Office for National Statistics (ONS) have released the Travel to Work Flows based on the 2011 census. These are a giant origin-destination matrix of where people commute to work. There are various tables that have been released. I’ve chosen the Method of Travel to Work and visualised the flows, for England and Wales, on this interactive map. The map uses OpenLayers, with an OpenStreetMap background for context. Because we are showing the flows and places (MSOA population-weighted centroids) as vectors, a reasonably powerful computer with a large screen and a modern web browser is needed to view the map. The latest versions of Firefox, Safari or Chrome should be OK. Your mobile phone will likely not be so happy.

Blue lines represent flows coming in to a selected place, that people work in. Red lines show flows out from the selected location, to work elsewhere.

The map is part of the DataShine platform, an output of the BODMAS project led by Dr Cheshire, in which we take big, open datasets and analyse them. The data – both the travel to work flows and the population-weighted MSOA centroids – come from the ONS, table WU03EW.

View the interactive map here.


FOSS4G 2013 Conference


Well, that was good.

September this year was “Maptember”, with numerous conferences with a geographical flavour taking place in the East Midlands. The undoubted highlight for me was FOSS4G 2013, the annual conference for OSGeo, which travels around the world; this year it was conveniently in Nottingham, so I was able to make it along relatively easily. FOSS4G stands for Free and Open Source Software for Geospatial, and as such the conference is a good mix of open-source technology and geography.

As I will be spending some time this month writing a book chapter on open source GIS, the conference was an unmissable event for me, even though a clash with another conference (ECCS) abroad meant the logistics were tricky – in the end, a 6am wake-up call was necessitated, and lots of freshly ground coffee (a very big thumbs up to the conference for that – a first) helped me out.

Just over 800 people attended the conference and there were up to 9 parallel streams. With many talks sounding very interesting, it was often hard to pick a track to follow, not least as there was a 10-minute walk between the two main conference venues. I had brought my bike up from London, which helped.

Highlights of the conference for me were:

  • A keynote by Ben Hennig of Worldmapper fame on the need for the open source geospatial software community to remember the cartography – the gist being that just because you have the tools to map doesn’t mean you should jump straight in without thinking about the bigger picture.
  • Keynotes by the two top sponsors at the conference – the Ordnance Survey and the Met Office. Both sponsors knew who they were talking to, and pitched the technical level appropriately. At both organisations, the open source ecosystem is pushing in from the sides and slowly becoming a core asset. Both also have large open datasets ready for crunching in your open source GIS of choice.
  • QGIS 2. This was launched at the conference. I’ve always been a fan of this open source GIS in particular (there are others available, including the venerable GRASS, uDig etc), in no small part because of its excellent integration with PostGIS, the fact that it works well on the Mac, and that it is extendable and drivable with Python. Also, excitingly for the project in the longer term, the developer time and effort has ramped up recently – it’s reassuring to be using an open source application with a large and enthusiastic team behind it. Also – it’s not called Quantum anymore, although it’s going to take me a while to stop accidentally calling it that.
  • OpenLayers 3. The first beta of this was also launched at the conference. I have long been a fan of OpenLayers, having regarded it as a richer and more powerful web mapping API than the Google Maps API, and have used its vector styling capabilities extensively. However, it has somewhat had its lunch stolen by Leaflet, and by Google Maps continuously innovating, so it was due a rewrite – and OpenLayers 3 looks to be that rewrite!
  • PostGIS/PostgreSQL. There were a number of PostGIS talks, almost all of which were massively oversubscribed – I am not sure why they were in one of the smallest venues – one even got a repeat presentation later! PostGIS is another enormously impressive piece of open source technology, and the rapid-fire demonstration of what was new made me realise I really need to move forward and update my old version! (& do more cool stuff with it.)
  • The final talk before the closing session was by a tech person at ESRI. He had an awful lot to say in 20 minutes, and consequently overran, but had numerous interesting things to say on JavaScript geo libraries – such as TopoJSON, Node.js, JS Topology Suite, Shapefile.js and D3 – many of which he lamented hadn’t been covered much (or at all) at the conference. I agree, but the conference did have to pare down nearly 400 submissions to under 200 at the event. He did bash QGIS a bit, which didn’t go down very well, but to be fair some of the QGIS talks had previously bashed ESRI a lot, which wasn’t called for either… Good for ESRI for making the effort to come, even if (or indeed because) QGIS is rapidly becoming a serious competitor.
  • The conference food – it was excellent.
  • Catching up with a bunch of people in the community, not just the OSMers – e.g. Rollo (OS), Addy (Edina), Andy, Ben. Andy showed me some new OpenStreetMap renderings which use some advanced cartographic techniques in Mapnik and look great. Mapnik was another topic that I missed from the conference.
  • An evening tour of Nottingham by SK53 (actually just the leg from the curry house to Ye Olde Trip To Jerusalem, but we went an interesting way). SK53 has also written a detailed blog post based in part on a comment I made!
  • The CASA iPad Wall (which was the other reason I was there) was showing, Ken Burns style, the various submissions to the map competition. In the end, the wall pretty much ran itself, thanks to careful stewardship by the Ordnance Survey, who had requested it, and some high quality code that had been written for the display. Interestingly, Wired covered the conference, and focused on the iPad Wall, which really was quite a minor, albeit cool, part of the conference.
  • Winning a green glass globe paperweight for my submission to the aforementioned competition, namely the global version of my Bike Share Map – “Best Web Map”. This was completely unexpected, indeed I was already on a train back to London, having left just before the announcement, and found out through Twitter. “Singing” legend Gregory is, I hope, keeping careful stewardship of the globe and I will grab it in due course.

There’s a lot I didn’t get to see – Cartopy/Iris, more CartoDB, plus lots of interesting sounding papers presented on the integrated academic track.

This could have been the best conference I’ve ever been to. Ever. Well done to the organising team – I know they worked incredibly hard to deliver, but it was very definitely worth it.