AGI Scotland 2013 – New directions in Geo

The 2013 AGI Scotland event marked a slight change in direction for the AGI, this being the first “showcase” event that it has run. Six showcase events and the annual GeoCommunity event are scheduled across the year.

It was fitting that the first plenary speaker was from the Scottish Government. Mike Neilson is the Director of Digital and represents the top end of the digital restructuring that has occurred in the Scottish Government. Mike reinforced the importance of digital in governing a country and said that there was a push to make more public services available online. This would encourage the public to get online, but Mike was acutely aware of the danger that moving services online would exclude those who could not get online, perhaps due to financial constraints. Improving digital connectivity is important, as Scotland, especially Glasgow, currently lags behind the UK average, which impacts on the social and economic development of the country.

At a recent meeting of the Spatial Information Board, 6 priorities were agreed and these will form the focus of activities in the immediate future. These are:

  1. effective use of spatial data through INSPIRE
  2. data sharing and collaborative procurement
  3. building GIS capability and capacity
  4. embedding spatial data within the broader data agenda
  5. promoting awareness of the benefits of wider use of spatial data
  6. a mechanism for hosting spatial data

The restructuring of digital data teams in the government seems to make sense and looks to provide a sensible, hierarchical structure. However, the Scottish Government is looking for feedback and input from the GI community on what they see as being important and where they think digital data is going. To provide feedback you can contact Shonna or follow them on Twitter @digitalscots

The second plenary speaker was Anne Kemp of Atkins. Anne pointed to the changing role of the GI professional and urged us to step out of our insular groups and comfort zones and to interact with other groups who use spatial data. Anne strongly believes that Building Information Models (BIMs) are the future for many aspects of GIS. BIMs focus on the lifecycle of anything in the built environment, from planning through to operational management. Calculations suggest that effective use of BIMs can save 20% of the cost of construction and operation of new infrastructure. The use of BIM has been mandated by the government for England, and organisations such as the Environment Agency and Highways Agency are currently aligning themselves to meet the 2016 target. Interestingly, the Scottish Government has no similar mandate and seems to have no plan to introduce one. This raises interesting questions. Many large engineering companies and consultancies are GB-wide organisations and tend to operate to organisation-wide best practices, of which BIM is almost certainly going to be one. While much of BIM simply represents industry best practice, a mandate from central government, filtering down through local government, would help ensure best practice and potentially interoperability across infrastructure. Certainly the feeling from the floor was that if BIM was being adopted wholesale south of the border, and BIM management was seen as an exportable skill-set, it might be sensible to mandate it in Scotland as well. (cough trams, cough cough Scottish parliament, cough).

Next up was a double act from SEPA’s Dave Watson and Duncan Taylor who introduced Scotland’s Environment Web (http://www.environment.scotland.gov.uk/). Scotland’s Environment Web (SEWeb) brings together information on Scotland’s environment. It merges environmental data, information and reports from known and trusted sources so they can all be viewed in one place. SEWeb links to 30 WMS services, organised in themed groups. Dave and Duncan outlined the pros and cons of this approach.

Good:

  • Each organization is responsible for their own data
  • Reduces development time and maintenance
  • Maintains one version of the truth
  • No single point of failure

Bad:

  • Many points of failure, which are hard to track; it can be confusing for the user to know who to contact when there are problems

Ugly:

  • No standard look and feel to symbology and styles
  • Issues with data scales.

The current work represents Phase One. Phase Two will allow users to download data, and there is a business case to support forestry assessments. There is a long-term aim to add WFS capabilities to SEWeb.
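Because SEWeb simply points at its partners’ WMS endpoints, each organisation’s GetCapabilities document remains the authoritative list of what it serves. A minimal sketch of how a portal could harvest layer names from such a document (the XML fragment and layer names below are invented for illustration; real capabilities documents are far richer):

```python
import xml.etree.ElementTree as ET

def list_wms_layers(capabilities_xml):
    """Return the named layers advertised in a WMS 1.3.0
    GetCapabilities response."""
    ns = {"wms": "http://www.opengis.net/wms"}
    root = ET.fromstring(capabilities_xml)
    # Named (requestable) layers are <Layer> elements with a <Name> child.
    return [name.text
            for name in root.findall(".//wms:Layer/wms:Name", ns)]

# Illustrative fragment of a capabilities document.
SAMPLE = """<WMS_Capabilities xmlns="http://www.opengis.net/wms" version="1.3.0">
  <Capability>
    <Layer>
      <Title>Scotland's Environment</Title>
      <Layer><Name>bathing_waters</Name><Title>Bathing Waters</Title></Layer>
      <Layer><Name>river_quality</Name><Title>River Quality</Title></Layer>
    </Layer>
  </Capability>
</WMS_Capabilities>"""

print(list_wms_layers(SAMPLE))
```

In practice you would fetch each partner’s capabilities URL on a schedule, so the themed groups always reflect what the source organisations actually publish.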

One of the sites that feeds data into SEWeb is Scotland’s Soils, run by the James Hutton Institute. The soil map is based on the 1984 1:250,000 mapping and has 580 different mapping units, although the web map uses a simplified unit scheme. You can also access the data through an iPhone app, which gives you access to the soil structure at over 600 points across Scotland.

It is great to see this data being made available, but I can see the “ugly” issues mentioned by Dave and Duncan. Just move from Scotland’s Environment to Scotland’s Soils and the maps are very different. From a usability point of view, the map controls are completely different. We, as GIS professionals, have no problem knowing how to use either. They are intuitive to us, but we are experts. The average member of the public may well struggle. Imagine if they finally learn to use one map interface, then find that the map on the other site is completely different. Not ideal. The solution would be to develop a consistent interface and share the code. However, this would mean that all partners would have to agree to use the same libraries to build their web maps.

Other highlights from the event included Astun Technology’s Mike Saunt, who talked about “Doing something with this Open Stuff”. Mike showed how local government was making data available and, importantly, accessible. Councils could then share data feeds automatically, saving time and money. However, Mike highlighted some of the problems that arose when making data open, with examples where URLs did not resolve because of typos. More worrying was an organisation that was promoting its WMS but was also serving a WFS. The organisation was not promoting or linking to the WFS, and Mike suggested that it may not even be aware it was serving one. The solution is to ensure you understand what you are making available and why. If you don’t have the skills in house then get someone in to ensure everything is set up correctly. This is the kind of thing Astun do, and using services like theirs is a cost-effective way of working.
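One quick way to audit what you are actually serving is to ask the server itself: OGC services all answer a GetCapabilities request, so probing the same endpoint for WMS and WFS reveals any quietly running service. A small sketch (the endpoint URL is a placeholder, not a real service):

```python
from urllib.parse import urlencode

def capabilities_url(base_url, service):
    """Build a GetCapabilities request for an OGC endpoint.
    `service` is "WMS", "WFS", etc."""
    sep = "&" if "?" in base_url else "?"
    return base_url + sep + urlencode(
        {"service": service, "request": "GetCapabilities"})

# A server quietly running a WFS alongside its advertised WMS will
# answer both of these; fetching each URL (e.g. with urllib.request)
# and checking the response is a quick self-audit.
base = "https://example.org/geoserver/ows"  # hypothetical endpoint
for svc in ("WMS", "WFS"):
    print(capabilities_url(base, svc))
```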

Another talk that really shone came from Crispin Hoult of Link Node. Crispin introduced the concept of GIality, the use of geospatial data in augmented reality. This makes a lot of sense: you have a location-aware device with a host of sensors in it, and you can use it to visualise changes to a landscape while you are actually in that environment. This semi-immersive technology would certainly help the visualisation of developments like wind farms or new housing estates, and takes us beyond the “comfortable” use of overlays on paper maps.

The day finished up with Anne Kemp talking about the future of AGI Scotland and the strengthening community of GIS professionals in Scotland. There was mention of the Chartered Geographer (GIS) qualification, but it was pointed out that to become chartered you have to join the Royal Geographical Society (RGS), which does not have a remit in Scotland. Anne noted this and said she would look into it. She also mentioned other recognised professional qualifications, such as those from the Royal Institution of Chartered Surveyors (RICS), which offers a GIS-orientated qualification. It will be interesting to see how the CGeog (GIS) issue progresses this year. It does seem the best suited, but it is not perfect if you are living and working in Scotland.

Old Maps online workshop

Old Maps Online launched some months back and has been quite a hit. It is essentially a catalogue of old maps from library collections around the world. However, it is much more than just that. Old Maps Online allows users to make spatial searches for maps rather than having to rely on fields such as title, author and publication date. That is not the information that springs to mind when you want a map; place-names, regions and coordinates are more logical search terms.

As part of the Old Maps Online project, the team are putting on workshops, and I attended the Edinburgh event on Thursday 13th December. Edinburgh is steeped in mapping history and has one of the largest map library collections in the world. What’s more, a significant percentage of the National Library of Scotland’s collection has been scanned and made available online for free. The NLS have recently updated their catalogue interface, and it is now even easier to search and view maps. This is a huge resource and has sparked the interest of many researchers, who have utilised the old maps in their research.

The NLS site uses software from Klokan Technologies, a small Swiss company run by Petr Pridal. Petr has put a lot of effort into improving the searching and discovery of historic maps online, and it was for this contribution that he received the Bartholomew Globe. The Bartholomew Globe is an award from the Royal Scottish Geographical Society (RSGS), given in recognition of an exceptional contribution to cartography, mapping and related techniques in Scotland over a long period of years. The award was presented by Bruce Gittings, RSGS Vice Chairman.

Bartholomew’s Award 2012

The rest of the event focused on how historic maps, and historic geographical data in general, were being used in research. The flavour was, as expected given the location, generally Scottish, but the event also brought together a mix of academic researchers, commercial organisations and enthusiastic amateurs. Presentations that stood out included:

Alice Heywood (NLS), who described a project that got schoolchildren to develop content for mobile apps providing historic tours of their home towns. The pilot had been run in Elgin, and the children had produced some excellent narratives explaining their local historical sites and traditions. This kind of partnership between the NLS and schools seems like an excellent initiative. Perhaps it could link in with organisations such as VisitScotland to create apps for tourists visiting Scotland. More information about the Great Escapes project can be found on the NLS website.

Chris Speed (University of Edinburgh), who discussed the “blue dot” concept. The idea is that a mobile device represents your position as a blue dot, but by using historic maps and data you can allow the user to travel back through time at a particular location. Chris has had publicity with his Walking Through Time app, a project supported by JISC and EDINA, which allowed users to view historic maps of Edinburgh and embark on guided tours through history via their mobile phones. Chris wants to expand this to Glasgow, arguably a more dynamic environment which might reveal more startling change to users. I am not sure I entirely agree with Chris’ comments about connecting with individual objects, such as trees, which have persisted in green spaces while the built environment has changed around them. Trees on maps tend to be representative rather than an absolute record. However, if you are in a greenspace faced by a tree that is clearly over 100 years old, and trees are marked on the map, you can believe that the surveyor stood there and added it to the map all those years ago, and that tree is a link to the past environment.

There were two talks on mapping old transport links. David Simpson had tried to locate roads marked on Roy’s maps, Roy’s Roads. David found that bridges were quite reliable features of Roy’s maps, and by locating these on the ground and on modern maps you can then find the old road features. Many of these bridges are being lost, used only by farmers to access fields, but they represent an important part of Scotland’s history. Neil Ramsay (Scotways) was working to display old path networks on modern maps. Discovering old routes and posting them online is one way in which Scotways is encouraging people to get out and discover their local area. It was noted by a member of the audience that there was an apparent lack of paths connecting Glasgow and Edinburgh. Neil noted this and mentioned that it was certainly on the list of places to investigate; perhaps enthusiastic walkers could lend a hand. Just go to the NLS maps page and scan through the maps to see if a path exists in your local area that is missing from the modern OS maps, then get out and see if it exists on the ground. Take a look at Scotways’ excellent Heritage Paths site.

There was a very interesting presentation on using Historic maps as a tool for place-name research given by Jake King (Ainmean Aite na h-Alba). Jake had used the NLS historic maps to investigate the changes in spelling of Gaelic place-names through time.

Bomb Sight

Humphrey Southall and Andrew James (The National Archives) deputised for Kate (bomber) Jones (University of Portsmouth), who was unable to travel to the event. The Bomb Sight project maps the bombs that fell on London during the first phase of the Blitz. The project digitised and mapped records held by The National Archives, maps that were previously only available for the public to view in the reading room there. Users can view the location of bombs and display attribute data such as the date and bomb type and, in most cases, view “nearby memories” such as audio and pictures from the archive. Users can switch between the modern map and the 1940 bomb maps. These maps are a bit grainy and it would be great to see some crisper historic mapping in there. The Bomb Sight project also has a mobile app that gives users an augmented reality view of the Blitz. The project has done incredibly well and attracted a lot of publicity. This demonstrates the power of fusing historic maps with archived data that has never before been displayed digitally.

This really summed up the event. There is public interest in historic data, and making it accessible in a digital format is the key. Once those interested in historic data can get their hands on the digital data, they can turn it into useful information that others can enjoy, or even repurpose for other uses such as education and tourism.

 

How high’s the water, mama? 3 feet high and rising

A great line from De La Soul, but on a serious note: sea level is rising, and scientists don’t seem confident that the trend will slow or reverse any time soon. Over the past month there has been a constant stream of news articles reporting record low levels of sea ice at the end of the melt season, and ships short-cutting through the Arctic to reach the Pacific.

It would seem that the writing is on the wall for Arctic sea ice. The change in albedo alone will accelerate the decline in ice extent: the exposed ocean absorbs more heat from direct solar radiation than sea ice, which reflects much of it, and as the ocean warms, the sea ice decline accelerates further.

The accelerated melting of Arctic sea ice is significant, but at least we know that this will not result in a significant increase in global sea levels (beyond the thermal expansion of water as it warms). Those worried about global sea levels are really focused on the large ice sheets of Greenland and Antarctica which, if melted completely, would contribute around 7 m and 61 m respectively. 68 m of sea level rise. That is huge. But what would it look like? Well, there is a very neat interactive map that allows you to visualise future sea levels. Go on, scare yourself. Look how recognisable coastlines change as sea levels increase. You can wipe The Netherlands and Denmark off the map. Click the image below for an overview.
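Maps like this boil down to a simple threshold test against a digital elevation model: any cell at or below the new sea level is shown as flooded. A toy sketch of the idea (the elevation values are invented; real visualisations use gridded DEMs such as SRTM):

```python
# Toy digital elevation model in metres.
dem = [[ 2.0,  5.0, 12.0],
       [-1.0,  3.0, 70.0],
       [ 0.5,  8.0, 65.0]]

def flooded_cells(dem, rise_m):
    """Count cells at or below the new sea level."""
    return sum(1 for row in dem for z in row if z <= rise_m)

print(flooded_cells(dem, 7))    # Greenland alone (~7 m of rise)
print(flooded_cells(dem, 68))   # Greenland + Antarctica (~68 m)
```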

Link to Sea-level Map

Link to ESA article on Polar Ice Loss

 

Social Networks and Personal Digital Archives – it’s up to you.

Gary Gale, Director of Places at Nokia, was a keenly anticipated speaker in this year’s EEO-AGI seminar series. Gary has many years’ experience in the world of tech and GIS, having worked for Yahoo for many years before switching to Nokia in 2010. Gary is also a self-confessed geek who loves to dabble with tech. He is even happier if that tech includes some sort of location element.

Even on the evening of the presentation the title was still TBC, and it came as a bit of a surprise that the talk did not focus on maps or on what Nokia was up to. This may have been a little disappointing for some in the audience who were keen to discover how Nokia was going, or planning, to keep up in the battle of the maps between Google and, as referred to on the night, “that fruit based” handset maker.

Instead Gary delivered a talk on the spiralling generation of data from social applications and explored the issue of who actually owns the data. On this he was very clear: as soon as you hit submit you are essentially uploading a copy of your original data, be that a photo submitted to Flickr, Facebook or Instagram, a location uploaded to Foursquare or Facebook, or simply 140 characters taken from your head and submitted to Twitter. The copy of the data is then the property of the social network that you have uploaded it to. OK, you are the content creator, but you have passed the data to the network. Say it is a photo: you still own the rights to the original, which should be on your phone or computer, but you have licensed the data to Flickr under the T&Cs of their service agreement.

Customer or Product?

One way to look at it is in terms of products and customers. If you pay for a service then you are usually considered a customer. If you get something for free then are you still a customer? Probably not. In the world of social networks you are most likely the product. Social networks are not there just to let us connect with friends; they are there to make money, and that means selling data and information about their users to companies that want to sell things to us. Flickr is a trickier one to unravel as it uses a “freemium” model. You can use the service for free, but for a small fee ($25/year) you get additional features such as unlimited storage for your photos and better access to your stored data. Certainly organisations such as GoGeo’s home, EDINA, would be interested in a persistent record of activity on social networks such as Twitter, and would be willing to pay for a service that provides it.

But where is this all going? Well, if you are an active user of social networks then you have probably documented a good part of the last 2–5 years of your life in them. This record probably means something to you. We used to print out photographs taken on film cameras and pop them in an album that we would then show to friends when they came round for a cup of tea (how very British). Then we moved to digital cameras and printed out fewer images, but could still print the special ones or organise them in an album on our computers. Harder to pass around a group, but our friends would perhaps still pop round to see them. Now, according to the upload figures from both Flickr and Facebook, we seem to take more pictures on our mobile phones. Many of us then upload these to a social network, which is handy, as we seem to lose or break our phones with alarming frequency. And it is this that Gary is worried about. How can you get the data that you have submitted to a social network back when the worst happens and you lose your own, original copy? In addition, we would be naive to think that any of these social networks could be considered permanent. They have grown massively over the last 5 years, but for every one that reaches perceived maturity, many more fail. Gary asked how many of us remembered Gowalla. Only a couple of arms were raised. MySpace? Once the next big thing, now mainly the preserve of bands. So how can you ensure that you have a back-up of all the data and information you submit to these networks?

Most of the social networks have APIs which allow you to make data requests. However, to access many of them you need your user ID, and this is not always easy to find. Flickr, Facebook, Foursquare and Instagram all allow access to all submitted data through their APIs. Twitter does not. If you judge the ability to retrieve everything you contribute as minimum competence, then Twitter fails. Actually, Facebook partially fails too, as you cannot get everything out, just the recent (past 2–3 years of) content.
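As a concrete example of what “making a data request” looks like, here is a sketch of building a call to Flickr’s REST API to page through your own uploads. The method and parameter names follow Flickr’s published API as I understand it, but check the current documentation before relying on them; the key and user ID below are placeholders, and private photos additionally require authentication:

```python
from urllib.parse import urlencode

def flickr_photos_request(api_key, user_id, page=1):
    """URL for one page of a user's photos via the Flickr REST API."""
    params = {
        "method": "flickr.people.getPhotos",
        "api_key": api_key,          # placeholder - use your own key
        "user_id": user_id,          # placeholder user NSID
        "page": page,
        "format": "json",
        "nojsoncallback": 1,         # plain JSON, no JSONP wrapper
    }
    return "https://api.flickr.com/services/rest/?" + urlencode(params)

print(flickr_photos_request("MY_API_KEY", "12345678@N00"))
```

A back-up script would loop over `page` until the response reports no more photos, saving each batch locally.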

Personal Digital Archives (PDAs)

Fortunately there are some clever folk out there who are also concerned about this and have built applications that will archive your stuff for you and keep it safe. Importantly, this copy will be independent of the social network it was originally posted to. These services are being referred to as Personal Digital Archives (PDAs). Furthermore, the nice developer chaps and chappesses are distributing their code through GitHub, making it publicly available. Grab it and use it, or grab it and adapt it. Even better, grab it, adapt it and then share it. Examples of these PDAs include:

Parallel-ogram – a web app which archives your Instagram photos and likes to make it far easier to look back at them later. The app uses Instagram’s API to monitor all of your activity on the site, both private and public, and creates your own personal photo stream. The app isn’t hosted; you will need to install it on your own computer or a web server.

Parallel-flickr – a tool for backing up your Flickr photos and generating a database-backed website that honours the viewing permissions you’ve chosen on Flickr.

Privatesquare – privatesquare is a simple web application to record and manage a private database of foursquare check-ins.  Check-ins can be sent on to Foursquare (and again re-broadcast to Twitter, etc. or to your followers or just “off the grid”) but the important part is: They don’t have to be. You still have the record of where you were and when.

Twitter – Twitter is a bit different, no surprise there. What you can do is set up an RSS feed from your account and pass this into something like Google Reader, then archive the tweets from there. You can also do this from lists that you set up, allowing you to archive tweets from other users.
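The same RSS route can be automated with a few lines of code: fetch the feed, pull out each item’s date and text, and append them to a local archive. A minimal sketch using the standard library (the feed fragment is invented; in practice you would run this on a schedule against the RSS export of your account or list):

```python
import xml.etree.ElementTree as ET

def archive_items(rss_xml):
    """Extract (date, text) pairs from an RSS 2.0 feed, ready to be
    appended to a local archive file."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("pubDate"), item.findtext("title"))
            for item in root.iter("item")]

# Illustrative feed fragment.
SAMPLE = """<rss version="2.0"><channel>
  <item><title>Off to AGI Scotland</title>
        <pubDate>Tue, 06 Nov 2012 09:00:00 GMT</pubDate></item>
</channel></rss>"""

for date, text in archive_items(SAMPLE):
    print(date, "-", text)
```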

Big Brother?

So why might you want to do this? Well, some people are very interested in archiving their digital history as it provides memories and triggers for memories. But Gary highlighted how digital data is being used by government and how it can be used to prove that perhaps you were not somewhere at a particular time. Gary gave an example of a friend who was a suspect in a nasty assault in London. He had become a suspect because he happened to swipe into a tube station at about the right time late one evening. Fortunately he was able to use other digital information to show where he had been, what he had been doing and what he did next. This is not an isolated case, and it seems that the authorities treat digital data as fact and get you to prove or disprove it. Is this a change from innocent until proven guilty? Certainly some of the digital data would be dismissed as coincidental or just plain wrong, but the onus is on the individual to provide an alibi.

So, overall I think this was a very good talk that explored many interesting aspects of social media data and ownership. If you want to find out more about what Gary has been up to, he can be found on Twitter as @vicchi and has an active blog. The presentation and some text to go with it can be found here.

The next EEO-AGI seminar will be held on the 23rd November and will be on Humanising Archaeological GIS and will be presented by Prof. Gary Lock.

————————

Other cool things to look at:

Yourls – YOURLS is a small set of PHP scripts that will allow you to run your own URL shortening service (a la TinyURL). You can make it private or public, you can pick custom keyword URLs, it comes with its own API. You will love it.

Donottrack – Do Not Track is a technology and policy proposal that enables users to opt out of tracking by websites they do not visit, including analytics services, advertising networks, and social platforms. At present few of these third parties offer a reliable tracking opt out, and tools for blocking them are neither user-friendly nor comprehensive. Much like the popular Do Not Call registry, Do Not Track provides users with a single, simple, persistent choice to opt out of third-party web tracking.

 

As Hurricane Sandy batters the Eastern Seaboard

booom!

Hurricane Sandy – NASA image acquired October 28, 2012

Hurricane Sandy is currently doing its best, or worst, to disrupt Halloween on the Eastern Seaboard of the USA. Sandy is a vast storm, currently a swirling mass some 800 km across, trundling north up the coast. Importantly, the wind speeds are increasing, having risen from 120 km/h (75 mph) to over 140 km/h (85 mph). Officially Sandy is still a category 1 hurricane, and the US has seen much stronger, but it is the path of Sandy that is causing concern. While he has not intensified in the same way as hurricanes that feed off the warm waters of the Gulf of Mexico, he is moving towards densely populated states on the eastern seaboard. The US government has, rightly, taken the necessary precautions, ordering thousands from their homes and closing schools. The question is: where will Sandy make landfall?

For anyone interested in finding out more about Hurricane Sandy GoGeo has compiled some resources.  For those interested in hurricanes and tropical storms in general there are some really useful links as well. As always, if you find anything useful which is not on the list, just add a comment and I will append it to the list.

Hurricane Sandy:

 

  • NOAA National Hurricane Center – The main source for hurricane information in the USA, with the latest projected paths and forecasts updated regularly.
  • BBC Info page – great little collection of news, pictures and live updates from the Beeb
  • Wikipedia – as usual, a plethora of links and information which will grow during the event and provide an archive to material afterwards.
  • Google crisis map – a crisis map from Google with information about medical and evacuation centres in the areas likely to be affected. It also links to weather feed to display radar, cloud and predicted storm tracks.
  • ESRI Public Information map – shows similar data to the Google map but in addition seems to drag in tweets and YouTube clips where it can.
  • Wind Map – A nice infographic showing live wind vectors across America. (This is a live feed, so I grabbed a map from 0500 hrs on 30th October 2012, just after the storm hit land.)
  • ESA – data collected by ESA satellites including 3D structure of the hurricane from CloudSAT. Masses of images and data here.

Useful Hurricane Links

Some more links to useful hurricane and tropical storm resources.

  • ShareGeo – A subset of the data from NOAA’s Historical Hurricane Tracks service showing all storms since the year 2000.
  • Historical Hurricane Tracks – This is a service run by NOAA and it provides a host of information about historic hurricanes.  You can download the data they hold by year, by oceanic basin or by name.  Data is provided in a variety of data formats including CSV, NetCDF and shapefile.
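Because the tracks can be downloaded as CSV, it takes very little code to pull out the positions for a single storm. A sketch with the standard library (the column names and values below are invented for illustration; check the header of the file you actually download):

```python
import csv, io

# Hypothetical CSV layout; the real NOAA download may differ.
SAMPLE = """name,year,lat,lon,wind_kt
SANDY,2012,23.5,-75.4,90
SANDY,2012,25.1,-76.2,85
KATRINA,2005,26.1,-88.6,150
"""

def storm_track(rows, name):
    """Return (lat, lon) positions for one named storm, in file order."""
    return [(float(r["lat"]), float(r["lon"]))
            for r in rows if r["name"] == name]

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
print(storm_track(rows, "SANDY"))
```

From here the points could be written to a shapefile or plotted straight onto a web map.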

All major storms since 2000

River Data

flood

Flooding in Morpeth – 25/09/2012 – (Courtesy of Johndal – http://www.flickr.com/photos/johndal/)

It is summer here in the UK, or at least it is meant to be. Summer 2012 seems to have been a bit of a washout, and there have been a number of small to medium flood events across the country. This has prompted me to collate a list of data sources related to rainfall, river flow and flooding for the UK and beyond.

River Level

  • Scottish River levels from SEPA – SEPA provide real-time data for a number of major rivers. (data feed available)
  • England and Wales – The Environment Agency supplies information on river levels but no data feed at the current time.
  • USA Water data – The USGS supplies live feeds from a huge number of rivers across the US. You can drill down by state to see individual rivers. (Data feed available)
  • ECRINS – ECRINS is an acronym for the European Catchments and RIvers Network System. It is a fully connected system of watersheds, rivers, lakes, monitoring stations and dams, built from the JRC CCM2.1 and many other sources.
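The USGS feeds are served over a simple REST interface, so requesting data for a gauge is just a matter of building a URL. A sketch of an instantaneous-values request (the site number and parameter code follow USGS conventions as I understand them; treat the details as assumptions and check the service documentation):

```python
from urllib.parse import urlencode

def usgs_iv_url(site, parameter="00060"):
    """Request URL for the USGS instantaneous-values service.
    Parameter 00060 is discharge in cubic feet per second; site
    numbers can be found by browsing the USGS water data site."""
    params = {"format": "json", "sites": site, "parameterCd": parameter}
    return "https://waterservices.usgs.gov/nwis/iv/?" + urlencode(params)

# e.g. the Potomac River at Point of Rocks, MD (site 01638500)
print(usgs_iv_url("01638500"))
```

Fetching that URL (with urllib.request, say) returns JSON time-series values ready for plotting.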

In addition to river data, hydrology usually requires an understanding of the weather and the climate of an area. Below are a selection of resources which provide meteorological datasets.

Rainfall Data

  • Met Office historic station data – Exactly what it says on the tin: historic data from the Met Office.
  • Data.gov – Some Met Office historic data is also available through the data.gov portal: historic measurements from around 20 observation stations, updated each month.
  • floodwarn.co.uk – not an official EA/SEPA site; however, it contains links to many live met station data feeds. Links are accessible through a Google map window, which makes it easy to search through the data. Floodwarn also provides feeds about flood warnings and drought orders.

Flood Alerts

Flood warnings and alerts are issued by the Environment Agency for England and Wales and by SEPA for Scotland. These inform the public of an increased risk of flooding and should help them prepare. Alerts are separated into three categories:

  1. Flood Alerts – Flooding is possible. Be prepared.
  2. Flood Warnings – Flooding is expected. Immediate action required.
  3. Severe Flood Warnings – Severe flooding. Danger to life.

The following sites are quite interesting when you are looking at flood alerts.

  • Environment Agency – Shows the flood warnings that are currently in force.
  • SEPA – Shows the flood warnings that are currently in force in Scotland
  • Shoothill Flood – a nice map that shows the flood warnings, taking data feeds from the EA and SEPA APIs. Easier to see where the flood alerts are than scanning a big table. The thing I like about this site is that it shows the river reach that is affected. It would be great to see them add Scotland!
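Sites like Shoothill’s work by polling the agencies’ feeds and grouping warnings by severity. A sketch of that filtering step over a made-up JSON feed (the field names are assumptions for illustration, not the real EA/SEPA schema; the severity numbering follows the three categories listed above):

```python
import json

# Illustrative feed shape only; check the actual API documentation.
SAMPLE = json.dumps({"items": [
    {"area": "River Severn at Worcester", "severityLevel": 2},
    {"area": "Somerset Levels", "severityLevel": 1},
    {"area": "River Ouse at York", "severityLevel": 3},
]})

SEVERITY = {1: "Flood Alert", 2: "Flood Warning", 3: "Severe Flood Warning"}

def areas_at_level(feed_json, level):
    """List the areas currently under a given severity level."""
    data = json.loads(feed_json)
    return [w["area"] for w in data["items"] if w["severityLevel"] == level]

print(SEVERITY[3], "->", areas_at_level(SAMPLE, 3))
```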

Met Data

  • BADC – the British Atmospheric Data Centre has numerous free datasets available.  One such offering is the Met Office Integrated Data Archive System (MIDAS) Land Surface Stations.  This contains numerous different weather observations such as wind, air temp, rainfall and sunshine.
  • WorldClim – WorldClim is a set of global climate layers (climate grids) with a spatial resolution of about 1 square kilometre. Information about the methods used to generate the layers is available on the site.

ShareGeo

ShareGeo has a number of useful, free datasets for anyone wanting to do anything with rivers.

  • GB Rivers – this vector dataset shows the location of the main rivers in Great Britain
  • River Flow Gauges GB – shows the position of flow gauges on Great Britain’s rivers. These are used to monitor water levels and flow.
  • European River Data – Main rivers of Europe
  • EEA Hydrographic Data – ECRINS is a fully connected system of watersheds, rivers, lakes, monitoring stations, dams

2012 November Floods in England

This section concentrates on the flooding that occurred in England in late November 2012. A series of depressions tracked slowly over the south-west of England and over a month’s worth of rain fell in a day. It fell onto already saturated ground and caused widespread flooding.

  • Guardian reader photographs: some great pictures showing the flooding in England.
  • Met Office Radar: a video showing the rainfall radar from Saturday 24th through to Monday 26th November.  Really visualises the size of the storm and the intensity of the rainfall.
  • BBC Drone video – BBC video shot from a drone. Neat use of tech.
  • BBC Overview – BBC report from 26th Nov with lots of useful links to regional stories

Pearltrees – wisdom on the net?

The internet is full of information, not all of it is exactly what you would call useful.  However, sometimes you stumble upon something that is actually useful.  Today is one of those days.

Heard of Pearltrees?  Neither had I.  It seems to be a social media site that allows users to organise stuff that they have found on the internet.  This is nothing new, remember Delicious?  The neat thing with Pearltrees is that it looks like it is using Linked Data to tie “things” together, making it easier to explore resources and make links between subject areas.

So what, Linked Data, yawn.  Well, I was lucky enough to stumble upon a user by the name of drbazuk who has collated a host of useful GI links on Pearltrees.  I have had a quick look at some of them and have bookmarked his page to delve into tomorrow.  I suppose if I was sensible, I would be signing up and collating my own set of resources for GoGeo.  Before I do this, I suppose I should have a bit of a read about exactly what Pearltrees is all about.

OSGIS 2012 – Day 2

OSGeo_logo

The second day of OSGIS 2012 saw a full day of short paper presentations and a couple of workshops.  The day started with a keynote from Prof. David Martin, University of Southampton.  David is Director of the ESRC Census Programme and his talk looked at the data that will come out of the 2011 census. It also discussed the future of census programmes in the UK.  The take-away points from David’s talk included:

  • Lots of new fields such as “do you intend to remain in UK?”
  • 16th July 2012 – age/sex distributions at LAD level released
  • Nov 2012 – release at the OA level, which will be of interest to geographers
  • Spring 2013 – multivariate stats and some new stuff like time-dependent location data, which will be interesting for disaster management/response and answering questions such as “who is where/when?”
  • Access to longitudinal data and data about individuals will still be restricted to secure labs

David made some interesting points, including crediting the CDU in Manchester for making the census data far easier to access and analyse.  The data is in Excel format and has the crucial area codes which we geographers love.

He showed some analysis of work place zones which modifies the census units based on where people are during the day (work place) which should make disaster planning more efficient. It was also noted, light-heartedly, that this could be used to determine where to locate your burger van during the week.  

Next up was Ian James, Technical Architect at the Ordnance Survey. Ian’s presentation was on how the OS is embracing the open source opportunity.  The OS now uses open source solutions for internal activity and client-facing interfaces.  It took a while to convince the whole organisation that open source solutions were more than capable of handling large and valuable datasets.  It is now clear that some open source solutions are in fact better than their proprietary counterparts.  However, Ian stressed that open source is not free.  There is always a cost associated with software; with open source solutions there is no up-front licence fee, but there is a cost associated with training users and administrators or buying third-party support.

After coffee, the conference split into parallel strands, I switched rooms to catch certain presentations and my write up will reflect this.  You should be able to watch the presentations on the OSGIS 2012 website.

Matt Walker, Astun Technology demonstrated the open source system Loader, a simple GML loader written in Python that makes use of OGR 1.8.   Matt showed us how Astun were providing TMS/WMS for various clients and how they managed to run it all through Amazon web services.  Top tips from Matt included:

  • Amazon web services are great, you can even have fail-over instances, but be sure to manage your system or risk running up bills quite quickly
  • Use PGDump to increase postgres load times (4x quicker)
  • MapProxy rocks
  • UbuntuGIS makes life easy
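
As a taste of what a GML loader has to deal with, here is a minimal sketch, using only the Python standard library, of pulling coordinate pairs out of a GML posList.  To be clear, this is not Astun’s Loader (which leans on OGR 1.8 for the heavy lifting) and the sample fragment is made up; it just illustrates the shape of the data such a tool consumes.

```python
# Sketch: extract (easting, northing) pairs from a GML posList element.
# The sample fragment and coordinate values are illustrative only.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"

sample = """
<gml:LineString xmlns:gml="http://www.opengis.net/gml" srsName="EPSG:27700">
  <gml:posList>355500 801500 355600 801600 355700 801650</gml:posList>
</gml:LineString>
"""

def poslist_coords(gml_fragment):
    """Return (easting, northing) pairs from a GML posList."""
    root = ET.fromstring(gml_fragment)
    pos_list = root.find("{%s}posList" % GML_NS)
    # posList holds a flat sequence of numbers: x1 y1 x2 y2 ...
    values = [float(v) for v in pos_list.text.split()]
    return list(zip(values[0::2], values[1::2]))

print(poslist_coords(sample))
```

For anything beyond toy fragments (multiple geometry types, application schemas, odd encodings) you would reach for OGR, as Loader does.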

Next up was Fernando Gonzalez, who presented the possibilities of collaborative geoprocessing with GGL2.  GGL2 is an evolution of GGL, which was a scripting application for GIS.  GGL2 makes scripts much simpler; fewer lines of code makes it easier for us humans to read.  GGL2 is available as a plugin for gvSIG and QGIS.  If you want to find out more about GGL2 then look at gearscape.org

EDINA’s Sandy Buchanan gave a demonstration of Cartogrammer, an online cartogram application. It allows users to upload shapefiles and KML files and then create cartograms.  This is very neat and really does remove the technical barrier to producing interesting info-graphics.  The service makes use of ScapeToad and is available as an online service, a widget and an API which can be called from your own website.  We will let you know when it goes live.

Anthony Scott of Sustain gave an excellent presentation on the work he has been doing for MapAction.  If you don’t know what MapAction is or what they do: they provide mapping and GIS services to areas that have suffered natural and humanitarian disasters.  Infrastructure is important if aid is to be delivered, and this requires knowledge of what is on the ground at the time and, in some cases, what is left. Take 5 minutes to look at their website and, if it sounds like something you would like to support, hit the big red donate button.

Jo Cook, Astun Technology, looked at how you might use open source software and open data to do something useful.  She looked at taking GeoRSS feeds from sites such as NHS Choices and PoliceUK to extract location-specific information, link it with other open data and then make this publicly available. According to Jo, you can do quite a lot with very basic Python scripting. The last slide of Jo’s presentation has a list of useful resources; seek it out when it is made available on the OSGIS website.
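
To give a flavour of the kind of “very basic Python scripting” involved, here is a hedged sketch that pulls locations out of a GeoRSS feed using just the standard library.  The feed content here is invented; a real feed (NHS Choices, PoliceUK or similar) would be fetched over HTTP and may use different element names, so treat the structure as an assumption.

```python
# Sketch: list the items in a GeoRSS feed that carry a georss:point.
# The sample feed is made up for illustration.
import xml.etree.ElementTree as ET

NS = {"georss": "http://www.georss.org/georss"}

sample_feed = """<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <item>
      <title>Incident A</title>
      <georss:point>55.9533 -3.1883</georss:point>
    </item>
    <item>
      <title>Incident B</title>
      <georss:point>51.5074 -0.1278</georss:point>
    </item>
  </channel>
</rss>"""

def items_with_locations(feed_xml):
    """Yield (title, lat, lon) for each item carrying a georss:point."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        point = item.find("georss:point", NS)
        if point is not None:
            lat, lon = (float(v) for v in point.text.split())
            yield item.findtext("title"), lat, lon

for title, lat, lon in items_with_locations(sample_feed):
    print(title, lat, lon)
```

From there it is a small step to join each point to other open data and republish the result, which is essentially what Jo described.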

The best presentation prize went to Ken Arroyo Ohori, TU Delft. Ken demonstrated some code that he had written which fixes overlapping and topologically incorrect polygons.  pprepair looks brilliant and is available on GitHub.  Ken plans to make it into a QGIS plugin when he has time; I think this will be really useful.  Nice aspects include being able to set a “trusted” polygon class which is assumed to be correct when two polygons intersect.  Ken demonstrated pprepair’s capabilities by fixing polygons along the Spanish/Portuguese border. Because the two mapping agencies have mapped the border independently, when you combine the two datasets you get horrible overlaps. Ken’s presentation was clear and informative and pprepair really does look useful.

The event finished with Steve Feldman of KnowWhere Consulting.  Steve has been working in GIS for many years but is, by his own admission, not a techie.  He approaches the subject with a business hat on, and it is useful to hear this perspective.  Steve reiterated the point that open source is not free software: it is commercial software with no massive up-front lump sum and no long-term contract. You can pay for implementation and support.  You can fund developments that you want, rather than functionality you don’t need. Steve suggested that “free” was a confusing term, but a member of the audience suggested that free also relates to not being tied to a contract or service provider: you can opt in and out as you wish.
Feldman

FOSS4G 2013

Steve then took the opportunity to officially launch FOSS4G 2013, which will be held in Nottingham in September next year.  This event will be huge and is definitely one to put in the calendar now and make sure you get along to.  There will be over 500 delegates from around the world, all focused on doing more with open source geospatial tools.  In fact, better than that, volunteer to help at the event.  The local organising committee needs extra people to help make FOSS4G 2013 a success. If you want to help, pledge your support on the pledge page and someone from the LOC will get back to you.
So, another great event.  Thanks to Suchith, Jeremy and their team for making it happen.  OSGIS will not happen in 2013, but FOSS4G will more than make up for it.

 

OSGIS 2012 – Part 1

OSGIS is now in its 4th year and has really become one of the main events bringing together users and developers of open source geospatial tools.  The nice thing about OSGIS is that it attracts an even spread of delegates from the commercial, public and academic sectors.  This cross-sector mixing is, in my opinion, very healthy for the geospatial sector.

Jeremy Morley at OSGIS 2012

Day One of OSGIS 2012 featured workshops where users could get hands-on experience of software under the guidance of expert tutors.  The morning session saw an introduction to GeoNetwork, a geospatial catalogue service, and an overview of the OSM-GB project, which has made in-roads into topologically correcting OSM data. These workshops are integral to the ethos of OSGIS as they are designed to empower both novices exploring the potential of open source software and expert users looking to refine their skillset and discuss technical problems.

After lunch I opted to attend the session looking at the educational use of OSGeo Live. For those of you that have not heard of OSGeo Live, it is a bootable DVD which allows you to investigate OSGeo software without having to install and configure it on your own computer. This is an excellent way to explore the functionality offered by the numerous packages such as uDig, QGIS, OpenLayers and GeoServer.

Barend Kobben of ITC in the Netherlands outlined how OSGeo Live was used in teaching and why it solved many issues.  Increasingly, universities are assuming that students will want to use their own laptops rather than relying on open access labs.  This means that the tutors have no control over which computers students will use to complete course work. Supporting multiple operating systems and system configurations is virtually impossible.  Using OSGeo Live removes the necessity to configure systems: just put the DVD in the computer, reboot and go. Well, most of the time.  Not all computers are set to boot from the DVD drive, so users would have to access the BIOS to set the boot sequence. Running OSGeo Live from a USB stick or in a virtual machine potentially reduces the hassle of dealing with boot sequences.

Jeremy Morley of the Nottingham Geospatial Institute echoed Barend’s experiences.  Jeremy had used Oracle VirtualBox and then taken snapshots on a Storage Area Network (SAN) to ensure that students’ work was backed up.  This looked promising but didn’t scale when 20+ students tried to access the SAN. Unfortunately, the snapshots were tied to a single machine ID, so students would have to use the same machine each time they accessed their work.  This was not an acceptable solution. Jeremy switched to running OSGeo Live from a USB stick and this was an improvement but, again, was not without its own issues.  The FAT32 format reduced the usable space on the 8 GB drives to just under 5 GB, and cheaper USB sticks were prone to burning out and failing.  But the solution was acceptable and Jeremy was able to deliver the course to the students. Next year’s course will be refined in light of these issues.

As an aside, Jeremy flagged the potential need for more Geographic Information Systems courses to support the wide and varied technical applications which require in-depth knowledge of computing.  There has been a trend towards Geographic Information Science courses over recent years, where students are taught how to apply GIS to solve scientific problems.  However, the maintenance of the systems and interfaces which allow data to be published and interacted with is important, but forms the basis of only a handful of courses at the moment.

During the discussion of these two papers, it was suggested that a cloud space to run GIS would be useful, if you could configure which tools you wanted.  A figure of 4 GB was suggested as a reasonable workspace.  This would allow users to analyse data, but they would have to manage their space carefully.  You could always “do with more space”, but you could teach with about 4 GB.

The first day closed with a presentation by Jiri Kadlec of Aalto University, Finland. Jiri, by his own admission, was new to open source GIS and set himself the challenge of managing and translating data in differing coordinate systems. Projections and coordinate systems are always a challenge.  The theory is that you should be able to get from any “system” to any other “system” by passing through WGS84.  Jiri found QGIS to be the best of the bunch, but it was not perfect.  Jiri also put together a neat little projection comparison tool which many of the audience thought would be an excellent aid when teaching students about projections, or for showing representations of land areas in different projections.
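
To illustrate the “hub” idea of passing through WGS84, here is a toy sketch of one such pair of transforms: the forward and inverse spherical Web Mercator (EPSG:3857) formulae, which let any projected point round-trip through geographic coordinates.  This is the textbook spherical formula only; real tools like PROJ and QGIS also handle datum shifts and ellipsoidal corrections, which this sketch deliberately ignores.

```python
# Sketch: spherical Web Mercator <-> WGS84 lon/lat, the simplest case of
# converting between a projected system and geographic coordinates.
import math

R = 6378137.0  # WGS84 semi-major axis in metres

def wgs84_to_mercator(lon, lat):
    """Project WGS84 degrees to spherical Web Mercator metres."""
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y

def mercator_to_wgs84(x, y):
    """Unproject spherical Web Mercator metres back to WGS84 degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat

# Round-trip a point (roughly Edinburgh) through the projected system
x, y = wgs84_to_mercator(-3.1883, 55.9533)
lon, lat = mercator_to_wgs84(x, y)
```

Chaining any projection’s inverse with any other’s forward in this way is exactly the lon/lat hub-and-spoke model Jiri was wrestling with, minus the messy datum transformations that make it hard in practice.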

The day finished with a drinks reception and a visit to some of the sights of Nottingham. Fortunately, some of the best historic sites just so happen to be pubs and the Jubilee Campus is the site of the old Raleigh bicycle factory.

Historic Site

 

Cartography and the Tour de France

For those of you that know the person behind the GoGeo blog and @go_geo, then you will know that they enjoy a spot of cycling. That may be a bit of an understatement.  So, imagine their delight when they stumbled across the following short video while frantically trying to find the highlights for stage 4 of the Vuelta a Espana.

The Tour de France is a complicated event to put on: 21 days of racing, with a temporary village to build for each start and finish. Maps have to be made for the cyclists that show the route, the distances and the gradients on the road.  Additional information such as tight bends and “road furniture” such as roundabouts and traffic islands is added to the maps to help riders navigate safely through the stage.  The maps also help the events team to set up the stage finish, identifying structures that have to be moved or modified to ensure safe racing or to accommodate the support infrastructure.

Click to play – Tour de Cartography

Excellent stuff, the only thing I need to find out now is how to get the job for myself.