Why attend commercial events?

I have already written a review of GeoBusiness 2014.  But I thought it was worth writing something about the nature of the event and the interaction between different sectors.

What is GeoBusiness and why was it different?

GeoBusiness is a new event.  As the name suggests, its focus is very much on businesses.  The event had 3 sides to it:

  1. Exhibition –  hardware, software and solutions from the UK sector
  2. Exhibitor workshops – exhibitors had the chance to run workshop sessions to showcase their products
  3. Formal conference – talks by GI practitioners covering best practices and discussing the next big things in the GI sector

Why was this significant?  Well, it gave attendees 3 options and allowed them to mix and match.  Not everyone is interested in listening to formal presentations, while others most certainly are.  This is, in my opinion, the key to attracting GI users from different sectors.  Once you have them all in the same place, interaction will happen, especially if you timetable in plenty of mingling time.  At GeoBusiness we saw the companies selling the hardware to collect data, the data collectors and the data consumers all mixing and exchanging experiences and knowledge.

It also brought users from right across the sector together in one place: those who design the kit to collect data, the data collectors and the data consumers were all well represented.  Students also benefited from a heavily discounted rate.

Why should students attend these events?

For any student with GIS skills this event really was a golden opportunity to scout out potential employers.  OK, you can do some of this on the net, but rocking up to a stand and having a chat with people from the company can give you a much better insight into the organisation.  I am not talking about simply asking them if they have any jobs available; a better approach may be to ask them about recent projects or the tech that they use.  You should then be able to enquire about graduate programmes or mention things from your course that are related to what they do.  This is networking.  Some people are really good at it, others just don’t feel comfortable.  The key to it is making sure that the person you are networking with does most of the talking.  This takes the pressure off you and usually makes them feel like the chat went well.

Following up 

I am not sure I would recommend giving out CVs at an event.  Most people come away with a heap of paper which rarely gets looked at again, so your CV may well get lost.  A better approach might be to take a business card from the person you have chatted to and send them a brief email a few days later (not that evening – their inbox will be stuffed with emails that have accumulated while they have been out of the office).  Remind them who you are, mention that the company sounds like one you would want to work for, and ask their advice on how to apply.  It is worth checking the company's vacancies page first for information about graduate schemes and current openings.

If you don’t have a named contact, then put together a CV and covering letter that match your skills to the company's work and send them off.  I would mention in the covering letter that you visited the company’s stand at a recent event.

Conclusions

Attending events can seem like a jolly, and I suppose they can be. But they are important occasions that bring lots of like-minded professionals together in the same place.  For a graduate, or an early career professional, such events are a gold mine of potential contacts and even future employers. However, you get out what you put in. Be prepared and do your homework.

GeoBusiness 2014 – review

A couple of weeks back I attended the first GeoBusiness conference in London.  It was an interesting event and I have been meaning to write up my thoughts on it but keep getting snowed under with last minute jobs.  I have finally managed to clear some time and can report back to you all what happened at the event.

I decided to go to the conference to see what the public and commercial sectors were working on and what they thought should be the current focus for the GI sector.  Neil Ackroyd, the Acting Director General and Chief Executive of the Ordnance Survey, opened proceedings by summarising the view of the sector from the main data provider’s perspective.  Condensing his talk to a few key points, I would say the OS were focusing on networks (in terms of geographical networks such as rivers, railways and paths) and collaboration.  They are increasingly working directly with organisations to deliver bespoke data that can be used to support large building infrastructure projects, or for events such as the Olympics.  The OS are currently working on hosting data in the cloud, essentially having unstructured data that is accessible to users.  Storing the data as “unstructured” means that you can apply structure as it is accessed and tailor this to the client's needs.  The advantage is that you have one definitive source rather than multiple versions that are subtly different but which all require maintaining (a toy sketch of this “structure on read” idea follows the list below).  Neil closed with two take-away thoughts:

  1. Know your market
  2. Simplify things for them
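
Returning to the cloud-hosted data point: here is a minimal Python sketch of the “structure on read” idea – one definitive store of raw records, shaped per client at access time. The record fields and client views are entirely invented; this is my reading of the approach, not anything the OS described in detail.

```python
# One definitive, loosely structured store (fields are invented examples)
RAW_STORE = [
    {"id": 1, "type": "river", "name": "Tweed", "flow_m3s": 42.0},
    {"id": 2, "type": "rail",  "name": "East Coast Main Line", "gauge_mm": 1435},
]

# Each client gets its own 'structure', applied only when data is read
CLIENT_VIEWS = {
    "hydrology": lambda r: {"name": r["name"], "flow_m3s": r.get("flow_m3s")},
    "transport": lambda r: {"name": r["name"], "gauge_mm": r.get("gauge_mm")},
}

def read(store, client, predicate):
    """Shape the single definitive dataset to a client's needs on read."""
    view = CLIENT_VIEWS[client]
    return [view(r) for r in store if predicate(r)]

# The hydrology client sees only the structure it cares about
print(read(RAW_STORE, "hydrology", lambda r: r["type"] == "river"))
```

There is only one copy of the data to maintain; each “version” is just a view computed at access time.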

After a short coffee break I attended the Making Data Deliverable strand.  The first talk of the session was given by Paul Hart (Black & Veatch), who discussed the use of GIS visualisations to convey complex information to the public. The examples centred around flood alleviation schemes where different scenarios, and their resulting benefits, could be presented in an interactive way.  The use of 3D views with true colour aerial image backdrops allowed non-geo experts to engage with the data.  The output summarised several hundred model scenario runs in an easy to digest way. I did have a couple of issues with the visualisation, the first being the use of red and green which, while intuitive in terms of good/bad, would not be particularly colour-blind friendly.  The visualisation also didn’t really convey uncertainty.  Including uncertainty would possibly complicate the visualisation, but the public may incorrectly assume that the flood outlines were accurate rather than the best estimate from modelling.  I questioned Paul about this and he explained that the maps were presented to a closed audience with experts on-hand to explain them.  He agreed that displaying uncertainty on such maps could over-complicate them.
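
On the colour-blindness point, here is a small sketch of the kind of fix I had in mind: swapping a red-green ramp for a red-blue diverging one. The scenario scores are made up; the colormap names are standard matplotlib ones.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented flood-benefit scores for a toy grid of scenarios
scores = np.array([[0.9, 0.2, -0.4],
                   [-0.8, 0.5, 0.1]])

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
for ax, cmap, title in [(axes[0], "RdYlGn", "red-green (hard for colour-blind viewers)"),
                        (axes[1], "RdYlBu", "red-blue (safer choice)")]:
    im = ax.imshow(scores, cmap=cmap, vmin=-1, vmax=1)
    ax.set_title(title, fontsize=9)
    fig.colorbar(im, ax=ax)
plt.show()
```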

This was followed by another talk focused on visualising data. Lingli Zhu from the National Land Survey of Finland demonstrated the work they had been doing to visualise landscapes using the Unity game engine.  Unity has been used in popular games such as Gut and Glory, but can be easily adapted to produce realistic simulations and can help users visualise environmental change.  However, Unity does not allow users to specify a real-world geographic reference frame, which means any geographic data has to be shoe-horned into the virtual world.
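
For anyone wondering what that shoe-horning looks like, a common workaround (sketched below with invented coordinates) is to subtract a local origin from projected coordinates so they fit Unity's float-precision, Y-up world space. This is a general pattern, not a description of the NLS of Finland's actual pipeline.

```python
# Unity world positions are 32-bit floats, so large projected
# coordinates (metres in a national grid) lose precision. The usual
# trick is to translate everything to a local origin before import.
# The origin and points below are invented for illustration.

LOCAL_ORIGIN = (385_000.0, 6_670_000.0)   # easting, northing (m)

def to_unity(easting, northing, elevation):
    """Map projected coordinates into a Unity-style local frame.

    Unity is Y-up, so elevation becomes Y and northing becomes Z.
    """
    x = easting - LOCAL_ORIGIN[0]
    z = northing - LOCAL_ORIGIN[1]
    y = elevation
    return (x, y, z)

print(to_unity(385_250.0, 6_670_120.0, 33.5))   # -> (250.0, 33.5, 120.0)
```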

The second part of the session focused on BIMs.  BIMs (Building Information Modelling) have been the subject of several events over the past couple of years and they seem to make sense, but they also seem to span several disciplines, sitting somewhere between the CAD and GIS worlds.

First up was David Philip, Head of BIM Implementation at the Cabinet Office.  David gave a great overview of BIM implementation with a presentation that was peppered with light humour.  David detailed the “3 tribes” living in the BIM world: CAD users, GIS users and BIM users.  BIMs should be an open, shareable asset that unites CAD and GIS users. David pointed out the importance of BIMs throughout the life of a building, as the cost of building (capex) is much smaller than the cost of running or operating (opex) a building.  Therefore, the BIM is a critical tool in maximising the efficiency of a building throughout its lifecycle and should aim to be an “open shareable asset information system”.
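
The capex/opex point is easy to illustrate with some rough arithmetic. The figures below are invented, purely to show the shape of the argument.

```python
# Invented numbers: a building's operating cost dwarfs its build cost,
# so small opex efficiencies from a good BIM compound significantly.
capex = 10_000_000           # construction cost (GBP)
opex_per_year = 1_500_000    # running cost (GBP per year)
life_years = 40

lifetime_opex = opex_per_year * life_years
print(f"lifetime opex is {lifetime_opex / capex:.0f}x capex")   # 6x

saving = 0.03 * lifetime_opex   # a modest 3% opex saving
print(f"a 3% opex saving is worth GBP {saving:,.0f}")           # 1,800,000
```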


BIM Task Group

David closed by pointing out that we often suffer from “Infobesity” and we should better understand which data we need to retain and which we can get rid of.  Keeping everything is just not a sustainable approach.

The remaining presentations in this session provided insight into actually implementing BIMs in the commercial sector.  Peter Folwell (Plowman Craven), Matthew McCarter (London Underground) and Casey Rutland (Arup) gave honest opinions of the highs and lows of working with BIMs.  The consensus from these presentations was to implement a BIM early rather than as an after-thought that ticks a box. Setting up a BIM early will allow the project to reap the benefits in terms of organisation, data flow and cost savings.  3D scanning also seemed to be at the heart of the BIM, but this should not be seen as a one-off task: regular scanning can help partners visualise the evolution of a project and help identify potential issues.  However, multiple scans need not mean multiple BIMs – just add them to the existing BIM.  One aspect that surprised me was the strength of the BIM community on social media.  There seems to be an active community on Twitter that is happy to share best practice and offer general advice.  Just search for hashtags such as #ukbimcrew, #laserscanning and #pointclouds.  If you want to find out more about BIMs then look at the BIM Task Group website.

After lunch I attended the Global Trends session, which had a wide range of talks covering everything from legal issues surrounding geospatial data to downstream service opportunities from remote sensing data.  Ingo Baumann discussed the legal constraints surrounding geospatial data, focusing particularly on open data licences and issues around personal data.  One of the key problems is a lack of consistency between countries.  Google has discovered this publicly while rolling out StreetView across Europe.  There is no specific geospatial law, but it is coming.  Until then, I will be keeping an eye on useful blogs such as Spatial Law and Policy.

Carla Filotico (SPRL) highlighted the value of remote sensing data and the downstream service opportunities.  The agri-business could benefit hugely from data from new satellite programmes such as Copernicus, a market estimated to be worth €2.8 billion in the EU.  For more information on the Copernicus mission and its recent launches of Sentinel satellites, please refer to the ESA website.

The final session I attended was on survey operations and system integration.  The first talk, by Dipaneeta Das, was well delivered but I felt it was pitched at the wrong level: much of the time was spent explaining web mapping, but I suspect nearly all of the attendees already knew about the advantages web mapping offers for disseminating information to the public.  The other two talks were really interesting and focused on data acquisition.  John McCreedy (IIC Technology) walked the group through the pros and cons of various survey techniques including laser scanning, lidar and structured light (think Xbox Kinect). One interesting snippet was that aerial photography often captured more detail than other “newer” techniques.  This sentiment was echoed by James Eddy (Bluesky), whose company continues to collect hi-res aerial photography of the UK and beyond.  You can even collect aerial images at night.  Why, you might ask?  Well, to capture information about light pollution and to monitor “dark spots” in cities. This information can then feed into spatial analysis on crime and anti-social behaviour, helping the police and councils target resources.


Bluesky’s Night Aerial Images – courtesy of Bluesky

The takeaway message from this session was that clients are increasingly specifying technology when commissioning surveys. This may not be wise: it is often better to specify what they expect as a final product and leave the choice of technology to the experts, who will ensure that the most appropriate technology is selected.  I suppose that is, and always has been, the role of the expert in any field.

Summary

GeoBusiness 2014 seemed to be a success.  The talks were interesting, the audiences engaged and you could see that there was a whole heap of networking going on.  I will write a more detailed post on how I see this event in terms of the academic sector, but it just remains for me to thank the conference team for putting together a great event.  I am looking forward to GeoBusiness 2015.

GeoBusiness 2014 – a preview

GeoBusiness 2014 is less than a week away.  This is a new event and I am looking forward to seeing what it will be like.  The organisers have certainly pushed the event, with short magazine inserts listing who is exhibiting and presenting.  GoGeo will be there and I thought I would explain why we are attending and what we hope to get out of the event.

It’s new and it’s big

Pretty self-explanatory, but also significant.  This is a chance to speak to all the major software vendors and find out what enhancements they have in development.  In addition, there are a host of companies that offer GI services.  I want to see what these companies are up to and report on what looks innovative and interesting.  These companies collectively employ a significant number of GIS graduates each year.  Many of them are exploiting new and emerging technology such as unmanned aerial vehicles (UAVs).  As such, they are really quite dynamic places to be employed as a fresh-faced graduate.

Workshops

There are a number of interesting workshops being run by companies to highlight the innovative analysis they are doing. There seem to be clusters of workshops around 3D laser scanning, UAVs and Building Information Modelling (BIM).  There is also a strand that focuses on professional development.

Content

Content for GoGeo and perhaps even ShareGeo.  That means news articles, blog posts and so on for GoGeo.  With ShareGeo it would be great to get some sample data from companies so that lecturers could use it in their lessons.  I will be looking to convince some of the UAV and scanning companies to share some data through ShareGeo.  If you don’t know what ShareGeo is, it is a repository for open geospatial data that enhances teaching, learning and research.

So if you already have a ticket I might see you there. If you don’t have a ticket, there is still time, and there are special rates for students (£25 per day if you pre-book).  Students, do your research on the companies attending and speak to people to find out what they do; it is a great opportunity to see the diverse range of jobs available in the GI market.

GeoBusiness 2014 website

The search for Flight 370


courtesy of Wayan Vota (https://www.flickr.com/photos/dcmetroblogger/)

As the search for missing Malaysia Airlines Flight 370 approaches its fifth week, the reliance on geospatial technology, and on the skills to analyse large volumes of data, is becoming increasingly clear. In this post we will look at some of the geospatial technology and techniques that have been used in the search for Flight 370.

Background

Flight 370 disappeared on the 8th of March 2014, having left Kuala Lumpur en route for Beijing. There was simply no trace of it. Communications were lost somewhere over the Gulf of Thailand. Speculation quickly rose as to the fate of the aircraft, with hijack and rogue pilots being mooted as possible explanations.  A catastrophic break-up of the aircraft through an explosion was not ruled out but looked unlikely, as this would generally be noticed. Furthermore, there was no sign of debris in the area of Flight 370's last known position.

Data feeds and extrapolation

After a few days, data started turning up that suggested the plane had stayed aloft for several hours after all communication was lost.  Equipment on board transmits information such as status updates and diagnostics, so that engineering teams can monitor the health and performance of components while they are in use.

The engines had sent bursts of data every hour and these had been picked up by a satellite operated by Inmarsat. By monitoring the Doppler effect in the received data, Inmarsat was able to chart two possible paths: one to the north and the other to the south.  This had never been done before, and the innovative use of this data by Inmarsat allowed the rescue effort to be concentrated in two distinct areas.

After a bit of tweaking and refining, the Inmarsat scientists were able to discount the northern corridor and the search re-focused on the southern corridor, a vast expanse of ocean west of Australia with no suitable landing site.  How they achieved this was really quite clever: they used “truthing data” from other aircraft to monitor the Doppler effect and thereby refine their estimates for Flight 370. They then calculated the speed and altitude of the aircraft and were able to work out roughly where it would have run out of fuel and ditched into the ocean.  This greatly reduced the search area.
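
To give a flavour of the idea (and only a flavour – Inmarsat's actual analysis was far more sophisticated), here is a toy sketch: predict the Doppler shift each candidate track would produce at each hourly ping and keep the track with the smallest residual. All frequencies and velocities below are invented.

```python
import numpy as np

C = 3.0e8    # speed of light, m/s
F0 = 1.6e9   # assumed L-band carrier frequency, Hz

def doppler_shift(radial_velocity_ms):
    """Expected shift (Hz) for a given line-of-sight velocity."""
    return F0 * np.asarray(radial_velocity_ms) / C

def residual(candidate_radial_velocities, measured_shifts):
    """Sum of squared prediction errors for a candidate track."""
    return float(np.sum((doppler_shift(candidate_radial_velocities)
                         - np.asarray(measured_shifts)) ** 2))

# Invented radial velocities (m/s) for the two corridors at each ping,
# and invented 'measured' shifts (Hz)
south = [120.0, 135.0, 150.0, 160.0]
north = [-120.0, -135.0, -150.0, -160.0]
measured = [640.0, 720.0, 800.0, 853.0]

corridors = {"south": residual(south, measured),
             "north": residual(north, measured)}
print(min(corridors, key=corridors.get))   # -> south
```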

Satellite analysis

The search area had been focused to a small section of ocean (OK, so small in this case means the size of Western Europe, but given the size of the southern Indian Ocean this can be considered small).  It was now feasible to start analysing aerial imagery to try and identify debris (given that there was nowhere for the plane to land, on the 24th March Malaysian officials announced that it was beyond reasonable doubt that the plane was lost after ditching in the southern Indian Ocean). Trawling around to find out what satellites were used was harder than I thought it would be.  Below is a summary of what I found:

  • GAOFEN-1 – a high-resolution optical sensor run by CNSA (Chinese National Space Administration) which was launched in April 2013. Gaofen-1 is equipped with a 2 metre resolution CCD (Charge-Coupled Device), an 8 metre resolution multi-spectral scanner and a 16 metre resolution wide-field multi-spectral imager. It is difficult to tell which sensor produced the image below, but from the resolution it looks like it was the 8 m multi-spectral scanner.

Chinese satellite image of possible debris – Pic from The Guardian/Reuters

  • A French satellite operated by Airbus Defence and Space spotted 122 objects in a cluster, some up to 23 m in length (image released by MOSTI). Airbus Defence and Space run a host of satellites through their Astrium arm, including EnviSAT, CryoSAT, Copernicus, ELISA and Helios 2.

Airbus Defence Image

  • Australian AP-3C Orion – Orion aircraft were deployed to likely search areas and scanned them.  It is likely that the crew were using a combination of electronic surveillance systems and just their eyes. This might seem old school, but it is an effective method of verification, as trained operators can discount or confirm sightings from remote sensing. The aircraft has a long range and can fly low, making it ideal for searching.

Ocean Currents

Why has it taken so long to refine the search area?  Well, there are lots of satellites, but only a few of them would have had suitable sensors on board. Data is collected and beamed back to a receiving centre, and the raw data will most probably have to be processed before it can be used for anything.  This takes time.  The search area may well have been narrowed to a chunk of the southern Indian Ocean, but this still represents a huge area, not dissimilar to the size of Western Europe.  Processing and analysing data for such a large area is not easy and will rely on a degree of automation followed by human verification.
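
A toy illustration of what that automated first pass might look like: flag anything unusually brighter than the surrounding ocean and hand the candidates to a human analyst. Real debris detection is far harder (sun glint, whitecaps, cloud), and the threshold and scene below are invented.

```python
import numpy as np

def flag_candidates(image, n_sigma=5.0):
    """Return (row, col) of pixels > n_sigma std devs above the mean.

    A crude automated first pass over an ocean scene; every hit would
    still need human verification.
    """
    mask = image > image.mean() + n_sigma * image.std()
    return np.argwhere(mask)

# Synthetic 'ocean' with one bright object at rows 250-252
scene = np.random.normal(0.2, 0.02, size=(500, 500))
scene[250:253, 300:304] = 0.9
print(flag_candidates(scene))
```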

The southern Indian Ocean is a wild place with frequent storms. We can see from the above that optical sensors have been used, and these are unable to penetrate cloud cover. Scientists would have to wait for the satellite to pass over the same area to try and get a better, cloud-free image. The repeat cycle may be anything from 1 day to 10 days or more.

Then you add in the ocean currents.  Any object floating in the ocean will not be static and could drift by tens of kilometres a day. Given that the plane is likely to have crashed 15 days previously, debris could be hundreds of kilometres from the crash site – that is, if it has not already broken up and sunk.  But we can at least model the ocean currents and estimate the potential dispersal of the debris.  The NY Times have some excellent visualisations of both the currents and the wave heights in the southern Indian Ocean during March.  These have been produced by the National Oceanic and Atmospheric Administration and the National Centers for Environmental Prediction using remote sensing data, in-situ data (buoys) and models.  While never 100% accurate, they provide an indication and convey the uncertainty involved in determining a search area.
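
The drift calculation itself can be sketched very simply: step a particle through a current field over the elapsed days. The current function below is a made-up stand-in for the NOAA/NCEP data mentioned above.

```python
import numpy as np

def drift(start_km, current_fn, days, dt_hours=6.0):
    """Integrate a floating object's position through a current field.

    start_km   -- initial (x, y) position in km
    current_fn -- (x, y, t_hours) -> (u, v) current in km/h
    """
    pos = np.array(start_km, dtype=float)
    t = 0.0
    for _ in range(int(days * 24 / dt_hours)):
        u, v = current_fn(pos[0], pos[1], t)
        pos += np.array([u, v]) * dt_hours
        t += dt_hours
    return pos

# Toy steady current: ~1 km/h east with a weak northward component
toy_current = lambda x, y, t: (1.0, 0.2)
print(drift((0.0, 0.0), toy_current, days=15))   # ~[360., 72.] km
```

Even this toy version shows why the search box has to keep moving: at a gentle 1 km/h, debris covers hundreds of kilometres in a fortnight.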

Locating flight recorders

Once a search area has been identified, the searchers are able to deploy listening devices which locate “pings” emitted by Flight 370's black box. This is achieved by towing a listening device (the TPL-25) back and forth across a wide area.  Pings should be received periodically, and the position and strength of these detections should triangulate the position of the black box. But the sea floor is not flat in this area: it is around 4500 m deep with mountains up to 2500 m high.  We actually know very little about remote ocean sea beds.  We have limited data collected by ships, and most representations come from spaceborne remote sensing data. These are not very accurate and may “miss” large structures (1-2 km high) such as seamounts. There is a nice overview of ocean mapping on the BBC website.
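
As a very rough sketch of how repeated detections narrow things down, here is a toy “fix” that weights each detection position by its signal strength. A real acoustic localisation would model propagation through the water column; everything below is invented.

```python
import numpy as np

def weighted_fix(detections):
    """Signal-strength-weighted centroid of ping detection positions.

    detections -- list of ((x_km, y_km), strength) tuples
    """
    pts = np.array([p for p, _ in detections], dtype=float)
    w = np.array([s for _, s in detections], dtype=float)
    return (pts * w[:, None]).sum(axis=0) / w.sum()

# Three invented detections; the strong one pulls the estimate east
pings = [((0.0, 0.0), 1.0), ((8.0, 0.0), 3.0), ((4.0, 6.0), 2.0)]
print(weighted_fix(pings))   # -> [5.33 2.  ]
```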

The difficulties of retrieving debris from deep, remote oceans were highlighted by the search for Air France Flight 447.  In that case, both black box transmitters failed.

A Chinese ship detected a ping on the 5th April and a day later an Australian ship detected one, but the pings were quite far apart.  The Australian ship's detection seemed more consistent and stronger, and this was backed up by more detections in the same area on the 8th. It is a slow process, but each detection should help reduce the uncertainty.  The question is, will the batteries in the transponders last much longer?  They are already at the limit of what is expected, so time is running out.

Remote Sensing Critical

It is clear that remote sensing technology has been critical at every stage of the search for Flight 370, and it will continue to be so until the plane is found.  It has been used effectively to narrow search areas and discount blind alleys. It is also interesting to note how associated data has been used in ways that were never intended in order to locate the plane; praise should be given to the Inmarsat scientists who came up with a novel solution when data and information were scarce.

Articles:

  • The search for Malaysian Airlines Flight 370 – a great article in the New York Times that focuses on the remote sensing data that is being used now that search teams have identified a “likely” crash site in the Southern Indian Ocean.
  • Wikipedia – a growing resource for information about Flight 370
  • Wikipedia – Air France Flight 447
  • NY Times – nice collection of visualisations of ocean conditions in the search area

Space Charter activated in response to UK flooding

Flood Warning

Spotted on the BBC News page, the recent flooding in the UK (yes, it has been even wetter than normal over here!) has prompted the UK Government to activate the global charter on space and natural disasters. This essentially gives government agencies access to the most up-to-date imagery of affected areas allowing them to plan relief and contingencies.

There is no end in sight for the bad weather, which is being driven by a very strong jet stream. This has resulted in a number of deep depressions passing over the UK, battering the coasts and dropping lots of precipitation.  Depressions are not unusual at this time of year, but their frequency and intensity have given the rivers little time to recover before the next assault.

Useful links:


Flood – Pic courtesy of Johndal (http://www.flickr.com/photos/johndal)

FOSS4G 2013 – 5 reasons you should attend

FOSS4G is the annual conference for anyone interested in Free and Open Source Software for Geospatial.  FOSS4G 2013 will be held in Nottingham between the 17th and 21st September. So what makes FOSS4G so important and why should you attend?

  1. Network – FOSS4G is the biggest gathering of developers and users of open geospatial software.  There will be over 700 people at the conference. This includes the lead developers on some of the larger open source projects such as OpenLayers and QGIS.
  2. Learn – You’ll learn a lot in a very short period of time.  Whatever your level of knowledge of open source geo, from beginner to expert coder/developer, you will learn something new at FOSS4G.  There are workshops for all levels that you can sign up to.
  3. Inspiration – You will be inspired by some of the major names in GIS and data analysis. The list of keynote speakers includes Paul Ramsey (co-founder of PostGIS), Kate Chapman (Acting Director of humanitarian team at OpenStreetMap) and Ben Hennig (Worldmapper Project).  For a full list of Keynote speakers, please refer to the FOSS4G keynote page.
  4. Double the fun – Visit AGI GeoCommunity’13 at the same time. Yes, that’s right FOSS4G and AGI GeoCommunity are happening in the same venue on the same week. This was no accident. GeoCommunity is a great event and the FOSS4G organisers wanted to bring the two audiences together. GeoCommunity’13 runs from the 16th to the 18th September.
  5. Can you afford to miss it? – What does this mean?  Well, the conference package is quite reasonable given the number and diversity of talks on offer: £165 for a day pass or £435 for the whole event (3 days and the FOSS4G Gala Night).  FOSS4G was last in Europe back in 2010 and it might not be back until 2017 as it moves between continents. So, if you are based in Europe, attending FOSS4G might not be this easy again for a number of years.

So, there are 5 pretty good reasons to attend.  I am sure there are many other reasons to come along.  To find out everything that will be going on at FOSS4G please look at the conference website and follow the event on twitter through the #FOSS4G hashtag.

FOSS4G 2013 takes place between the 17th – 21st September 2013 and will be held at the East Midlands Conference Centre, which is situated on The University of Nottingham campus. 

How high's the water, mama? 3 feet high and rising

A great line from De La Soul, but on a serious note: sea level is rising and scientists don’t seem confident that the trend is going to slow or reverse anytime soon.  Over the past month there has been a constant stream of news articles reporting on record low levels of sea ice at the end of the melt season and ships short-cutting through the Arctic to reach the Pacific.

It would seem that the writing is on the wall for Arctic sea ice.  The change in albedo alone will accelerate the decline in ice extent: the exposed ocean absorbs more heat from direct solar radiation than the sea ice, which reflects much of it, and as the ocean warms, the sea ice decline will accelerate further.

The accelerated melting of the Arctic sea ice is significant, but at least we know that this will not result in a significant increase in global sea levels (beyond the thermal expansion of water as it warms). Those worried about global sea levels are really focused on the large ice sheets of Greenland and Antarctica which, if melted completely, would contribute 7 m and 61 m respectively.  That is 68 m of sea level rise.  Huge.  But what would it look like?  Well, there is a very neat interactive map that allows you to visualise future sea levels. Go on, scare yourself.  Look how recognisable coastlines change as sea levels increase.  You can wipe the Netherlands and Denmark off the map. Click the image below for an overview.

Link to Sea-level Map

Link to ESA article on Polar Ice Loss
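
If you want to play with the idea yourself, the core of such a map is just thresholding a digital elevation model. A minimal sketch with an invented elevation grid:

```python
import numpy as np

def inundated_fraction(dem_m, rise_m):
    """Fraction of current land cells at or below a new sea level."""
    land = dem_m > 0.0
    flooded = land & (dem_m <= rise_m)
    return flooded.sum() / land.sum()

# Invented low-lying coastal DEM (elevations in metres)
dem = np.random.uniform(-5.0, 120.0, size=(200, 200))
for rise in (1.0, 7.0, 68.0):   # 7 m ~ Greenland, 68 m ~ both ice sheets
    print(f"{rise:5.1f} m rise -> {inundated_fraction(dem, rise):.0%} of land flooded")
```

A real version would use a proper DEM and account for hydrological connectivity to the sea, but the thresholding idea is the same.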


Curiosity


This morning NASA successfully landed a probe on Mars. After a 9 month journey, the final 7 minutes of the trip were set to be the most risky, but the probe landed safely and has started transmitting data back to mission control. The probe is called Curiosity and is about the size of a small car. The purpose of Curiosity is to investigate the possibility of previous microbial life on Mars. If you are interested in finding out more about the mission then there are a number of links that provide further details:

If you have any more interesting links about the Curiosity mission, please add them as comments and I will add them to the list.

The Venus Transit – NASA Video

Well, I was up in time to find the skies above Scotland giving it the full 8 oktas this morning.  So no transit viewing for me, and I am unlikely to see it again given that we will have to wait another 105 years for the next transit.

If you missed it then you might be interested in watching the truly amazing video posted on NASA’s YouTube channel.  This compresses the 7 hour transit into 39 seconds of jaw-dropping video.

Transit of Venus

From the NASA website: The Venus transit as seen in the 171 wavelength. This channel is especially good at showing coronal loops – the arcs extending off of the Sun where plasma moves along magnetic field lines. The brightest spots seen here are locations where the magnetic field near the surface is exceptionally strong.

If you don’t know why the transit of Venus is so important to the development of science then you should check out the following links:

GISRUK 2012 – Thursday

The second part of GoGeo’s review of GISRUK 2012 covers Thursday. If you want to find out what happened on Wednesday, please read this post

Thursday saw a full programme of talks split between two parallel sessions.  I chose to go to the Landscape Visibility and Visualisation strand.

  • Steve Carver (University of Leeds) started proceedings with No High Ground: visualising Scotland’s renewable landscape using rapid viewshed assessment tools. This talk brought together new modelling software that allows multiple viewsheds to be analysed very quickly, with a practical and topical subject.  The SNP want Scotland to be self-sufficient in renewable energy by 2020 – an ambitious target. In 2009, 42% of Scotland's “views” were unaffected by human developments; by 2011 this had declined to 28%.  Wind farms are threatening the “wildness” of Scotland and this may have implications for tourism.  Interestingly, the SNP also wants to double the income from tourism by 2020. So how can you achieve both?  By siting new wind farms in areas that do not further impact on the remaining wild areas.  This requires fast and efficient analysis of viewsheds, which is what Steve and his team presented.
  • Sam Meek (University of Nottingham) was next up, presenting on The influence of digital surface model choice on visibility-based mobile geospatial applications.  Sam’s research focuses on an application called Zapp.  He is looking at how to efficiently and accurately run visibility models on mobile devices in the field, and how the results are influenced by the surface model.  In each case, all processing is done on the device. Resampling detailed DTMs obviously makes processing less intensive, but it often leads to issues such as smoothing of features.  Other general issues with visibility models are stepping, where edges form in the DTM and interrupt the line of sight, and an over-estimation of vegetation.  This research should help make navigation apps that use visual landmarks to guide the user more accurate and usable (a minimal line-of-sight sketch follows this list).
  • Possibly the strangest and most intriguing paper title at GISRUK 2012 came from Neil Sang (Swedish University of Agricultural Sciences) with New Horizons for the Stanford Bunny – A novel method for view analysis.  The “bunny” reference was a bit of a red herring, but the research did look at horizon-based view analysis.  The essence was to identify horizons in a landscape to improve the speed of viewshed analysis, as the horizons often persist even when the local position changes.
  • The final paper of the session took a different direction, with David Miller of The James Hutton Institute looking at Testing the public's preferences for future landscapes. This linked public policy with public consultations through the use of virtual reality environments.  The research investigated whether familiarity with the location altered opinions of planned changes to the landscape.  Findings showed agreement on developing amenity woodland adjacent to a village, and on environmental protection, but differences arose in relation to proposals for medium-sized wind farms (note – medium-sized wind farms are defined as those that might be constructed to supply power to a farm, not commercial wind farms).
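
As promised above, here is a minimal line-of-sight sketch over a 1-D terrain profile – the simplest building block of a viewshed. A cell is visible only if the slope of the sight-line to it exceeds every slope seen so far. The terrain values are invented, and real implementations like those discussed in this session work over 2-D DTMs with many refinements.

```python
import numpy as np

def line_of_sight(profile_m, observer_height_m=1.7):
    """Visibility of each cell along an equally spaced terrain profile.

    A cell is visible if the sight-line slope to it exceeds the
    maximum slope encountered at any nearer cell.
    """
    eye = profile_m[0] + observer_height_m
    visible = np.zeros(len(profile_m), dtype=bool)
    visible[0] = True
    max_slope = -np.inf
    for i in range(1, len(profile_m)):
        slope = (profile_m[i] - eye) / i
        if slope > max_slope:
            visible[i] = True
            max_slope = slope
    return visible

terrain = np.array([10.0, 12.0, 20.0, 15.0, 14.0, 25.0, 18.0])
print(line_of_sight(terrain))   # cells behind the ridge at index 2 are hidden
```

Repeating this along rays in every direction from the observer gives a full viewshed, which is why resampling and DTM artefacts such as stepping matter so much.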

After coffee I chose to go to the Qualitative GIS session, as it provided an interesting mix of papers that explored social media and enabling “the crowd”.

  • First up was Amy Fowler (Lancaster University), who asked How reliable is citizen-derived scientific data?  This research looked at the prevalence of aircraft contrails using data derived through the Open Air Laboratories (OPAL) Climate Survey. Given the dynamic nature of the atmosphere, it is impossible to validate user-contributed data. Amy hopes to script an automated confidence calculator to analyse nearly 9,000 observations, but initial analysis suggests that observations with accompanying photographs tend to be more reliable (a toy scoring sketch follows this list).
  • Iain Dillingham (City University) looked at Characterising Locality Descriptors in crowd-sourced information, with a specific focus on humanitarian organisations. Using the wealth of data available from the 2010 Haiti earthquake, they investigated the uncertainty of locations derived from social media, and also looked at georeferencing locality descriptors in MaNIS (the Mammal Networked Information System).  The conclusion was that while there were similarities between the datasets, the crowd-sourced data presented significant challenges with respect to vagueness, ambiguity and precision.
  • The next presentation changed the focus somewhat: Scott Orford (Cardiff University) presented his work on Mapping interview transcript records: technical, theoretical and cartographical challenges. This research formed part of the WISERD project and aimed to geo-tag interview transcripts.  Geo-tagging was done using UNLOCK, but there were several issues in getting useful results out and reducing the noise in the data.  Interviews were transcribed in England, and complicated Welsh placename spellings often got transcribed incorrectly.  In addition, filler words such as “Erm” were quite frequent and got parsed as place names, so had to be removed as they did not actually relate to a place. Interesting patterns did emerge about which areas appeared to be of interest to different people in different regions of Wales, but care had to be taken in preparing the dataset and parsing it.
  • Chris Parker (Loughborough University) looked at Using VGI in design for online usability: the case of access information. Chris used a number of volunteers to collect data on accessibility to public transport. The volunteers might be considered an expert group, as they were all wheelchair users.  A comparison was made between an official map and one that used the VGI data, and it was found that the public perception of quality increased when VGI data was used, making it an attractive and useful option for improving the confidence of online information. However, it would be interesting to look at this issue with a more mixed crowd of volunteers, rather than just the expert user group, who seemed to have been commissioned (but not paid) to collect specific information. I am also not too sure where the term usability from the title fits.  Trusting the source of online data may increase its use, but this is not usability, which refers more to the ability of users to engage with and perform tasks on an interface.
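
As promised above, here is a toy version of the kind of automated confidence calculator Amy described. The weights and fields are entirely invented; the only grounded element is the finding that photographic evidence increases reliability.

```python
def confidence(obs):
    """Toy first-pass confidence score for a citizen observation."""
    score = 0.5                                  # baseline trust
    if obs.get("has_photo"):
        score += 0.3                             # photos improve reliability
    if obs.get("observer_record_count", 0) > 10:
        score += 0.1                             # experienced observer
    if obs.get("notes"):
        score += 0.1                             # richer metadata
    return min(score, 1.0)

print(confidence({"has_photo": True, "observer_record_count": 25}))   # 0.9
```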

There was a good demonstration from ESRI UK of their ArcGIS.com service.  This allows users to upload their own data, theme it and display it against one of a number of background maps. The service then allows you to publish the map and restrict access to it by creating groups.  Users can also embed the map in a website by copying some code that is automatically created for them. All good stuff – if you want to find out more then have a look at the ArcGIS.com website.

Most of Friday was given over to celebrating the career of Stan Openshaw.  I didn’t work with Stan but it is clear from the presentations that he made a significant contribution to the developing field of GIS and spatial analysis and had a huge effect on the development of many of the researchers that regularly attend GISRUK.  If you want to find out more about Stan’s career, have a look at the Stan Openshaw Collection website.

Friday’s keynote was given by Tyler Mitchell, who was representing the OSGeo community.  Tyler was a key force in the development of the OSGeo group and has championed the use of open source software in GIS.  His presentation focused on interoperability and standards, and how they combine to allow you to create a software stack that can easily meet your GIS needs.  I will try to get a copy of the slides from Tyler’s presentation and link to them from here.