2012 FOSS4G-CEE Conference

Long time no post. Well, the best things come to those who wait, and today we have a guest blog from fellow EDINA Geodata team member James Crone. James attended the recent FOSS4G-CEE Conference, held at the Faculty of Civil Engineering, Czech Technical University in Prague between the 21st and 23rd of May. Over to James…

Seen as a regional add-on to the global FOSS4G conference, which attracts developers and users of open source geospatial software as well as managers and decision-makers, and which will be held in Beijing this year, FOSS4G-CEE focuses on all things open source and geospatial in Central and Eastern Europe. The official language of FOSS4G-CEE was English.

The conference consisted of workshops followed by parallel presentation/tutorial streams, unconference birds of a feather sessions and post-conference code sprints. I only attended the presentation streams which ran from Monday afternoon through to Wednesday.

The Plenary session on Monday consisted of introductory talks on different strands of what is meant by Open. Arnulf Christl of OSGeo/metaspatial covered open software; Athina Trakas of the Open Geospatial Consortium covered open standards whilst Markus Neteler of Edmund Mach Foundation covered open science. A local Central and East Europe flavour was provided by Jiri Polacek of the Czech Office for Surveying, Mapping and Cadastre who covered cadastre and INSPIRE in the Czech Republic and Vasile Craciunescu of the Romanian National Meteorological Administration / geo-spatial.org who provided an overview of open source software projects, applications and research projects using open source geospatial software in the Central and Eastern Europe region.

From Tuesday through to Wednesday the presentations proper started. Thematically the presentations were grouped around INSPIRE, case studies of the use of geospatial FOSS, geoinformatics and the more technical data/development topics. As an opportunity to track changes in open geospatial software itself, I mostly attended the technical data/development presentations.

There were many awesome things presented during FOSS4G-CEE but my top three were:

1. MapServer

EDINA have been using MapServer, the open source platform for publishing spatial data to the web, for some time. The next release, MapServer 6.2, promises improved cartography, map caching and feature serving. The first two of these were covered in two talks by Thomas Bonfort of Terriscope.

In Advanced Cartography with MapServer 6.2, Thomas described some of the improved features that will be available when it comes to rendering vector data through MapServer. Some of the nice things that will be included are improved support for complex symbols and improvements to feature labeling.

Nobody likes waiting for their maps. In a second presentation, MapServer MapCache, the fast tile serving solution, Thomas described MapServer MapCache, which provides all of the features of a certain tile-caching system with added goodness in the form of increased performance, native MapServer sources without the overhead of going through a WMS, and configuration directly within the mapfile.

MapServer 6.2 certainly seems like it could be a release to watch for.

2. PostGIS Topology

Vincent Picavet of Oslandia provided an introduction to graphs and topology in PostGIS.

Here at EDINA we use PostGIS extensively within services such as UKBORDERS and Digimap. Within our UKBORDERS service we provide academics with access to digital boundary datasets. As a result we've been tracking developments in the storage of topology within PostGIS with a great deal of interest. The benefit of PostGIS topology is that shared boundaries are stored only once, which is good for data normalisation and helps when it comes to the generalisation of boundary datasets. These capabilities, along with network operations such as routing, were demonstrated in Vincent's very informative talk.
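PostGIS topology itself is configured in SQL (via topology.CreateTopology and friends), but the shared-boundary idea can be sketched in a few lines of plain Python. This is a toy illustration with made-up data, not the PostGIS API: each boundary is stored exactly once and regions merely reference it, so editing or simplifying the shared edge keeps both neighbours consistent, which is exactly what helps with normalisation and generalisation.

```python
# Toy sketch of topological storage: each boundary (edge) is stored
# exactly once, and regions (faces) reference edges by id, so a
# boundary shared by two regions exists as a single record.
# Made-up data; PostGIS itself does this with topology.CreateTopology,
# TopoGeometry columns and edge/face tables.

# Every edge is a point sequence, stored once.
edges = {
    "e1": [(0, 0), (1, 0)],                   # south side of region A
    "e2": [(1, 0), (1, 1)],                   # boundary shared by A and B
    "e3": [(1, 1), (0, 1), (0, 0)],           # rest of A's ring
    "e4": [(1, 0), (2, 0), (2, 1), (1, 1)],   # rest of B's ring
}

# Faces reference edges by id; a "-" prefix means traverse reversed.
faces = {
    "A": ["e1", "e2", "e3"],
    "B": ["e4", "-e2"],
}

def ring(face):
    """Rebuild a face's closed boundary ring from its edge references."""
    points = []
    for ref in faces[face]:
        seq = edges[ref.lstrip("-")]
        if ref.startswith("-"):
            seq = seq[::-1]
        # Drop the duplicated joint vertex between consecutive edges.
        points.extend(seq if not points else seq[1:])
    return points

print(ring("A"))
print(ring("B"))
```

Because "e2" exists only once, simplifying it would update regions A and B simultaneously, with no risk of sliver gaps opening up between the generalised polygons.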

Although not related to topology, in a later talk Vincent presented Efficiently using PostGIS with QGIS and mentioned numerous extremely useful features and plugins for QGIS for working with PostGIS. Once back in the EDINA office I duly installed the Fast SQL Layer plugin which has made working with PostGIS in QGIS even nicer than it was before.

3. TinyOWS

The talk TinyOWS, the high performance WFS Server by Vincent Picavet of Oslandia, showcased some of the features of TinyOWS. TinyOWS provides a lightweight, fast implementation of the OGC WFS-T standard. Tightly coupled to PostGIS, TinyOWS will be released as part of MapServer 6.2.

Real-world use of TinyOWS was demonstrated in a Wednesday morning talk by Boris Leukert titled IPA-Online, an application built on FOSS to assist Romanian farmers to prepare their application form for direct payments.

The IPA-Online system allows Romanian farmers to prepare single area payment applications by drawing parcel boundaries in an online application to support EU subsidy payments, replacing a previously manual system of drawing the parcels on paper maps. Built around MapBender, MapServer and PostgreSQL/PostGIS, with TinyOWS providing WFS-T, the system supports a very large number of concurrent users. Boris concluded that deploying a system based on geospatial FOSS saved time and money and benefited the environment, removing the need to print 1.6 million paper maps.

Overall, attending FOSS4G-CEE was very worthwhile. Slides for these and other talks are available for viewing over at the FOSS4G-CEE homepage.

Lessons Learned

Most of this has been covered in the previous post but it would be good to extract a number of key things that we have learned through the USeD project.

  1. usability can save you time and money during the development of a new application
  2. external consultants can be an effective way of buying in skills if you do not have them “in house”
  3. external consultants can be used to up-skill project staff
  4. however well you think you know your users/sector, engaging with users will always reveal something unexpected
  5. users may be using your service for something other than its primary purpose. This may be because they don’t know there is another service that would be better suited, or that your service is the best thing out there that almost does what they want
  6. personas work, even with contrived names such as Explorer Evie or Work-around Walter. These make it easier to discuss issues and problems with the project team and relate them back to a “real” user.
  7. user testing points out the blindingly obvious which was not obvious until you started testing
  8. you can salvage something from a user test even if it seems to be going badly wrong
  9. you don’t need more than 5-6 users to test an interface; by the 4th person you are uncovering very little in the way of new issues.
  10. write up user tests immediately, important information seeps out of your mind in a short space of time
  11. usability labs need not be expensive
  12. effective documentation makes buy-in from stakeholders much easier.
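Item 9 in the list above is well supported by the usability literature. Under the often-quoted Nielsen-Landauer model, and assuming the commonly cited average chance of about 31% that a single user exposes any given problem (an assumption borrowed from that literature, not a figure measured in our own sessions), the returns from extra testers fall away quickly:

```python
# Expected proportion of usability problems found by n test users,
# using the Nielsen-Landauer model: found(n) = 1 - (1 - L)^n,
# where L is the average probability that a single user exposes a
# given problem. L = 0.31 is the figure commonly quoted in the
# usability literature; real values vary by study and interface.

def problems_found(n, L=0.31):
    """Fraction of an interface's problems uncovered by n testers."""
    return 1 - (1 - L) ** n

for n in range(1, 7):
    print(f"{n} users: {problems_found(n):.0%}")
```

With those assumptions, five or six users already surface the large majority of problems, which is why a fourth or fifth tester tends to reveal little that is new.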

I think I will leave it there; I may come back to this list and add a couple more items.

Project Recap

With the USeD project drawing to a close it is a good time to recap on what we set out to achieve and how we went about it.


The USeD project aimed to improve the usability and learnability of a user interface which enabled users to download spatial data from the Digimap service. The current data downloader is a popular service but is perhaps not the most user friendly. It was designed around the technical constraints of the software of the time (2002) and the requirement that it had to integrate with an inflexible server-side database.

Its replacement would be designed around the needs of the user but would still have to integrate with a server-side database. However, the new database is more flexible and data extraction is far simpler.

The new interface must serve all users from experienced users who know what they want to complete novices who are perhaps less confident working with spatial data.  The interface should therefore be intuitive and learnable, allowing users to explore some of the advanced functionality as they gain confidence. You can read a detailed summary on the  About USeD page.


The first task was to interview some users and create a set of user personas. 20 users were interviewed and this resulted in 5 distinct personas.  The personas would be used to shape the user requirements and would be used to steer the design of the interface throughout the project.  You can meet our personas on the persona page.

Design Specification

The design specification can be divided into 2 parts: user requirements and a technical specification. The user requirements were derived from the personas. For each persona we had created lists of “person X wants to” and “we would like person X to”, which made it quite a simple task to put together an initial list of requirements. We grouped the requirements into:

  1. a user must be able to
  2. a user should be able to
  3. a user could be able to

Grouping the requirements like this gave the engineers an idea of the importance of each requirement, which made it easier to justify spending more time implementing small functions that were deemed to be a must. The user requirements documentation can be found here.

The technical review focused on the software and libraries that could be used to make the new interface more attractive and interactive. The server side database had already been updated so new tech had to integrate with this.

Prototype or Full Build?

This was a key question in the project. Do we use wire-frame mockups to show different designs or do we use a fully functioning test site? We went with the full build as we suspected that there would be issues surrounding the strong visual map window and the expectation of what the user would receive in their order. It was felt that a wire-frame would not address these issues. Building fully functioning test sites involved far more developer time and effort, but it was certainly worth it.

Iterative User Testing

We used task based testing to explore the usability of the new interface. We started with an expert review from our usability consultant, which caught a number of issues that we had missed. The task based testing engaged with real users. Each user had 45 mins to complete a number of tasks and we tried to have 6 people per session. The interface was then modified between sessions. We ran 3 sessions and saw our “problem points” migrate from the initial screen through the ordering process. This was encouraging as it suggested that users were able to progress further in successive sessions before they ran into problems. The user testing is described in detail in a number of posts.

Project hand-over


Handover – Tableatny @ Flickr

At the end of the project we will hand over our findings to the Digimap Service team. The hand-over will be a document that outlines a number of improvements that can be made to the existing interface. Each recommendation will be ranked as either High, Medium or Low. Each recommendation will address an identified issue in the current interface and will suggest a solution which has been implemented and tested during the USeD project. Where multiple solutions were trialled, a brief summary will be given to justify the final suggestion.

This style of documentation proved to be a very effective way of suggesting improvements to the development team.


Version 4 User Testing

The final round of interface testing followed the same format as the previous sessions. 6 candidates ran through a series of tasks designed to test the usability of the interface. Once again, candidates were selected from a list of registered Digimap users. The main findings of this testing session are summarised below:

1.  Text Search

The “No results” message box should include the following text: “No results found for ‘dasas’. Please check the spelling or try an alternative. You can search by place name, postcode or grid ref.”

The button used to close the search box currently says “Select and Close”. Several users found the term Select confusing. Change this to “Close” and fix the tooltip.

2. Draw Rectangle

There were a couple of issues with this. The default function should always be pan; however, it is currently possible to have draw rectangle active at the same time as use coordinates/use tile name. Only one select function should be active at any time. One user selected a tile through Use Tile Name, then returned to the map and wanted to pan, but their first click cleared the selection because the draw rectangle button was still active.

A wider issue to think about is whether the absence of a pan button confuses users and prevents them from panning, or whether the current system is learnable. We could improve the help and the tooltip to improve the learnability of this toggle: “ON – Draw rectangle to select data. OFF – Pan the map”

3. Add to basket error

Change the text to say “You have too much 1:10 000 Raster data in your basket; the limit is 200 tiles. Either reduce your selected area or select another product”

4.  My Account

Further refinements are needed in the My Account section. The green envelope and blue rubbish bin worked well visually; these should be the only clickable elements in each row. Once selected, the bottom grid should populate, and if the order button is pressed this will re-order the complete order. Only if the user checks one of the check boxes will the order be split. So, all the radio buttons should be checked when the bottom grid is populated.

5. Preview

Add in a preview for datasets that are UK-wide. The lack of a preview confused more than one candidate. The tooltip on Preview is also not right.

6. Use Coordinates

The order of information is now confusing. The map example was useful, but the input boxes should sit below this image. The OR options can then sit below the input boxes. We also need an error box on “Get coordinates from selected area” to catch cases where users have no area selected.

7. Use Time Name

Change the text below the text input box to read “Click the “�” icon on the right of the map to view tile grids at any time.”


Overall, Version 4 user testing was quite encouraging. No major issues were discovered. The feedback from the users was positive and the issues that were identified were generally quite small.  They focus on things that would make the interface clearer and more learnable.

The plan now is to collate the findings from the usability testing and produce a number of recommendations on how to improve the version of the data downloader that is currently live as a beta.  Recommendations will be supported by the evidence gathered during this user testing program.


Usability lab on a shoestring budget

Usability testing should be an important part of the development of any user interface. Ensuring that the interface is intuitive and easy to use is critical for its success. However, running usability sessions with real users often strikes fear into project teams. They assume that it will be a costly and time consuming process and will confuse as much as it clarifies the design process.  This article aims to demonstrate how easy it is to set up an effective usability lab on a shoestring budget.


The USeD project aims to improve the interface of a data download website which provides   spatial data to the education sector in the UK.  User testing is an integral part of the USeD project and carrying out iterative assessment exercises will drive the development of the interface.  However, the project budget is quite modest and most of it is assigned for designing and coding the interface.

A discussion with our usability expert on the usefulness of various techniques suggested that most issues with an interface could be identified using quite simple techniques such as task-based exercises. Eye tracking allows testing to focus on very specific problems and it was better to identify general issues first before considering advanced techniques.

User Task Based Testing

Task based testing centres around setting users a series of small, distinct tasks that have been designed to test the functionality of an interface. The initial tasks should be quite straightforward, but later ones can be more involved, allowing sessions to explore more advanced aspects of the interface. Tasks should give the user a clear understanding of what they want to achieve but should allow them the flexibility to explore the interface. This flexibility can reveal how users discover functionality in the interface. In these testing sessions we have 6 tasks and each session will last up to 45 minutes. Any longer than this and it is probable that the user will tire and lose focus.

So, how can you set up an effective user testing lab in your own office using pretty much “stuff” that you find lying around or “borrow”, temporarily?  The recipe below describes how we went about the task.


  • 2 rooms, close together or preferably next to each other
  • 2 computers
  • 3 screens
  • 1 web cam
  • 1 mic
  • 1 set of baby monitors
  • A sprinkle of free software
  • 1 really helpful systems support person

First of all, having two rooms is a huge benefit as it means that only the candidate and the facilitator (the person running the test) need to be in the test room. This reduces the stress on the user so that it feels less like a test. A nervous or flustered user will not interact with the interface naturally, which may affect the results of the tasks. Having the rooms next to each other makes things much easier as you can run cables between them.

Test lab

Test Room

  • Set up a computer that is typical of the ones you expect users to access the interface through in normal use. If users are likely to use a laptop or a 15 inch monitor, it would be unfair to run the test on a 21 inch monitor.
  • Set up a web cam that shows the user and the facilitator. This should be set up in an unobtrusive way and is to monitor general body language rather than detailed facial expressions or eye movements.
  • Position the transmitting part of the baby monitor so that it will pick up the conversation
  • Place a dictaphone to capture the conversation between the candidate and the facilitator. This is really just a back-up in case parts of the conversation get missed.
  • Make sure you provide some water for the candidates and a bit of chocolate never hurts.

Observation room

The observation lab can be set up in various ways but if you have access to two monitors then this makes things easier.

  • Set up the computer with a “Y” splitter to two monitors. Monitor 1 will show the user’s screen and monitor 2 will display the webcam feed. Set the monitors up about 1.5m away from the observers. This will give them room to make notes, and setting them back a bit means that they can easily scan both monitors at the same time without the “watching tennis” effect.
  • The receiving part of the baby monitor will provide the live audio from the other room.
  • Remember some water and chocolate or sugary sweets to keep the observers alert


Observation room

Porting the display

To display the user’s screen, we used some free software called “ZoneScreen”. This has to be installed on both computers. Once installed, start ZoneScreen on the machine in the test room and set it as the HOST. Make a note of the IP address. On the computer in the observation room, start ZoneScreen, set the session to REMOTE and enter the IP address of the computer in the other room. You should now be able to see everything that happens on the user’s computer.


The webcam feed is a little bit trickier. We experimented with broadcasting this across our network, but there was often a lag of 20-30 seconds, which made it very difficult to follow what was actually going on. As we had the luxury of two rooms next to each other, we were able to connect the webcam directly to the computer in the observation room. To do this you need a powered USB extension. The 10m extension we used occasionally failed, possibly as the power attenuated along its length; replacing it with a 5m cable solved the problem.


This set up worked really well. The observers were able to see the candidate’s screen and hear everything that was said. The webcam was useful to give everything context. You could tell when the candidate had turned to speak to the facilitator and you could monitor their general body language. There was only the slightest delay on the screen display feed, but this did not cause a problem. The baby monitors might seem very low tech but they are reliable and effective.

So, what did all this cost? All the software was free and we scavenged everything except the 5m powered USB cable and the baby monitors. The total cost of this equipment was £40. A huge thanks to Nik, EDINA’s small systems support officer, who managed to find the software and put the lab together.

IGIBS Followon and use of Underspend

It’s a bit early to be making predictions about how IGIBS might evolve, but a recent presentation to the EDINA geoteam followed by some discussion indicated some of the possibilities.

  • The WMS Factory Tool. With the simple but effective styling capability that Michael Koutroumpas engineered, I think we have a prototype that’s not too far off a production-strength tool. There are loads of scenarios where it’s valuable to have access to a tool that makes it easy to see your “non-interoperable” data alongside the growing number of INSPIRE View Services (read WMS) from public authorities across Europe coming online. So top of my list is improving this tool’s styling capability.
  • Associated with this would be a better understanding of the necessary data publication infrastructure, eg, making it easy to use the other OGC Web Services: something like the GEOSS Service Factory ideas emerging from the EuroGEOSS project. I think there is a real demand for tools that make it easy to use the OGC standards.
  • In the immediate future, I think it’s likely that the IGIBS team will do some promotion of the project outputs, eg:
    • presenting the project at relevant events, eg, Association GI Laboratories Europe conference, OGC Technical Committee meetings.  This might cost as little as £500 depending on where the event is.
    • use of social media to promote both the WMS Factory Tool and the report on “Best Practice Interaction with the UK Academic Spatial Data Infrastructure”.  This too could cost as little as an additional £500.
  • The latter report is worthy of a lot more investment. A major output from this project, possibly the single most important output, is the increase in use of UK academic SDI services within the Institute of Geography and Earth Science (IGES) at Aberystwyth University. IGES is acting as an exemplar for best practice research data management around geospatial data; the department is actively building on the IGIBS work, and it will be interesting to see how it develops and whether other departments in other institutions see the benefit and start to emulate what Aberystwyth is doing. More work promoting Steve Walsh’s report would help.

Downloader Version 2

Working on the recommendations from the first round of user testing (expert review by the usability expert), Version 2 of the Data Downloader has been developed.

Data Downloader Version 2

The main improvements that have been implemented include:

  • isolate the search and pan/zoom function
  • add select visible area button
  • tidy up the products list
  • increase the prominence of the “add to basket” button
  • add “on hover” functionality to the rubbish bin and the info icons, make them change colour so users notice them.

The next step is to carry out some UI testing on this interface. Organising external staff/students is time consuming, so it has been decided that for the first round of UI testing we will use EDINA staff. We should be able to replicate the personas from staff as we have people with a mix of geo knowledge. The other advantage of this is that we will be able to use this round of testing as a dry run and get the usability expert to observe the tests. We will then get feedback and tips on running a UI test and hopefully improve our technique so that we get more out of tests with external users.

Recommendations from UI Testing 1

Based on the first round of UI testing and the report prepared by our usability expert, a number of suggested improvements were put forward for the data downloader. The report prepared by David was very useful; it did a number of things that made it easy to discuss issues with the development team and identify the potential solution. In particular, a UI testing report should:

  • Rank the reported issues as High, Medium or Low; you can rank within these categories if you wish
  • provide an annotated screenshot
  • add supporting text description
  • suggest a solution, perhaps with a mock-up or link to another site that demonstrates the solution
  • report “bugs” separately
  • don’t go overboard – listing 100 faults is not helpful (note: hopefully you shouldn’t have 100+ faults if you have understood the who, what and why of the design brief)

The USeD development team were able to sit around a table and come up with a course of action on each one of the issues raised in the UI Expert report.  How long did this take? 60 minutes. That’s all, honest.  A well presented document that described each issue clearly meant the developer could concentrate on the technical solution and judge whether a solution could be implemented in the development time available. Only a couple of issues got “parked”.  This was generally to do with the capabilities of the software and hardware and implications of server load.

A list of the changes that we will make to the interface is given at the end of this post.  So what have we learnt from this initial review?

  1. The usability expert liked the new site, so that was good.
  2. He pointed out several things that are now blindingly obvious to the USeD team.
  3. We now appreciate the benefits of presenting UI results clearly; they shape the development of the UI.
  4. We know that we can make the UI better and easier to interact with.
  5. We know what we have to focus on for version 2.

We now have about 2 weeks of development time before we roll out Version 2 for some more UI testing.

Points that will be investigated for Version 2

  1. Remove “do one of the following”
  2. Isolate pan as it is a separate function to draw or select
  3. On load, products list should be collapsed
  4. Add better headers to products list such as “Product” “Allowance”
  5. Make it more obvious when an allowance is exceeded, perhaps red text or strikethrough text.
  6. Add a modal box if a user selects a product for which the allowance has been exceeded
  7. Remove lock/unlock icon
  8. Make info button change on hover
  9. Add info about which data product is being viewed in the map window.
  10. Remove Scale-bar
  11. Define area buttons need to be cleaned up – perhaps 1 define button that opens a pop-up
  12. Add more descriptive help to functions like define Square in the popups
  13. Popup and modal boxes should close when “GO” is pressed
  14. In basket, change preview and trash can to change colour on hover
  15. Give the “Add to Basket” button more prominence.
  16. Greyed out buttons too subtle (add to basket/define circle(not functioning at present))
  17. Are arrows the best symbol to use for clickable function submit buttons?
  18. Possible to get lost in map, have a reset view (planned but functionality issues prevented it going out in Version 1)

Aspects that we could not take forward, with an explanation of why.

  • Change the product names to be more meaningful and descriptive – the OS are insistent that their products should be described by the correct product name. We also agree that this is sensible as many products are quite similar in some ways but different in others; conveying this in 2-50 characters would be difficult. Further, using the official product name means that Digimap users are using the correct terms used in the public and private sectors.
  • When a user selects a product, update the map window to show the product. This would make a stronger visual link between what users see and what they eventually get, as the preview windows are too small to be very useful. – This would be possible for some products, but for large scale mapping products such as MasterMap the load on the server could potentially be considerable and impact on the speed of the service. In addition, maps would not render in a readable format on screen if users tried to view large areas of large scale maps.

Link to the usability report prepared by Usability Expert

INSPIRE and Universities: An update thanks to James Reid

After some fantastic help from James Reid at EDINA, we thought we would put together a blog post summarising some of the conclusions we have come to over INSPIRE.

At this stage it may be worth having a look at my earlier, but less informed, post regarding INSPIRE to see how my understanding of the issues has progressed.

For INSPIRE to be something that universities need to spend time and money complying with, several questions need an answer. We are not in a position to answer all of them with 100% certainty, but with James’s help here are some conclusions we have come to.

1. Are universities “public bodies” or, more accurately, public authorities? This appears to be one area where the fog has lifted. The INSPIRE Regulations will only apply to public authorities, and James has taken the trouble to check this area with Edinburgh and is certain that universities are public authorities for the purposes of INSPIRE. So one “Yes” to INSPIRE.

2. Do universities hold and control datasets that match the data described in any of the INSPIRE data Annexes? After looking through the datasets collected for the IGIBS project I have found 11 (or about 5% of them) that match up with some of the data themes in Annex III. The IGIBS data is probably not representative of the total extent of data held by Aberystwyth University, and an inventory of the data held by some of the academic staff would be needed to quantify the amount of INSPIRE data held. So another “Yes” to INSPIRE.

3. What is the public task of a university? Here is where the situation becomes less clear. There appears to be no public task defined for universities. The problem seems to stem from the fact that universities are not covered by the PSI Regulations and therefore have not needed to define a public task for themselves. Again James has made some progress on this and pointed to a publication from the National Archives that helps explain the process of defining a body’s public task. There has also been some slightly ambiguous advice from the Scottish Information Commissioner that includes a suggestion that it may be relevant for a university to seek legal advice over the issue. So there seems to be no clear answer to this question. A case of “we don’t know yet”.

4. Do those data identified in 2 above relate to the public task of the university? Again, until we know the answer to 3 above we can only guess at the answer to this question. Common sense suggests that research and teaching must be part of the task if it is ever defined. So my guess would be a “probable Yes”.

5. Will there be any attempt to enforce the regulations? Again there is no way of knowing the answer to this, and it may even involve some judicial intervention to clarify the situation. Strictly speaking, if universities are public authorities for the purposes of the INSPIRE Regulations then they are already not complying with INSPIRE, as they have not established a complaints procedure to deal with questions over INSPIRE data provision as required by the Regulations. So currently a “No”, but with the uncertainty surrounding public task it could be a complicated or impossible job to enforce this regulation at present. So this will have to be a wait-and-see area.

Meeting with EDINA and DCC staff in Edinburgh

I was fortunate enough to have a meeting with some people from EDINA and the DCC in Edinburgh on Wednesday. The aim of the meeting was to get some input and advice from some experts on the ideas I have for a spatial data management best practice report.  So a big  thank you to Martin Donnelly of the Digital Curation Centre (DCC), James Reid, Stuart McDonald, Chris Higgins and Michael Koutroumpas from EDINA.

I had a long 7 hour train journey from Aberystwyth, so my apologies for the overdose of PowerPoint slides that I had time to create before the meeting. It was extremely helpful to talk to experienced and knowledgeable people about the direction of the report, which is one of our outputs from the IGIBS project. My background in environmental science leaves a few significant gaps in my knowledge and, as Chris put it, “a sanity check” on my work was well worth the time needed to attend the meeting. I even had the opportunity for an evening walk on Arthur’s Seat and a lunch hour looking around Edinburgh as a bonus.

Some of the key advice from the meeting centred on the following: INSPIRE and how it will or won’t impact on universities; insights into the not so obvious but very significant benefits of writing a data management plan and where it fits into good data management; some great pointers to other studies and sources of information that will feed into the report; the need to make the report easily accessible to its audience; and some great institutional case study examples from Australian and Californian to British universities.

Another theme that emerged from the discussion was how INSPIRE and the need for good data management can be viewed as a threat but it is also a great opportunity for academic staff to gain easier access to the ever increasing amounts of spatial data being created around the Globe. A viewpoint that will help to make the report more appealing to time starved researchers.

We also had talk of semantics and just what you call a spatial data infrastructure (if you don’t want to use SDI). It was suggested that UK Location has moved towards Location Information Infrastructure as a way of making the SDI label more intelligible to the uninitiated. I found this much more enlightening and useful than the recent update from UK Location on “Data Things” and abstracted “Data Objects”, but a few hours of digestion may make this a little more understandable to my irretrievably ecologically orientated mind. It reminded me of some reading I had done about old Norse governance and how their assembly was called the “Thing” and met in the “Thingstead”. I remember thinking that they didn’t have a proper word for it so just called it the “Thing”, but I guess that just shows how language develops over time, and maybe we can look back on SDI in a few years with the benefit of a really useful label for it, whatever that may be.

As a result of the meeting I am rewriting some sections I had drafted and adding some new summary sheets for subsections of the intended audience. More importantly, I don’t feel like my original thinking was miles off the mark, just a bit under-informed and lacking some focus. So creating the rest of the report will be made a little easier once I have digested the new material I have been pointed towards.

So thank you once more gentlemen and I look forward to meeting you again if the occasion arises.