How do you solve a problem like Geo? Highlights from the JISC Geo Event and Discussions

It’s been a few weeks since the JISC Geo Tech & Tools Product Launch event in London so we thought it was time we updated you on some of the follow-up activities…

On the second day of the JISC Geospatial Event in London, we had two sessions to gather around tables (and/or move between them) and discuss some questions around the themes emerging from the JISC Geo projects. It followed on from the previous day’s thoughts “to figure out which products are going to help catalyse the spatial revolution in .AC.UKs”, but this session involved discussion that looked wider than the presented products.

In session 1 discussions included:

For session 2, themes running through the projects and six stages/ways of working with data were identified and discussed.

We would love to hear your thoughts on the discussions – leave your comments on any of the blog posts linked to here or add your comments on any of these topics here. If you were part of these discussions and think an important point didn’t get noted down, do add it as a comment as well.

We will be sharing more materials from the JISC Geo End of Programme events early in 2012 but in the meantime here are some video highlights – also available from our new Podcast stream [Click on subscribe via iTunes] – to enjoy:

JISC Geo Timelapse

View the whole of the first day of the JISC Geo in just 1 minute:  JISC_Geo_Launch_Event_Timelapse

Highlights from the JISC Geo Show & Tell 

Hear about the best projects at the Show & Tell events where all 12 JISC Geo projects showed off their work along with some guest exhibitors: Highlights from the JISC Geo Show & Tell


IGIBS Follow-on and Use of Underspend

It’s a bit early to be making predictions about how IGIBS might evolve, but a recent presentation to the EDINA geo team followed by some discussion indicated some of the possibilities.

  • The WMS Factory Tool.  With the simple but effective styling capability that Michael Koutroumpas engineered, I think we have a prototype that’s not too far off a production-strength tool.  There are loads of scenarios where it’s valuable to have access to a tool that makes it easy to see your “non-interoperable” data alongside the growing number of INSPIRE View Services (read WMS) from public authorities across Europe going online.  So top of my list is improving this tool’s styling capability.
  • Associated with this would be a better understanding of the necessary data publication infrastructure, eg, making it easy to use the other OGC Web Services.  Something like the GEOSS Service Factory ideas emerging from the EuroGEOSS project.  I think there is a real demand for tools that make it easy to use the OGC standards.
  • In the immediate future, I think it’s likely that the IGIBS team will do some promotion of the project outputs, eg:
    • presenting the project at relevant events, eg, Association GI Laboratories Europe conference, OGC Technical Committee meetings.  This might cost as little as £500 depending on where the event is.
    • use of social media to promote both the WMS Factory Tool and the report on “Best Practice Interaction with the UK Academic Spatial Data Infrastructure”.  This too could cost as little as an additional £500.
  • The latter report is worthy of a lot more investment.  A major output from this project, possibly the single most important output, is the increase in use of UK academic SDI services within the Institute of Geography and Earth Science (IGES) at Aberystwyth University.  IGES is acting as an exemplar for best practice research data management around geospatial data; the department is actively building on the IGIBS work and it will be interesting to see how it develops and whether other departments in other institutions see the benefit and start to emulate what Aberystwyth is doing.  More work promoting Steve Walsh’s report would help.
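To make the WMS discussion above a little more concrete, here is a minimal sketch of composing an OGC WMS 1.3.0 GetMap request, the kind of call a tool like the WMS Factory Tool generates behind the scenes. The server URL and layer name below are hypothetical, not real IGIBS or INSPIRE endpoints:

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width=512, height=512,
                     crs="EPSG:4326", fmt="image/png"):
    """Compose an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_x, min_y, max_x, max_y) in the given CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",  # empty string requests the server's default styling
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical INSPIRE View Service endpoint and layer name.
url = build_getmap_url("https://example.ac.uk/wms", "boundaries",
                       (-5.0, 49.8, 1.8, 55.9))
```

The point of a “factory” tool is that a researcher never has to assemble a request like this by hand: the tool publishes the data and hands back the service URL.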

JISC Geo Tools event – Recommendations

The STEEV project led the discussion on two tasks at the #jiscGEO breakout sessions (as part of the JISC Geo Tools launch on Nov. 28 & 29).

The first task at table 6 was to come up with a recommendation about how spatial and temporal analysis can enhance research. Using the example of digitised boundaries for spatio-temporal research, the group discussed the unavailability of historic content (bearing in mind the politically volatile nature of boundaries!). Discussion also centred on the new INSPIRE directive and how compliant spatial datasets must have a temporal component (i.e. a start date). Views on a variety of spatio-temporal analytical approaches and utilities were exchanged – this led to the formulation of two recommendations, namely:

  • An audit is required of spatio-temporal tools, utilities, procedures and techniques used within a research space
  • For the purposes of exchange and integration, the creation of an Open Geospatial Consortium (OGC) XML-based spatio-temporal data standard

Many thanks to Mia Ridge (Open University), Neil Jakeman (King’s College London), Andrew Bradley (University of Leicester), Tom Ensom (UK Data Archive, University of Essex), Richard Fry (University of Glamorgan), Scott Orford (Cardiff University), Andrew Newton (University of Huddersfield), Jasper Tredgold (ILRT, University of Bristol), Kate Byrne (University of Edinburgh) for articulating said recommendations.

The second task at table 5 was to discuss and come up with recommendations about how to fully exploit spatial analysis within research.

Much of this discussion concentrated on spatial literacy as a means to both prepare and engage the student or researcher considering undertaking spatial analysis in an educational setting. The ubiquity of modern web mapping utilities, geo-tools and open geo-browsers means that it is easy to represent a whole range of data spatially. However, whether the representation is accurate, makes sense or is reliable is another matter. Thus, in order to ensure that spatial analysis is robust, can bear scrutiny, and is accurate and understandable, the group came up with the following recommendations, namely:

  • The establishment of a (JISC) spatial interest group (comprising a whole range of stakeholders) that can advise and critique on spatial analysis methods, applications, documentation, open materials and courses, and provide expertise. This may be national in remit.
  • To scope an ‘analytical framework’ robust enough to be cross-disciplinary, which would make spatial representation explicit for the purposes of interpretation, and make context explicit, be it physical, social or temporal. In addition, this framework should be adaptable to work from the generic to the domain-specific research scenario, use non-technical language and be critical in its approach, including both positive and negative case studies.

Many thanks to Martha LeGess (LaMa studio), Chris Bailey (ILRT, University of Bristol), Patricia Carbajales (Stanford University), Conor Smyth (EDINA, University of Edinburgh), Alexander Hirschfield (University of Huddersfield) for articulating said recommendations.

Space and Time in the Digital Humanities Workshop, hosted by NeDiMAH and JISC LiveBlog

Today we are liveblogging from the Space and Time in the Digital Humanities Workshop, hosted by NeDiMAH and JISC which is taking place in London and follows on (in the same venue) from the JISC Geo meeting earlier in the week. As usual this is a liveblog so all of the same caveats as normal apply about errors and omissions. The hashtag for today is: #spacetimewg

Leif Isaksen (also of the PELAGIOS project) is introducing us to the day by saying that Greenwich is the place where space and particularly time is measured from. If you go out into Greenwich you will see a big laser in the sky and that’s the Greenwich Meridian. And if you look at Ptolemy’s Greek Parallels Intercept you will see that London is also marked there. Ptolemy’s regular grid was the first to start looking at time in

NeDiMAH is a newly funded network for Digital Methods in Humanities and Arts from the European Science Foundation with objectives to create a map visualizing the use of digital research across Europe; an ontology of digital research methods; and a collaborative interactive online forum for the European community of practitioners active in the area. There are also a number of working groups.

Today’s event is arranged by Working Group 1: Space & Time, coordinated by Jens Andresen, Shawn Day, Leif Isaksen, Eero Hyvonen and Eetu Makela. There will be four workshops over four years and you can find out more about these on our new website:

The format for today will be four sessions on place, period, event, summary. We will have 30 minutes of position papers, then 30 minutes group discussions then 30 minutes of general discussion for each topic.

Panel Session on ‘Places’

What are Places? – Humphrey Southall, University of Portsmouth/Great British Historical GIS Project

My basic position paper is this slide: a table of different kinds of geographical entity and the role of gazetteers.

There’s a certain way in which places coincide with geographic features. In London a lot of our places have names like Royal Standard, Sun in the Sands, etc. So where does that come from? Well, initially it’s a pub. What else is it? It’s in some sense a roundabout, a rotary. And again it’s a place, a place on bus timetables. And it’s a conservation area – a named polygon with a clearly defined boundary.

My second example is the Nag’s Head in Islington. A pub initially. Now an amusement arcade. But in Wikipedia the place is still there even without the pub. It’s a bus timetable location again. It’s also a town centre area: it’s a bounded polygon.

Of all these types of places the Elephant and Castle is the best known example but there are very many.

So if we look at another example. So we look at a map from the Guardian earlier this year of England’s most deprived areas. These are output areas from research though they are not
Jaywick (in Essex) was found to be the most deprived area in Britain. A discussion broke out in the comments about the second most deprived place, Breckfield. A commenter says that these areas do not exist; he is rebuffed by another who gives evidence: it’s a pub, it’s a centre, etc. It’s about a whole set of features.

Linda Hill writes about place using the example of Gruinard. I disagree. It’s a series of features, but in this example they are geographical rather than social features. Historically this is how place is defined. Groome, for instance, describes Gruinard as “a bay, an island, a stream” etc. So we need to differentiate between geographical features and administrative areas or places. If we are interested in history and cultural research it is about administrative units, not geographic features.

From Place to Facts – Franco Niccoletti

Let’s go back to place. This is a very ambiguous term. It is an “extent in space (in the pure sense of physics)” and/or an abstract concept. We are not so interested in places but we are interested in facts and stories. We are interested in seeing which facts, objects and stories happen in a particular place. Relationships between places, between events, between objects are important to us.

There are some features and challenges of “place”. There is some fuzziness. Sometimes it is difficult to draw the borders of some place. Spatial entities, objects occupying space and located in places are affected by spatial relations – mereology, topology, is the place where an object is located part of it?

The expansion of the concept of place, and of the concept of appellation, to facts is important here. A place X is identified by an appellation a(X) – which gives you an absolute reference, relative to an overarching system, or a relative reference, relative to some local system, and most commonly by place-name. We reason on appellations. We need to relate a(X) with a(Y) etc. to relate facts – what happened in the same place. However, appellations have their own issues. They are imprecise. They are time-dependent – the place may change or the name of the place may change over time. And they are also space-dependent, as there may be different appellations for the same place or the same appellation for different places. They are language- and culturally-biased. So is the fuzziness in appellations or in places?

And finally we have Gazetteers and Thesauri. We are all familiar with these. Gazetteers are lists of appellations that try to normalize them, i.e. referring appellations to one another or to some reference appellation system (coordinates). But gazetteers do not take into account space, time or cultural variability. And does normalising appellations influence our concept of places? Is there any way of dealing with the needed extensions?
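The time-dependence problem described above can be sketched as a toy gazetteer that resolves an appellation a(X) only within the interval in which the name applied. The names, dates and coordinates below are purely illustrative, not data from any real gazetteer:

```python
# A toy time-aware gazetteer: each appellation maps to a reference
# location plus the (start, end) interval in which the name applied.
GAZETTEER = {
    "Constantinople": {"loc": (41.01, 28.96), "valid": (330, 1930)},
    "Istanbul":       {"loc": (41.01, 28.96), "valid": (1930, 9999)},
}

def resolve(appellation, year):
    """Return the location an appellation referred to in a given year,
    or None if the name was not in use then."""
    entry = GAZETTEER.get(appellation)
    if entry is None:
        return None
    start, end = entry["valid"]
    return entry["loc"] if start <= year < end else None

def same_place(a, b, year):
    """Two appellations denote the same place at a given time if both
    resolve then, and to the same reference location."""
    return resolve(a, year) is not None and resolve(a, year) == resolve(b, year)
```

Even this toy version shows the design question raised in the talk: normalising everything to coordinates makes the lookup easy, but quietly erases the cultural and temporal variability the appellations carried.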

Place Reference Systems – Simon Scheider

Simon just finished his PhD at the University of Münster, where I am based in the Semantic Interoperability Lab. We want to make information work together in terms of syntax and semantics. I will talk about an idea I had with Krzysztof Janowicz about place reference systems. I will talk about what reference systems are to explain what a place reference system could be.

Many of you will be aware of semantic technology, ontologies etc. Our opinion is that ontologies are very useful to do this in a certain way as they constrain options. But they have one serious problem, which is that they do not account for the problem of reference. In philosophy this is about how to explain what symbols stand for. Reference systems, in contrast, account for this problem of reference in a practical way. Spatial reference systems do this well. There are formal theories – Cartesian coordinates is a formal theory in a certain way. The primitives in the theory, the coordinates, are fixed by convention. There are certain operations that allow you to find a location from a certain location. There are other reference systems – calendars are temporal reference systems. We can re-identify those points in time using these reference systems. So how do we in general invent these reference systems? And a place reference system is desperately needed.

Places are not the same thing as locations. Every place has a location but they are not identical concepts. This is very clear from the previous speakers…

Comment from Humphrey: But not all places have a location…

Well, where is the place of medicine in the Battle of Trafalgar? Is it the location on the battlefield where treatment took place? Or is it, say, the HMS Victory – but that is, at this time, in Portsmouth. It is a tricky issue.

So, the question for us is how do we generate these place reference systems.

There is much more to say on this – I wrote my PhD on this – but there is also a paper I would direct you to here: Place as media of containment by Simon Scheider & Krzysztof Janowicz [PDF can be downloaded here]

We are now breaking into discussion groups on Place.

The groups are reporting back:

Group 1 discussed the ideas of concept-based systems of place. And also discussed patterns of movement and how that relates to these concepts. Behind all of that was the issue of addressing the needs of the specific user. It is more about customising the user interface to the user’s specific needs. That was the overall subject we were circling around.

Comments from speakers:

Simon: On the user-centered view. If you look at the issue of ontologies and semantics the user is always important. You should never leave this out of an ontology or use of reference systems. One may come up with different ontologies, different reference systems depending on the use of these systems.

Humphrey: In some ways what was key to my presentation was the idea of places in consciousness and discourse. Consciousness is very individual and not so helpful. Discourse is about sharing with others. So we cannot focus too narrowly on users, you need to focus on communities of use perhaps.

Franco: I may either totally agree or totally disagree with the idea of users! Which users do you mean? Users of today? Users of tomorrow? Goals are perhaps a better idea: what is the purpose of using this place, what do we want to achieve? Users can prioritise current use unhelpfully. We want to think about intended use, the community of use, and we can use the shorthand of users. If on the contrary we mean let’s investigate place and time for archeologists, say, then I would totally disagree.

Shawn Day: These are some great issues to raise! It’s really important to think these things through.

Laura: Thinking too much about uses can be problematic. Think about travelling, say. We think about a user travelling… if I’m a younger person I want discos and pubs; if I’m travelling with children, maybe I need hotels… these variant needs are important.

Group 2: We felt that the context is really important. Context – global vs local for instance is important. Scale is very important. And in terms of users we need to think about the information we want to provide. The problem is how to present that information.

Comments from speakers:

Simon: To address the user issue further. There is a difference between user centeredness and goal centeredness. We all have goals and we can share them of course. We can create ontologies that are widely usable. Goals and objectives can be shared.

Humphrey: The comment of talking about problems and not solutions… I was at a three-day workshop in Seattle not unlike today. The problem was that that meeting does not seem to be leading to anything (other than a book). Perhaps a third of the people there agreed that linked data gazetteers were the way forward; the others didn’t know what to do with it. The PELAGIOS workshops do show the way forward in this area. There is work going on that needs some expository stuff.

Eero Hyvonen: I’m a computer scientist and from that perspective I want to give you some use cases. We have ontologies available but what we are looking for from those doing cultural heritage side: what are the problems you want to solve? Like those use cases about travelling etc. When we have those goals, those use cases, we can find issues and use those to find appropriate methods.

Leif: That’s certainly something we can talk about later today.

Group 3 discussed whether we are looking for a global solution or some local solutions to a problem. For instance, with archeological data structures of local grids, referencing local reference systems to a universal reference system on a use-case or type basis might be better than trying to create everything universally. So if you have a book, perhaps it uses a book system that refers back to a global system – maybe a way to deal with groups of things rather than a universal system.

Simon: That’s a very concrete problem. This is a problem we have in archeology, and also in history etc. We have local, very hard to understand systems. We need to understand and translate them back to other systems to understand things. We need to think about what would be in that general reference system to solve this. A solution should also be triggered by practical questions and this is a very good one.

Humphrey: Is this about spatial coordinate systems or something else? If you start from an archeological system you would think of location as fundamental but names as much more fuzzy. If you define things by a name but without a location, that can still be quite a solid thing. For instance, take the example of Camelot. No-one knows where it is but it is a very clear, very concrete entity. And it is a geographic entity. But we do not know the location, and in history this type of entity is not unusual.

Franco: Well, possibly Camelot is not a historical place, but all the same my presentation was about place as where things happen rather than location, and Camelot would fit into this system. Space and place are strongly linked ideas, but on the other hand using the same framework for very different content leads to very poor information. So for example libraries – general libraries are valuable but specialist libraries are also essential.

Comment: What is most useful gets used the most. Cross referencing systems can be dangerous in some sense. There is a Darwinian element here – what happens with systems that are not as useful.

Simon: Use cases are helpful. You can start doing something, see how it is used and understand the use case that way.

Group 4: We ran a little out of time discussing the issues. One thing that is worthwhile to add: we talked a lot about pragmatic approaches. We discussed that place is much more a social construct than a geographical thing, so how do you establish equivalence? And when does it not matter to have equivalence? We also thought about PELAGIOS’ approach – mapping systems against a baseline gazetteer. People can annotate their own data then find connections between data. Solving the common denominator issue enables you to achieve interlinking.

Humphrey: Again I’m not sure what our collective baseline is. In some ways my presentation was a grossly simplified critique of gazetteers. A lot of national digital gazetteer providers are very much thinking about features, but time creates all sorts of muddles. Wikipedia and DBPedia have a lot of knowledge in them but they are scary in terms of features. I’m not sure GeoNames is much better.

Shawn: We have three real issues we really need to address. A variety of case studies is a good idea, but how do we deal with the unknown user and keep this as strong as we want it to be? And how do we deal with abstracting, and not abstracting too far? There is a practical way of doing this, of thinking of which questions we want to answer. The Stanford folks for instance have been talking about how to deal with users that need simple tools to work with, who need to quickly understand the issues. People have been grappling with this for centuries so I don’t think we will solve this today but we will have some great discussion.

Panel Session on ‘Periods’

History in Context – Ceri Binding, Hypermedia Research Unit, University of Glamorgan

Although we have three distinct sessions today it’s impossible to entirely separate these concepts. Objects connect to events, events connect to places and they connect to periods.

Simple attribute assignments contain a lot of complexity and implicit semantics and lack flexibility. We need to be able to document the statements we are making and the provenance of those statements.
We have been using the CIDOC Conceptual Reference Model to create an event-based model rather than just attaching a date to an object.

Periodization lets us subdivide time and chronology lets us order and understand events. We also want to classify a period in some way – monarchies, style, etc.

In early periodization and chronology we have Eratosthenes, a 3rd century BC Greek scholar who established the first Chronographia of Greek history. Ussher’s Chronology looked at ordering events. His work was rather overshadowed by the fact that he included – as others at this time did – an exact date and time for the “creation”: 23rd October 4004 BC, at noon. That sort of discredited his work but it can still be useful. If you look at a passage from the Common English Bible – Luke 3 (referring to John the Baptist) – it gives a reference to a date in the rulership of a Roman emperor that we can cross-reference with Roman record keeping to anchor events in a particular time.

So when we model periods with CIDOC CRM we do not need to fix the exact time span but we can connect relative timespans. We can assign attributes to an event that helps us explain where this assertion is made – multiple people can make such assertions and give multiple and conflicting definitions for a particular period of time. It’s important to have that multiple vocality in time periods.

We also need to understand period relationships – A is before B and B is before C, say. Putting periods in relative order is more important than having exact dates attached. So for instance we took English Heritage’s SKOS concepts and connected them to CIDOC CRM entities to build up conceptual entities.
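The idea that relative order matters more than exact dates can be sketched in code: given only “A is before B” assertions, a topological sort recovers a consistent ordering of periods without any calendar dates at all. The period labels below are illustrative, not English Heritage’s actual vocabulary:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Relative "earlier is before later" assertions, with no fixed dates.
before = [
    ("Iron Age", "Roman"),
    ("Roman", "Early Medieval"),
    ("Early Medieval", "Medieval"),
]

# Build the graph TopologicalSorter expects: each period maps to the
# set of periods asserted to come before it.
preds = {}
for earlier, later in before:
    preds.setdefault(earlier, set())
    preds.setdefault(later, set()).add(earlier)

# static_order() yields periods with all predecessors first; it raises
# CycleError if the assertions contradict each other.
order = list(TopologicalSorter(preds).static_order())
```

Conflicting assertions from different authorities would surface here as a cycle, which is exactly the kind of multi-vocality the CIDOC CRM approach records rather than suppresses.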

We have also created a simple tool for looking at dates and time periods available at which we showed at the first PELAGIOS meeting earlier this year.

[APOLOGIES. Our Wifi connection died here and notes from the second speaker were lost. We will try to reconstruct these later today.]

Glauco Mantegari, University of Milano-Bicocca, Italy

[Notes destroyed by wifi issue so will follow]

Use of Periods in British Museum Documentation – Jonathan Whitson-Cloud

Why would we have a thesaurus? Well, the British Museum’s purpose is world peace – the concept was that better understanding leads to equivalence and peace.

We have a more pragmatic set of reasons for needing a thesaurus. We have 1449 terms but not everything fits together perfectly. We have all the usual parts of thesaurus terms. We try not to use related terms for periods. We only use the period/culture field to indicate production period. All of our use is very object orientated. We do not have associations to other references. We have lots of ways to record uncertainty and fuzzy periods.

We don’t call it period but “material culture”. We really think of it as a cultural label rather than a time or place – we record that elsewhere. This information always interacts with other things. We do not include date but most commonly we add context through material, authority/regnal dates, production/person(s), school, state, ethnic group, find-spot, associated name or place or event. And some departments refuse to use periods at all.

Period is always part of a wider set, it always interacts with other information on the page.

Inference can be an issue – it’s always appealing to fill in as much information as possible. If a definition of a period changes you have to update lots of records. So we allow conflict. We have periods and we have dates and if they clash we allow that, it’s the reality of what we currently know.

What we like about periods is that they are conceptually simple – very good for a lot of our audiences.

We want our thesaurus to speak to other thesauri. We have made our British Museum data available as a SPARQL endpoint (here: ). The thesaurus could be extracted or referred to from that. We include Periods, which are freely shared upon request, and you could embed references (URIs) to other thesauri in BM data etc. And we are keen to engage with projects like PELAGIOS.
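For readers unfamiliar with SPARQL endpoints, here is a sketch of composing (not executing) a SPARQL protocol GET request for period concepts. The endpoint URL is a placeholder, and the SKOS graph shape is an assumption for illustration, not the British Museum’s actual schema:

```python
from urllib.parse import urlencode

# SKOS-style query for concepts and their labels; illustrative only.
QUERY = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
  ?concept a skos:Concept ;
           skos:prefLabel ?label .
} LIMIT 10
"""

def sparql_request_url(endpoint, query):
    """Build the GET URL for a SPARQL protocol query (not executed here)."""
    return endpoint + "?" + urlencode({"query": query})

# Hypothetical endpoint standing in for the one linked above.
url = sparql_request_url("https://example.org/sparql", QUERY)
```

Sending that URL with any HTTP client would return result bindings; the value of publishing the thesaurus this way is that other projects can reference the concept URIs directly rather than copying labels.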

And we are now discussing those presentations in our groups (and having some lunch).

Group 1 had some questions: how can you tell that two time periods are alike? Use cultural artefacts and cultural opinion – combining three approaches. How about using temporal techniques to define place? We also talked about simplifying models, and how you can keep rich, complex information but document it pragmatically.

Group 2: it is clear that objects are important. And objects connect to taxonomies and their own systems of organising. Context is crucial, both the object context and the museum context. These are concepts about grouping and differentiating between objects and time concepts. Some data is fairly fixed – archeological layers etc. We also talked about the user dimension. Library catalogues can be useful for users but museum catalogues have not generally been designed for use by users in the same way.

Group 3: Glauco talked about the vagueness of different granularities. We talked about different scales of ontologies etc. this morning. Is there a system to define the granularity of your ontology or your reference system? So that we can make better sense of granularity. And touching on Jonathan’s talk, terminology is crucial. Just identifying key terminologies, taxonomies and ontologies in a given domain is critical. You need to be able to plug in some sort of controlled vocabulary or ontology etc. For me a big win for developing thesauri in RDF would be to find ways to cross-reference them. We know that there is ambiguity but that’s where the Semantic Web vision does start to help understand those relationships and articulate the various meanings and systems in use. But I’m still struggling with where we connect spatial and temporal together. We can talk about the Romans for instance. We have a period but that’s only relevant to a certain cultural history in England. It means different things to people from other parts of Europe, say. We need better spatial boundaries on periods for this sort of reason.

??: We had a little chat after the panel session and we raised the issue of communality, communal understanding of concepts. And also about facilitating different types of users to make use of it. And on the technological side the claim was that this is far more advanced on the spatial side. But the panel table felt like the spatial community had some way to go on the conceptual level. So, comments on communality…

Jonathan: Communality would be nice but I think it is more likely that we will share some things and define some things differently. And that should be fine. We need to assert useful knowledge. It’s useful to think of ourselves as a community. But we need to be aware of where we specialise, both as humans and in terms of data.

Ceri: I think a baseline about a period, about a place – I think that is important to understanding, trusting and using data. An Esperanto for our data here. Without that there is less chance of using data beyond what it was originally intended for. Interoperability is the promise of the semantic web, but to get out of our silos we need to find those ways to trust and share. In terms of looking at periods and whether two are the same, we need assertions and meaning, not just labels, when we compare these things.

Glauco: I agree with you. Of course the definition of periods, and even the more general concept of time, is very difficult to agree on. On a practical level, thinking through the practices of some specific domains, we do have some basic common assumptions on what a period is, on what an event is. Perhaps it’s not formalised as models. But archeologists for example have a long tradition of defining that concept. Of course this new technology makes it possible to take care of and represent some possible differences in meanings, and eventually to let machines automatically understand different approaches. But I think too that basic agreement on concepts is needed, otherwise it becomes very difficult to do anything. Thinking of ontologies, triples, RDF, anything – you need attribution, and these new technologies can help us to show the provenance and attributions for what you are asserting. This is a core concept in semantic web use anyway. If we have completely integrated information we need to know how to connect information, how to understand the reliability of some data versus another.

??: So we have an issue here. We have authority as a way to get communal understanding. Is this an accepted road to communal understanding?

Comment: Wikipedia has given us a new idea of community and authority. Encyclopaedia Britannica was top down but Wikipedia is about the community expanding and improving knowledge from the bottom up.

Glauco: Well, Wikipedia as a model is not in contradiction to the approaches discussed. Perhaps it is useful to consider periodisation beyond the community of experts. Often they have a very specialised view, but general users are also interested in this information and provide different views on the same thing. This is important to interfaces and how users perceive and use these terms in actual systems.

Comment: For example, the film review site Rotten Tomatoes has a top critics rating, a general critics rating and a user rating. And those might conflict heavily. One might argue that perceptions of some cultural periods might also share that conflict of opinions.

Ceri: Some of that social software is very interesting and offers good opportunities. What I want is more than one point of view. Wikipedia doesn’t really solve that – where there is controversy the page represents compromise rather than multiple voices – but yes, it would be good to have new voices.

Jonathan: There is a kind of race between Google and Wikipedia to deliver this kind of thing. Google are trying to make a semantic search engine. They have quality but not quantity, but they are working on it. The tools will come. More than one view is great though. Linked data really allows that…

Comment: You talked about attribution and provenance earlier. The way people use that data requires that provenance, especially if the broader community is contributing here. I am interested in how we keep attribution metadata even when data is mashed up into new services. How do we make sure that attribution is retained at those end points?

Leif: It’s a bit off-topic… I know that attribution is very desirable. But I wonder whether when you take information and create secondary products you need to always attribute – the difference between a coffee table book and scholarly materials is significant in terms of attribution and citations. But you must make it easy to attribute data.

Eero: Google bought Freebase a year ago, and they are implementing it at the heart of the search engine and making use of it. One main difference versus DBpedia: Freebase is also created by the public but selected by editors, although volunteers; the main aim is to make it correct, with editors always trying to keep it correct. Interesting to note. And another point to note: they do not use a triple store in Freebase but a quad store, and that is to account for attribution, to help with quality.
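Eero’s quad-store point can be sketched in a few lines: adding a fourth element to each statement lets the store keep the source of every assertion, so conflicting claims can coexist and be weighed by provenance. This is a minimal illustration in plain Python with invented data, not Freebase’s actual model:

```python
# A triple records a bare fact; a quad adds a fourth element naming the
# source, so provenance travels with the assertion.
# (Illustrative sketch only -- real systems use RDF named graphs/quads.)

triple = ("Rome", "foundedIn", "753 BC")

# The same fact asserted by two contributors: both claims can coexist,
# each tied to its source, rather than one silently overwriting the other.
quads = [
    ("Rome", "foundedIn", "753 BC", "editor:varro"),
    ("Rome", "foundedIn", "c. 750 BC", "user:12345"),
]

def assertions_about(subject, predicate, quads):
    """Return every (value, source) pair asserted for a given fact."""
    return [(o, src) for s, p, o, src in quads
            if s == subject and p == predicate]

print(assertions_about("Rome", "foundedIn", quads))
# Both values survive, each with its attribution.
```

This is exactly the property the discussion asks for: a secondary service that consumes the quads can always report who asserted what.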

Nicola: There is a cultural consideration around attribution. One has to be careful not to use attribution in such a way that it implies credibility for a secondary use of the data. That is an issue in some other scholarly data contexts. If the original source data author is visibly credited it could imply endorsement, and that could create issues around sharing data.

Simon: I’m not sure that authority is the solution to this. It may help keep work to a certain standard but it does not address the heterogeneity of the data.

Panel Session on ‘Events’

Eero is introducing this session by saying that events are even more complicated than places and periods in some ways. There are so many aspects to the human sense of events, and from an interoperability point of view they are very complex to understand. So we have three different perspectives here.

Once Upon a Time: Space, time and event in modern storytelling – Laura Spinsanti, Spatial Data Infrastructure Unit, Institute for Environment and Sustainability, Joint Research Centre, European Commission

I’m not from the historical domain so I apologise in advance if I say anything silly about the past. So if we take “Once upon a time in a little cottage in the forest there was a little girl named Andrea…” we have a time and a place there. We then read about a dragon in the forest – something interesting is going on. This is an event.

We have all sorts of new ways to tell stories. We have microblogging – people write stories of their everyday lives on Twitter. So for instance we have a story, a time and some indication of place with these brief stories.

Hic sunt dracones – Here be Dragons. From space to place and back – ancient maps tell a story about important places (church, castle, coaching inns) and dangerous places (the mermaid). GIS describes a static reality – now it is perhaps more dynamic, but it is a reality far away from what people can use, apart perhaps from scientists. And then we have neo-geography – usable geographic information to describe reality. In some ways this takes us back to place and the activities we are doing in a place. A map can tell me about hills and rivers on a Leonardo da Vinci map; if I look at a proper modern scientific geographic map I need lots of information and skills to read it. And meanwhile, through Twitter and other social mapping, people are creating these neo-geographies for themselves.

Looking at time. Time is not subjective – there are clocks everywhere – and yet time is no longer beaten out by events; we live globally and there are many events at the same time. Time is also our modern obsession. The promise of neo-geography is the idea that we can update our sense of place over time.

In a dictionary events are defined as “something that occurs in a certain place during a particular interval of time” but we are talking about the observed world and we are therefore talking about something important when we see something as an event.

And we have various sensing tools – EO sensors, VGI sensing which is social and participatory (and problematic). These sensors create huge amounts of data – in our VGI project we collect from 6,000 to 30,000 tweets per hour, using NoSQL DBs, the cloud, distributed computing. We want to mine that data, and we want the context and semantics around this data. And we have to deal with concerns about the importance of the event for the community – when we use data from social media we have a partial snapshot of that community and therefore a partial view of the importance of and activity around that event.

So, in conclusion…

Is history written by the victors? Well, it is now more participatory, social, and more gender balanced perhaps – if only in terms of perspective, from the bottom up.

But there are lots of challenges here. Imprecise, vague and fuzzy methods are needed to use these new data sets. Time-varying information needs a standard time representation; big data and scalability are a new scientific challenge; and there is credibility – authority versus community. People are talking about areas they understand very well, so they have and bring lots of their own context.

Comment: I have a question… I kept thinking that you would show us the character in your story. Why not? In normal narrative terms you would focus on the character.

Laura: I focus on space, time, and event but I could have…

Deducing Event Chronology from Narrative – Oyvind Eide

This builds on Holmen and Ore’s calculation work. We looked at documentation dated 1660 about a church being built, another dated 1690 about it being built, an account from 1711 of the construction taking at least 6 years, and then an account from 1984 saying that a coin from a particular reign was found in the foundations. The idea is to reduce uncertainty. That’s fine for time…
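The reduction idea described here can be sketched as interval intersection: each document contributes a constraint on a year, and intersecting the constraints shrinks the range of possibilities. A minimal sketch in Python, with invented dates rather than the actual sources Eide discussed:

```python
# Sketch of the interval-narrowing idea behind Holmen/Ore-style
# chronology reasoning: each document constrains when the church could
# have been completed, and intersecting constraints shrinks the
# "possibility room". All dates are illustrative.

def intersect(a, b):
    """Intersect two (earliest, latest) year intervals."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("contradictory evidence")
    return (lo, hi)

# Start with total ignorance about the completion year.
completed = (-10000, 10000)

# A coin minted from 1611 found in the foundations: work cannot have
# finished before the coin existed.
completed = intersect(completed, (1611, 10000))

# A document of 1660 mentions the finished church: completion <= 1660.
completed = intersect(completed, (-10000, 1660))

print(completed)  # -> (1611, 1660)
```

Each new source either narrows the interval or raises an error, which is itself useful: a contradiction flags that at least one source (or one reading of it) is wrong.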

Can we make a similar tool for spatial analysis? It is more complicated to move this into the spatial dimension. If you know something takes place in a larger area and that there are broader bounds for related events, there may also be ways to reduce uncertainty. Based on my PhD project, where I am modelling verbal and map-based expressions of geographical information, I am looking at what is there in a textual description. You have to allow leeway when connecting points – when you look at, say, point A being a mile south of point B, and so on – and your possibility room gets worse and worse as you expand the description. Is it possible to build a geographic dimension like that and actually reduce it down, making sense of the places by seeing that certain possibilities are not possible?

Narratology and how events are described – e.g. Bakhtin and his notion of the Chronotope – can help us understand the temporal and spatial relationships that are artistically expressed in literature. Maybe here we can understand where space time and narrative meet.

Nicola: How do you deal with the fact that the narratives you are comparing may actually be based upon each other – successive accounts building on those before. If you use those to verify each other that will surely be an issue in using this approach?

Oyvind Eide: It is important in the use of the system, but I’m not sure about in the design of the system. When archaeologists look at sources to evaluate a system, understanding the sources is very important in doing that.

Comment: Have you done anything with region connection calculus?

Simon: There are people working on spatial information and spatial relationships and they are trying to come up with a theory on this.

Laura: You can try to use context information. If there is a description of a building, perhaps the building cannot be constructed in just any place. Perhaps there is a building on a river, so you can exclude some possible locations, say.

Oyvind Eide: For various reasons I try not to use pre-existing maps, as many problems would be solved too quickly and I want to understand the uncertainty here.

Eero: I think the reasoning stuff is very important here. When we know that counties split and merge but not the population or coverage it is possible to reason with these sorts of approaches.

What is an event? – Ryan Shaw

I am an information scientist working with lots of historians, but a different sort of historian than most of those here – historians of the recent twentieth century looking at radicalism and civil rights. I guess I see a difference between scheduled events and historical events, which are more retrospectively defined.

So I think we’ve actually gotten pretty good at modelling space and time, and then we abstract these. But when we talk about space and time we actually want to model events and their possible relations. So what is an event shaped like? According to Wikipedia events are shaped like a box – this is the source information for DBpedia, Freebase etc. So we have this neat box with labels, participants, location, dates, etc. This is better than nothing, but I think we can do better. Events are not necessarily blocks… maybe there is a Tetris form of events, the slotting together of many events.

Events do not have a specific shape; they shift, and they are this mix that Humphrey talks about of consciousness and discourse. Our consciousness divides experience and stories into events in a way that is to some extent culturally independent. But that same psychology is broken up further by language. Watching a movie, playing a game – those also trigger that breaking up of events, as if we took part in these experiences.

The event models people formulate in their minds follow dimensions of time, space, protagonists, causality and intentionality (Zwaan, Langston…), and how you store those events in your mind shifts depending on how you read a narrative. Many of these experiments have been done with simplified texts of fairy stories. But looking at more complex events at any point of history, you can tell a story at different granular levels. A story in days may not fill in the detail of an account of an event by year. But there is a relationship between levels – the story of the 17th century can be told in centuries or in decades, say.

There is an interesting relationship with place here. If we were planning a trip to the west coast we might say that we should visit San Francisco. But your itinerary between two different trips to the west coast may not coincide at all. So a Flickr map of San Francisco shows that tourists take totally different images to residents – there are two San Franciscos. In fact there are thousands. In events the same is true – there are thousands of different Arab Springs. There are a number of ways that each story constructs the same events differently.

So another example here is the Neighborhood Project by Matt Chisholm & Ross Cohen. Some stories have common paths and key tracks or recommended routes around which an event occurs. So we can see a clear block-like identity, but that is built up over time rather than being inherently true. Historians often have clear information about when things happen, but they are often interested in disrupting that clear path of blocks. For example, if you see a review of Bloodlands by Timothy Snyder there is consideration of how one can see a broader view of Europe between Hitler and Stalin. When we build those structured paths and chronologies into our infrastructure for teaching history, we abstract those events.

What we are striving for is models of events where we can move between different granularities of an event: through a shared level, then a more nuanced view of a consensus pattern (e.g. the British WWII, the Japanese WWII), then at the next level the individual narrative, to compare different events. And what is interesting is how we can extract shared labels from these individual narratives.

Question: To what extent are you abstracting the spatial out of your definition of events?

Ryan: I think that you can think about those different levels. At the shared label level of World War II it is near impossible to make that terribly spatial. But at the individual narrative level it is going to be much more specific in terms of place. So there is a trade-off between richly modelling events by location and protagonists, and abstracting away to the level of labels etc.

Simon: I think the granularity is the issue in terms of space as well as times. So is your approach a practical solution to model events?

Ryan: I can make it a little more concrete for the use case I am interested in, the history of the civil rights movement. So I have accounts of the movement. You can see the evolution over time of the scholarly account of the movement, and you can see how different sorts of individuals record the movement. I am keen to identify local events and then aggregate ways of sharing models, shared events, etc.

And we are now going on to discuss this panel over coffee…

OK we have returned refreshed… the final portion of the day is:

Open Forum Discussion on Space Time – Leif Isaksen is chairing this

The hope is to identify common themes. To ask about anything important we’ve missed. And to discuss how NeDiMAH takes forward ontology here.


Methods and Technologies and Infrastructure are what we want to think about first. Both current best practice and current tools. And also what are the things that we need or could improve?

Methods – Current

Georeferencing is a good way to look at place. The gazetteer is a footprint and is based on spatial reference, but there needs to be an independent place reference system. That’s a theoretical issue.

There are a lot of things going on in building digital gazetteers. But these tend to be topographic mapping, or they are crowdsourced rather than built for use in historical projects. There is large potential for retro-conversion of scholarly work like the Survey of English Place-Names, spatial authority lists etc., but those are expensive to do. A good gazetteer is a big gazetteer, yet getting up to big, properly backgrounded content is expensive and difficult. We need to consider that size isn’t everything. And we need to retro-convert historical and scholarly materials.

There are issues around clarity and IPR.

We need a vocabulary to link places to other places. We need other techniques here, not just gazetteers. A place ontology.

We also need an alignment of KOS.

We have gazetteers, but there is more to ontology than gazetteers. We need a better formal theory and ontology of place.

Un-GIS – we have these concepts of locations and places that we can work with away from GIS.

We need a temporal GIS – a system that allows me to work with temporal boundaries or events as they change a place over time.
Response: there are some systems for these but they are not commercial and they do not deal with fuzziness or conceptual aspects. Secondo.
Comment back: I’m talking about something that does let me handle those concepts, those granularities.

If we are talking about software, we use PostgreSQL and PostGIS, and you can use that for all of this sort of data. It’s not a GIS but a relational database. A GIS is not the way to represent this stuff.
Response: you need something visual to work with that.
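The point that an ordinary relational database can hold and query this kind of data can be sketched with SQLite standing in for PostgreSQL/PostGIS (which add real geometry types and spatial indexes on top of the same idea). The table, names and dates below are invented for illustration:

```python
import sqlite3

# Minimal sketch: a plain relational table holding places with
# coordinates and a validity period, queried with an ordinary
# bounding-box filter -- no GIS required.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE place (
    name TEXT, lat REAL, lon REAL,
    valid_from INTEGER, valid_to INTEGER)""")
conn.executemany(
    "INSERT INTO place VALUES (?, ?, ?, ?, ?)",
    [("Londinium", 51.51, -0.09, 43, 410),
     ("London", 51.51, -0.12, 886, 9999)])

# Which names were in use inside this bounding box in the year 100?
rows = conn.execute(
    """SELECT name FROM place
       WHERE lat BETWEEN 51 AND 52 AND lon BETWEEN -1 AND 1
         AND valid_from <= 100 AND valid_to >= 100""").fetchall()
print(rows)  # -> [('Londinium',)]
```

As the response notes, a visual layer still has to be built on top; the database only answers the queries.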

I would recommend looking at existing gaming and modelling technologies. Gaming engines perhaps useful here.

We need high quality metadata from mapping agencies, and for that to exist across borders. We need vector maps and vector quality, especially for Scandinavia. And crucially, moving from name to place on a map.

The Finnish land survey is publishing everything openly, by the way.

There was talk of PostgreSQL and PostGIS – you can in fact build visual elements on those through interfaces, so a clear set of demands or requirements is the key part of making those technologies work here.

Validation of existing crowd sourced materials could be used to move towards a place of having reliable data that builds on that existing material.

Geo-parsing needs are very specific for historical materials, and the tools need improvement. There is such an important element of context in parsing historical materials, since places change over time; there is not only the need to create specific parsers for specific materials, but also to have a way to understand how a geoparser should handle a given placename at the time of that material’s creation.
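What date-aware geoparsing might look like can be illustrated with a toy example: a placename resolves against a gazetteer entry only when that name was current at the document’s date. The gazetteer contents and date ranges below are invented for illustration:

```python
# Toy date-aware gazetteer lookup: the same string resolves differently
# depending on when the document was written. Entries are illustrative.

gazetteer = {
    "Christiania": [
        {"modern": "Oslo", "from": 1624, "to": 1924},
    ],
    "Byzantium": [
        {"modern": "Istanbul", "from": -667, "to": 330},
    ],
}

def resolve(name, doc_year):
    """Return gazetteer entries whose name was current at doc_year."""
    return [e for e in gazetteer.get(name, [])
            if e["from"] <= doc_year <= e["to"]]

print(resolve("Christiania", 1850))  # the name was current: one match
print(resolve("Christiania", 1950))  # renamed by then: no match
```

A real historical geoparser would of course also need disambiguation, fuzzy matching and footprints, but the temporal filter is the part the comment above is asking for.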

Is there a need for temporal parsers as well?

We have frame semantic parsers.

We also have Freebase, DBpedia, GeoNames, Pleiades etc. available to make use of.

We need to develop or improve parsers.

We need event parsers.

We need improved event gazetteers.

We need a good event ontology.

We do have CIDOC CRM (and its -EH extension).

The Linked Library Data (W3C) resources have some real usefulness as well.

Also SKOS.

And… if anything else occurs to you, do email, contact or comment in the direction of NeDiMAH.

NeDiMAH is trying to think about formulating an ontology of methods. They did some work on this in Digital Humanities a few years ago, but DH has expanded significantly since then.

I’ve been finding it useful to think about a cookbook approach – here is a method, here’s what it is intended for, here are related methods, here’s an example, etc. And perhaps a way to see if that method is right for you or if you should look elsewhere. Not giving full information, but good pointers so that intelligent tourists can find out where to learn more.

Shawn: It’s a sensible process, but it is a long-term process. We’ve identified so many different approaches here. In a perfect world I’d love to send everyone home with homework. We are such a diverse group here – we may think that we are one group, but are we even speaking the same language? We have to see how people actually use these things so we can find the one big pain point that needs resolving, what we actually need and what we actually mean by each term. We use ambiguous terms. Even if we collected one pain point from each person here, with narrative and process about why that is the big pain, that would get us a good step down the line. That would be a great way to start to move the process forward.

Simon: So should everyone provide a use case?

Shawn: Yes, that would be fantastic. We will go through all the materials we are recording here today, but if people can go into that bit more detail and elicit what those needs are, that would be fantastic. If we can task that out to people, even if it is just one use case each, that would be great.

Comment: I think that it would be important to provide some form to fill in to help us to provide you with those use cases in a consistent way.

Comment: The cookbook approach sounds good. Each year the barriers to entry for this stuff get higher so I’m hoping for something more – step by step guides.

Leif: I am concerned about mission creep, about making things too vague to be useful. But we could look to provide a “You Will Need…” list to help explain the sorts of resources one will need to have on hand.

So… what will be happening next in this project? Well we will be writing our report for the end of January 2012. We will create some sort of wiki or forum to encourage people to contribute and comment. But we may ask you to target your knowledge and we certainly encourage you to engage in that process as much as you want.

We will also be planning our next workshop and will be in touch about that.

Finally, this is our very first workshop, so do you have any feedback about this event? Good or bad.

Comment: I really enjoyed all of the talks but now I know what you want to achieve it might have been useful to step away from the theoretical and look more at the pragmatic issues, the way in which issues are currently being addressed etc.

Yes, I understand that. You don’t want to be too pragmatic but we definitely take that on board.

Comment: There is always the “hundred flowers” type of issue. Is a balance needed between structured and unstructured?

Comment: Today wasn’t very structured and for a first event that feels right, perhaps the next one might be more structured.

Comment: It seems that there is no agreement on the spatial issue; theoretically time and space are the same thing. Can we be clearer about what we are talking about? Is there a case for making it event-centric, say? What’s the balance between finite and infinite here?

I think this is a contentious space to be honest, but maybe… is there consensus here? Perhaps there were issues in how we described the event today.

Comment: It’s terribly difficult to come up with a conception of space and time at this sort of workshop.

Comment: I agree… but if we don’t… who will? So for example if we treat them as two entities we treat them differently from treating them as one thing.

Comment: I think the way to broach that is to air the problems and find areas of commonality and shared issues and working on concepts to solve those problems. I don’t think that we want to chisel up the concept.

Do people feel it’s helped them personally in thinking through these issues and awareness raising today?

In terms of our report we will share and communicate our report and workshops will be on other issues of space and time – GIS, web mapping etc.

Eero: the next workshop will probably be in Hamburg at the DH conference. We can put the theme list in the call for papers, and if we have proceedings of that workshop then useful resources are already likely to come out of it.

Would you be interested in timelines, chronologies etc…

Comment: Well this is old news for me…

Shawn: But actually this is a disciplinary issue. So many new digital humanities people are entering this space and are new to this and we need to be able to give them some different expressions of these sorts of issues and ways into these areas.

How about GIS/Webmapping?

Comment: I don’t know, it would not be of interest to myself.

Comment: Generally about visualisation I probably struggle with that. I’d like to see a wide range of approaches. Specifically as it applies to space and time.

We do have another working group in this area so we don’t want to tread on toes… we need to balance what we do with the work the other groups are doing.

We will be putting out calls for papers, and communities will be brought together and you’ll hear about that as it moves forward.


Tomorrow is the Pelagios2 Hackfest (we’ll be liveblogging this). The idea is to explore open resources related to history, culture and heritage, using geography as a point of reference. Pelagios2 is based around the ancient world, but actually the day is broader than that. We’ll have tech specialists and domain specialists, and we’ll be coming up with quick wins and pain points in interlinking open heritage resources with geospatial concepts. We hope to find out what is easy, what is valuable, and what we can’t do and why.

And with that we are closing the day with a giant thank you to all of the speakers, organisers and those recording the day.


JISC Show & Tell, Timelapse and Awards Results

This post has now been updated to reflect the results of the JISC Geo Awards.

These timelapses show the morning and afternoon Show & Tell sessions at the JISC Geo Programme meeting:

Morning Session Timelapse

Click here to view the embedded video.

Afternoon Show & Tell Session

Click here to view the embedded video.

Following the Show & Tell session the JISC Geo Programme Awards were presented to projects in the JISC Geo programme based on their achievements to date and, in the case of the Best Project award, the votes of the community collected at the end of the Show & Tell Session.

Project Blog Post (Single Entry) of the Year was awarded to xEvents (#xevents). The nominees were: GEMMA (#gemmaProject); IIGLU (#jiscG3); NatureLocator (#naturelocator); xEvents (#xevents).

Project Blog (Overall) of the Year was awarded to IIGLU (#jiscg3). The nominees were: GeoSciTeach (#GeoSciTeach); IIGLU (#jiscg3); NatureLocator (#naturelocator); PELAGIOS (#pelagios).

Project Manager of the Year was awarded to Amir Pourabdollah, ELOGeo (#elogeo). The nominees were:
Amir Pourabdollah, ELOGeo (#elogeo); Chris Higgins, IGIBS (#iGibs); Stuart Macdonald, STEEV (#STEEV); Elton Barker, PELAGIOS (#pelagios).

Hybrid Project Manager/Developer of the Year was awarded to Nick Malleson, GeoCrimeData (#geoCrimeData).

Project Developer of the Year was awarded to the GEMMA Team (#gemmaProject). The nominees were:
GEMMA (#gemmaProject); Halogen 2 (#halogen2); PELAGIOS (#pelagios).

Project of the Year was awarded by the JISC Geo Community, via voting at the JISC Geo Programme meeting, to NatureLocator (#naturelocator). The nominees included all 12 of the JISC Geo projects.

If you are one of our fabulous project winners please do feel free to post a copy of your certificate on your own blog. You can download them from the JISC Geo Flickr Set.


JISC Geo Programme Meeting – Day Two

Today we are in day two of the JISC Geo Programme Meeting and we are liveblogging as appropriate – so any spelling issues etc. will be corrected as soon as possible – please do comment on content etc. below.

David Flanders of JISC is introducing the day: The aim of today is to identify recommendations for the future.

There will be three sessions, each running as a presentation followed by break-out groups around a theme. Each table will have a scribe. The goal will be to discuss potential recommendations for how JISC should advance spatial. Each recommendation will then be written up by the project manager and the scribe for a given group and posted on that project’s blog, where they can be looked at further – a sort of ad-hoc community consultation. We will run this process three times today.

Training Non-GIS Experts in the Use of Geospatial Tools and Technologies at Stanford University - Patricia Carbajales

I will be talking about the way that we support our community around geospatial tools, and our approach. To put this into context: in the 1970s GIS was used mainly by developers, but we have now reached the point where an increasingly broad group of users use GIS technology – in 2011 we have general users engaging with these tools. And there has been a real evolution in GIS. We have moved from being the provider of map data to being able to provide tools that assist decision makers. We are moving to a place where there are hundreds or thousands of geospatial data users who really don’t care that much about the quality of the data, and we have to give them the basic technology to understand and use the tools and data. And we are also looking at how those results impact our environment and our society.

GIS in Higher Education enhances educational goals – that’s a really important message to get across.

We have a center for excellence in GIS and we want that to be a space where users can help and support each other. These can be really interdisciplinary user groups. It is important to have the faculty on board. No one department has full ownership; everything is a communal good in this space. And we think that GIS benefits from being very unique and at the same time very diverse. Students come from diverse course backgrounds, so we always have to be able to offer examples from their discipline or specialism. We have to make our support relevant to them, and we try to create more of a learning environment than a traditional teaching environment.

Our keys to success start with ensuring students have a really sound basic understanding of the concepts and basic principles. We need to teach basic mapping know-how. And as the experts supporting those students and faculty members, we have to have suitable examples to hand from their field.

In terms of the principal causes of failure: we need to be aware of how we plan, manage and keep our support user/customer focused. We have to offer comprehensive, simple and flexible support.

The form our support takes is class support – we work with classes where students have to learn ArcGIS in one week, and do this through homework around that work so that classes can focus on using those tools. We also undertake project management for those class projects – to help find data and help do the analysis – allowing the professor to focus on the application itself.

We also provide instruction, consultation, data resource center and support center for all members of the university.

We then also collaborate, provide a data resource center, and offer technical support for the specialist spatial history lab, digital humanities lab, etc.

And finally we undertake outreach work with the wider community.

At Branner Library we provide a center where students and staff can come in and get one hour of intensive one on one support.

So, we try to encourage “thinking with maps”. We focus on GIS education to raise awareness to stimulate interest and provide a sound foundation. And we provide a learning environment rather than a teaching environment. We see this as a sort of pyramid of engagement that begins with awareness and peaks with higher level modeling applications. At each level of detail fewer users need to gain these skills but all have a route to reaching this high level understanding of geospatial.

For the higher level skills we hold workshops, and these are hands-on and take place as needed by students – we don’t make students wait for a full session to run if they need that support now. We follow on with consultation on a one-to-one basis. We tailor to cover the most frequent needs. We also push students to practice between sessions – I always tell the students that, like tennis, there is no point showing up for a session every few months if you have not practiced in between. We also gather feedback.

The workshops are always hands on. In one workshop they find out how to make a map from the beginning to the end.

Right now we use ArcGIS; it’s what the US market demands and it allows deep analysis of the data. We have a campus-wide licence and we get free support from them around that. But we also use Google Earth and Maps because they are easy and familiar to students, and that works well for collaborative use of geo or publishing of data.

We would like to offer other more specialized skills for spatial analysis but only where demand is demonstrated. Tools like PostGIS are too niche to need regular workshops to be run at present.

Our main objective is to establish a geospatial foundation for our learners. We have limited resources and some very specialised groups. Faculty involvement is critical – often that is not easy, but it does provide reinforcement of the fundamentals. The human resource of workshops is so important. If students take an online course and you then ask them to come back, they rarely will. If you are there while they take a course and they can bring you questions as they go, it makes a big difference. The majority of students like that human interaction very much.

Increasingly we are thinking that expanding our support for programming languages such as Python would really help us with what we do.


Q1) Are your training materials available for others to use?

A1) Our materials are public and available for others to use, especially the Google training. In most of these workshops there are training materials as well as tutorials around that. We do demos every 15 minutes in those sessions; they are very interactive and all join in. We take it very slowly to encourage them to understand what they are doing throughout. We teach undergraduates and postgraduates in the same way.

Q2) Do you do any tracking of students that go to the workshops?

A2) We have one-to-one consultations a week after the workshops. Three hours is a huge investment for the students, but the hour of one-to-one consultation is hugely helpful to students so they usually come back again.

Q3) We just produced a Python for ARCGIS course so you’d be welcome to use that from our website!

A3) Thank you, we need that!

And now onto our next presentation:

Mapping the Republic of Letters - Nicole Coleman

This is an in-depth look at one geospatial project. It is different in that we really don’t use GIS in this project, but geospatial is incredibly important to it. We are particularly inspired by the historical mapping of the period of the letters we are looking at (1500-1870).

We take inspiration from early maps in the way in which the maps themselves are a reflection of the perspective at that moment in time. This feels relevant to how we visualise and how we map the correspondence we are looking at. In fact we created a timeline and map for the intellectual property around our project. We are trying to think differently about space and time for this material.

One of the things we have been doing is to try and establish visual ways to browse and explore the data to enable scholars to find suitable materials, to navigate, to understand the choices they are making for a visualisation. The original materials are always linked back to their original archive copy so you can explore the historical resources you are visualising and know where they have come from.

I’m just going to walk you through a few case studies to illustrate the challenges we have,

So, looking at Athanasius Kircher, we only have letters sent to him, not those he sent. Paula Findlen was the lead on this project and she was keen to look in more detail at the nature of the letters. We have used Fineo, a multidimensional content viewer that allows you to look at the locations, the languages and, because it is relevant to this historical figure, the faith of each correspondent, to understand their work.

British and Irish Travellers in Italy – students went through this dictionary of travellers. These are really detailed entries of arrivals and departures, although not all are consistently detailed. What has been interesting about looking at the areas recorded is that Sicily is treated as a city – a peculiarity of this archive.

We can take this data and look at who was in a city at the same time. You can look at particular periods of stay or particular individuals. You can also connect to data on the individuals involved with information on that person’s age at the time of that stay, etc.

So looking at the temporal context was really helpful but did not give us the complete picture; we also wanted to look at relational information. So we have a kind of network graph tool. In this visualisation a dot indicates a person and blue lines indicate a very loosely defined relationship.

These tools are really exciting for exploring this sort of data, where connections are just not that apparent but can be discovered and explored through these sorts of tools.

Voltaire’s correspondence is our largest data set. He was very prolific. We should note here that the tool we have developed uses contemporary country boundaries, because there are no good shapefiles available for the geography of the period, although for our research it is actually more important to look at cities. We have also tried to indicate on the timeline shown with the map where letters are available but are not mapped. This is really important as it tells you how representative the visualisation is of the data.

Looking again at a map with the tool Inquiry – a map of source locations for letters written by Voltaire – we can see most letters do not include source locations.

Putting letters on the map draws our attention to these materials in a different way. So for instance this letter from Panama becomes very visible; when you look at the letter itself the content may not be so exciting. This is another way of understanding the data we have and which materials are and are not significant. It can also indicate trends or unexpected patterns in letter sending – for instance few letters are exchanged with those in Spain, and indeed one letter exchange to Madrid turns out to be with a non-Spanish correspondent staying in Madrid.

Benjamin Franklin – Caroline Winter is working on this project – and we’ve been looking at a comparison of the exchanges of Voltaire and Benjamin Franklin. They had common correspondents but do not appear to have corresponded with each other. But you can see various second-degree connections between Franklin and Voltaire. We take this data out of a spatial context and out of a temporal context for this specific network diagram. We balance this sort of relational network graphing with spatial-temporal context visualisations.

We can look at Benjamin Franklin’s network at the time of his stay in London. And in this case we compare with the network of David Hume. Looking at how that experience connected him to the Scottish Enlightenment (hence Hume used here).

We are now moving to breakout groups so blogging will pause for now.

So, we are back after a most excellent lunch and discussion session!

Presentation of an ‘Emerging Geospatial Innovation Themes Map for UK Universities and Colleges’ by Gregory Marler (Programme Evidence Gatherer for the JISCgeo Programme), No More Grapes Ltd.

Gregory is next up to present. I’ve been reading the various project blogs. Geospatial is obviously frequently mentioned in your posts but also data is a huge theme as is usability. I tried to group all the projects according to how they are using data and who they are using it with.

There were some interesting posts on people and geospatial data. So JISC G3/IIGLU did some usability testing on Potlatch, one of the editing tools for OpenStreetMap and that was hugely useful to the OpenStreetMap community who made some changes to the interface as a result.

There was also discussion of teaching without calling it teaching – things like GEMMA that shows users through example and practice so that users can get their hands on the data. NatureLocator also encourages users to learn and explore more.

It’s important to keep telling people about your projects – forums, blogs, TV (if you can), meetings, word of mouth, even emails are important. You’ve probably collected email addresses from people to try the beta – have you told them they can try it again now the bugs are fixed? Remember to update your potential users with any key changes you make. And you need to make sure that you flag up key information to all your audiences – techies want information on the end service as well!

Some of the blog posts were long but most were nice and short and readable. Images are particularly powerful – particularly screen captures of emerging products. There was real variety of scope, some about the technology – and great sharing of experience there – some about the research. The key message here was learn and have fun – it’s been huge fun reading the blog posts!

And I will finish on a joke from a project that posted a whole page of lightbulb jokes!

“How many green building consultants does it take to change a lightbulb?”

“None, we were all at a conference!”

And so for a wee while we will discuss project blogging a wee bit.


Comment) Our lab blogged an awful lot, but for the JISC work we didn’t blog much. We were so busy doing the work we didn’t have much time to blog. We thought we would do a lot of blogging but actually we didn’t want to give too much away, so we were fairly quiet.

David) But you were coding hard. I feel like blogging frees you up to share as you go, when it’s useful, rather than writing a big final report. And the average was 17 blog posts, which is equivalent to…

Comment) We found blogging really useful as we were a consortium: it was a way to track progress, to have a reason to share expertise and to chase project partners. And work got done quickly and efficiently and we had lots of interest in…

Comment) In terms of blogging, the final report is a write-only document. I’m not sure they are ever read. Blogs are read. You pay people to write stuff, so the fact that blogs are actually read by potential collaborators and the community really matters.

David) So how many of you looked at the analytics? Or didn’t?

Comment) I didn’t want to put pressure on myself, I just wanted to get started, to make links to other work etc.

Comment) I’ve not blogged before but have written lots of formal reports. It’s a really different way of communicating. If you can explain the concepts to a really novice user then you really have to understand your work. Thinking about that can really help you rethink what you are doing and throws up challenges for yourself. There can be real snobbery about these things, that you had to be very formal in language. I don’t think it matters; you need to get the ideas across. I really enjoyed the different sort of writing we’ve seen in this.

David) So how has this gone in terms of convincing your organisations about blogging? I know some institutions require really strict reporting?

Comment) We have rigorous internal processes for project management and the team struggled with doing something additional. We did improve a bit I think, but we all struggled with being informal in that way – you worry about pressing publish. It is useful to see something a different way, and a lighter, more sensible way is nice. Trying a different methodology helps, but it’s a start.

Comment) A concern and interest I have is about publication. Humanists tend to write journal articles – there is a paper there on the blog that just needs to be pulled together, and that allows me to reuse and publish all the work we’ve done.

Dave) There is a real mixture on those blogs: serious research work, light and silly content, project management, technical discussion. We had Greg there as a recent graduate to be a reader for these blogs – to give an outside view on what was working well, who was enjoying the blogs.

And now we are having our next breakout discussion, this one focussing on geospatial data and the needs for creation, management, repurposing, expressing, analyzing and sharing of geospatial data.

And after that lively chat we move to the last presentation.

Presentation on ‘The Myth that is Project Sustainability’ and ‘Future Strategic Funding Areas for JISC’ by David F. Flanders (JISC Programme Manager) and Matthew Dovey (Programme Director, Digital Infrastructure (e-Research))

Obviously any discussion of funding is subject to change. You need to speak to programme managers and there is some advice and guidance we can give as those that regularly read bids. Please don’t take my comments as gospel.

How many of you want to continue your project? And is it sustainable to continue?

And what would you want to sustain? Is it a product? the next big Facebook maybe?

Is it about skills? Those bespoke services that can translate into income from student fees?

Staff? Attracting and maintaining staff is important to think about in sustaining.

Community? Change how we do things? Ironically this is the most expensive of the options but we do have a great community here, it would be super to see it continue to thrive.

My advice to you is that if you want your project to continue to innovate then you need to continue to bid. Bid bid bid. It’s not fun, it may not be perfect, but it works: it produces products in an incredible way. As you move forward you are going to bid for more things. But where will you bid? A lot of our projects are moving from a research project that is very innovative into something that can be taken forward – maybe a product, maybe skills, maybe something else. Which of these things can be taken forward?

I really do believe that spatial should be across all of our activities. I am going to try to show you some future plans of the JISC teams to get an idea of where spatial might fit into that bigger picture.

So, here’s the big picture. We have a top level strategy, we innovate, and we take some of those innovations into services. That balance between innovation and service can be tricky. If you are interested in creating a service, it’s not as nice a space as an innovation space: business managers, legal teams, etc. come into it. There is much more there than just the product, and the vision may be very different from your original innovation.

In recent years our budget has been a pretty good split between innovations and services. You should really go for that innovations chunk of the pie chart.

So we have an overview of the people of JISC and that is important if you are looking across the full spectrum of JISC activities. Under innovation there are four teams: learning; user and organisations; content; infrastructure – huge potential across all of these for geospatial. And the key names here are Tish, Craig and Catherine. If you have a project that applies to these strategic thinkers then contact them, email them, call them, ask them about upcoming funding. A little bit of effort can really help you in feeding into these programmes, to hearing about the opportunities.

My boss at EDINA is Rachel Bruce, who leads the infrastructure team. We are the largest team in innovation. It’s not a bad idea to know about our team going forward. Rachel has two directors working with her, including Matthew Dovey, who is here to walk us through some of the new diagrams and branding we are currently thinking about:

In terms of infrastructure as a whole we have three broad areas: Information and Library Infrastructure, Research Management, and Research Infrastructure – about doing that research.

Digital Directions is a diagram showing the elements that underpin these themes, including geospatial, authentication, etc.

If we look at library and information systems, we have areas there around emerging opportunities, resource discovery, and curation and preservation. In Research Infrastructure we have research information management and research data management – the support available to the researcher. What are the tools that research teams need? How do we feed recognition for teams into things like the REF etc.? And in Research Management we have research support, research tools, and repositories and curation shared infrastructure – the ways in which data can be reused, preserved for the future etc. on a technical and social level.

The key thing about geospatial is that it features in all of the areas here. So when do we keep this integrated into wider programmes, and when do we fund geospatial as a specialist area? And on that, back to David.

So that was a very whistle-stop tour of a very varied portfolio. The main message is please do bid. And here is my bidding advice:

  • Contact the programme manager by phone/skype and tell them about your idea to make sure it is in scope to meet the strategic objectives. It exponentially improves your odds.
  • Add a use case and image/diagram on the first page of your bid. Most reviewers read 5 to 10 bids at a time so they have to be readable.
  • Repeat back what’s written in the call – you really need to make sure your bid clearly indicates how your idea meets the call and why.
  • Less is more – five pages with diagrams is great.
  • Focus on what you are going to do, rather than why it is important.
  • Say which human is going to do what – the more specific you can be, the better. It helps people understand the intimacy of the bid.
  • Clear budget – explain why the numbers are as they are. A paragraph with percentages is really helpful… x% will go into development, x% to dissemination, etc. That’s really important for markers.

And an addition from Matthew: the focus moving forward has to move from a geospatial-led activity to an application-led activity. Think about these things as a researcher-led proposal that answers a real problem. Embedding those tools is essential. Think about sustainability. Bidding for more funding is a sustainability model, but that is questioned at a certain point. Our funding is finite, so are there other revenue streams? Can you commercialise? Can you charge people outside of UK academia but keep it free to HE? Can you get some cost recovery from your host institution? Just have a think about those elements.

Please do take advantage of Matthew’s time this afternoon with any questions.

So with that, here are the next few days lined up… tomorrow we have the Space-Time workshop and also the Review panel going on in parallel. Then on Thursday we have PELAGIOS2 – an open hackday. And in parallel we have the Geospatial Service Review session.

One last reminder. We want comments on how we can improve what we do. So do fill in our survey!

And finally I have run 8 programmes over the years and this has been one of my favourites. Your work has impressed me immensely!




JISC Geo Programme Meeting – Day One – Show & Tell

Today we are blogging from the JISC Geo Show & Tell event taking place at RAVE (Ravensbourne College), which sits right next to the Millennium Dome/O2 Arena in London. The hashtag for today is #jiscgeo – please tag any of your own blog posts, images or tweets with this. The full programme of JISC Geo Programme Meeting events over the next few days can be found here.

The day will split into two halves. This morning we will have a keynote from Julie Sweetkind-Singer, Assistant Director of Geospatial, Cartographic and Scientific Data & Services, Stanford University. This will be followed by an introduction to all of the JISC Geo projects by David F. Flanders, JISC Programme Manager for Geospatial Innovation. Then we will break for lunch, and in the second half of the day there will be a Show & Tell session where each project shows off its work around lab-style tables. We will be liveblogging the first half of the day, but then manning both an INSPIRE table and a JISC GECO events table, so we will blog highlights of this afternoon towards the end of the day.

David Flanders is introducing us to the day with an alert to keep that QR code reader app handy – there’ll be lots of QR codes appearing through the day… Also there will be blogging, tweeting, images, videos, etc. going on all day. All of these should be available under Creative Commons licences and available after the event. Please make your posts etc. available similarly and use the #jiscgeo hashtag.

We are just having a wee introduction from David but first he’s had us up saying hello and chatting with our neighbours.

The Aim of today is:

To figure out which products are going to help catalyse the spatial revolution in .AC.UKs!

We want to change the sector with the way we do things, the way geospatial is understood in the sector. David has a big spatial agenda: the sector is unconsciously using spatial in its activities (teaching, learning, research). He asks how many of us remember first looking at Google Maps – very few people do – and then how many of us remember looking at a satellite image of our house – most of us do. We need to get the sector to recognize how geospatial can be consciously recognized and capitalized upon.

This afternoon we will see 15 brand new geospatial products. And we want your crowd knowledge of the best project, the best pitch. The format will be an unconference format. It’s all about people NOT sitting through presentations. You need to move around and interact with as many projects as you can. Use the “rule of two feet” – if you’re not actively participating, walk away and try another group. We will have 6 simultaneous groups and loads to hear about. No-one will be offended if you walk away. We are going to do a wee vote, so please vote independently – the wisdom of crowds is much more useful if you all submit your thoughts individually. Think of it as a country fair/livestock fair – and yes, I will be using a megaphone to herd you.

I want us to think about how we can get our institutions to understand that they are using geospatial (even though they don’t think they are). I’m really pleased to have Julie Sweetkind-Singer here as I think she has a great model for this. I visited her about a year ago and they are doing some fantastic things with geospatial.

Keynote: Julie Sweetkind-Singer (Assistant Director of Geospatial, Cartographic and Scientific Data & Services, Stanford University)

David came to speak with us about a year ago to talk about geospatial on the campus. I will be talking about geospatial outreach at Stanford. Some will be around work the library is doing, but also around campus. One thing to know is that the library system is quite large – around 500 people in total – which helps us to do this sort of support. I have one colleague here and another arriving this afternoon who will be happy to talk with you, answer questions, etc.

Stanford has around 7000 undergraduate students: about 34% humanities and social sciences, 13% engineering, 2% earth sciences and around 51% undecided. We have around 9000 graduate students; the biggest department here is engineering (39%), with about 3% in earth sciences. We have around 1900 faculty members. Stanford’s nickname is “the Farm” – originally there was a lot of farming, a lot of animals, etc. The campus is 24 miles and there is a biological reserve doing lots of geospatially related research.

We are expected to innovate and do new things – even in the library – and that can cause silos and problems sharing between silos.

We have a series of centres working in spatial research that sit separately: the Spatial Analysis Center (Earth Sciences), and the Natural Capital Project (Woods Institute for the Environment), which runs conservation projects – they produce lots of open source tools that are available online, etc.

The SU Library’s technical infrastructure includes the Branner Earth Sciences Library, which has a computer lab with 8 high-end machines with dual monitors and expert staff on hand. We have site licences for ArcGIS on over 800 machines. We also have Google Earth Pro on all of those machines – people want to visualise their data instead of, or as well as, analysing it. We are developing a Stanford Geoportal which will allow data to be available to the Stanford community and, if the data can be made open, shared with the world. We have also been working for about 6 years on the Stanford Digital Repository – it’s incredibly important that we take care of and manage our digital assets just as much as we would a physical asset. We have a team of programmers who work on that repository. There are about 4 to 5 terabytes of geospatial data in there right now; it’s very important in terms of supporting the work we do.

In terms of the broader infrastructure, we provide support for anyone who wants our services across the university. We provide information and support, across disciplines and across the organisation. When I first worked at Stanford I had a colleague who would say how few people were doing geospatial work, but now my colleague in that role has to keep people from her door – there has been such an explosion of interest in geospatial recently.

We try to provide access to data, software and basemap data.

Patricia Carbajales is the Geospatial Manager. She ensures that we support the aspects of GIS that are a commodity – the bread and butter skills you need to work with your data, the underpinning that allows you to do projects on your own. She has delivered over 100 workshops to over 450 students in the last two years. The work is integrated into classes – through principal instruction on Fundamentals of GIS and as technical assistant to the Urban Mapping Practicum. There is also student project and research support, with advanced training for power users. There have been huge outreach efforts to use geospatial in projects with the community. Patricia is a geographer by training and she has 20 hours of work per week across two student support officers.

One of the things that Patricia has done is to set up a Google site to highlight the geospatial training at Stanford University. We recently did a Google Mapping Workshop and had people over from Google looking at how to use the software, how to use Fusion Tables to import data, how to create narrated tours, etc. She created resources around this – presentation tools, programming code, etc. All of those materials are now online and open so that anyone can use these resources.

We also want to see how we are doing, if we are providing the right sort of support. We get feedback from all of our students immediately so that we can immediately feed this into future work.

We have formal GIS/spatial teaching across the university: Global Positioning Systems (Aero/Astro); Digital Methods in Archaeology (Anthropology); Quantitative Analysis in Archaeology and Anthropological Research; Spatial History: Concepts, Methods and Problems (History)… geospatial sits across all schools. But there is no geography department, so students do not necessarily feel confident about the wider context beyond the tools they have been using.

So, if we look at who took the Fundamentals of GIS modules from 2002-2008, we see Civil and Earth Engineering, Earth Systems, etc. all with very high usage, but also a long tail of students across other schools. Since 2009 we’ve seen a huge increase of students from Earth Sciences; Urban Studies is the third biggest group; and biology, etc. have seen huge growth in uptake.

We are also involved in a serious effort to create content through scanning of maps. We have about 3000 maps in the Branner Earth Sciences Library. We have about 5000 antiquarian maps. We have been scanning all maps that are out of copyright. We have also been working on a project with private collectors/donors called “Digital Philanthropy”: we approach map collectors with exceptional collections and ask them to let their full collection be scanned and shared with the world. So far we’ve been very successful in this approach and we are just getting our first set of these materials scanned – historical maps and views of California. The next set will be from a collector of maps from around 1600. All of these maps will be free on the web and downloadable. There is immediate use of these materials in other projects.

So now I want to look at Geospatial projects both within and outside the library.

The Spatial History Project began when one of our professors was given money from the Carnegie Mellon Foundation and decided to start a Spatial History Project. There were 2 members of faculty in this project; there are now 13. Initially that academic was particularly interested in the history of the railroad. But they are really bringing undergraduate students into the research process at a really early stage and getting them engaged with these researchers looking at social history etc. You can find this work at:

Another project here has been Tooling Up for Digital Humanities – a site full of tools for working in the digital humanities. The project was a collaboration between the Spatial History Project and the Computer Graphics Lab. There was a weekly workshop series in 2011, with library presenters accounting for four of the eight workshop presenters.

Nicole Coleman, my colleague here, is an Academic Technology Specialist (ATS) and runs the SU Humanities Center Research Lab. Part of what they do is large-scale international collaborative research. They are looking at how to do spatial visualisation of data rather than geospatial analysis exactly. So for instance they have visualised the flow of letters from Voltaire and Benjamin Franklin; it really helps visualise the spread of ideas.

Another of my colleagues from the ATS programme is Claudia Engel, who is assigned to the Anthropology Department. She also does research using historical satellite imagery, and teaches a Spatial Approaches to Social Sciences class, which again is totally cross-university.

One of our colleagues, Elijah Meeks, is a Digital Humanities Specialist who works directly with academics. They suggest projects and he works with one of them on a specific proposal in six-month windows. He moves from project to project but he stays at the university. He helps faculty find grant-funded research, he helps them find people to work with, etc. If you are interested in network analysis then he is a big blogger at: He does a lot of work and training with Gephi.

In terms of outreach and collaboration: we have a GIS Day and we also did a Geography Week with a real variety of speakers. We have a GIS Special Interest Group – which also works beyond the institution. We have been working with the New York Public Library on their Map Warper application. We sponsored WhereCamp 2011. We have collaborated with Google on workshops. And we have participated in THATCamp, a Visualisation MeetUp group, etc.

We have some challenges here. Demand is up and we have competing needs as well as increased demand. The complexity of the software and its steep learning curve is a real issue. We find students and faculty increasingly come in with really robust computing infrastructure needs. That’s one of our biggest pain points right now – how do we support this ever-increasing need for computing support from students? And finally we do lack a coherent curriculum for teaching spatial thinking and methodologies, but you can see where improvements are needed when you work across the university.

We do try to make sure we support high-profile research. We ensure our services have very high visibility. Demand does continue to grow.

Coming up we have the VITA (Visualization and Textual Analysis) Centre – some very interesting collaborations are likely to come out of this. There is library/faculty collaboration to capture, distribute and retain faculty data – we have a real role to play here with our GeoPortal and our Stanford Repository. There has been some expansion of digital humanities support, and some expansion of the digital map support.


Q1) How do you justify your spend on geospatial to your institution?

A1) Well, we wanted a geospatial programmer and our head of library proposed this to our budget committee, and they came back asking what GIS actually was. Within two hours we had to tell them who was using GIS, which faculty members used it (and they had to be the right ones), and we had to show the outreach value of GIS. We had a database of GIS-related publications and of who on campus was using this stuff. We were able to produce a two-page summary in an hour and we got that post approved. And we did send GIS Day and Geography Week information to our senior university management – I’m not sure if they read it, but they heard about it.

Q2) Can you say some more about the uptake of those GIS classes by students in other schools?

A2) An interesting question. The number of history students taking GIS classes is very low but they use geospatial data really heavily.

We’ve had a really hard time getting into places like the Business School. They are very set in the classes they want students to take, and getting in there has meant finding professors who are interested. Students can drive uptake of geospatial, but when they leave, so can the interest in geospatial – unless you have faculty who really understand the software and the methodologies. And if you move department you might be re-taught GIS, or may have no opportunity to gain those skills.

Q3) Have you done any outreach to Schools?

A3) We have been working with California State University, who have a Geography programme, and we employ them for our labs to support our students. But we haven’t been working with K through 12 yet.

Q4) You talked about the Digital Repository. What form does that take to help support your work?

A4) When they started to build the Digital Repository it was built to hold any content, not just text but all sorts of materials. That was partly because we had a project with the Library of Congress to look at the metadata that one needs for different sorts of data.

David) Actually the connection to Julie did come from the repository community.

Q5) The digitisation project – were there any intellectual property issues etc.? Especially with the philanthropy projects.

A5) What’s quite interesting about this: we have been scanning everything pre-1923, and that’s all in the public domain. We are proposing that anything pre-1923 that is in the public domain stays in the public domain – it’s the role of the library to make things available. So we make sure that anything pre-1923 is available to view and to download, and we’ve seen more libraries in the United States doing this. When it comes to the philanthropic projects there’s an element of psychology here: there is some guilt that those materials are locked away and inaccessible. We have our donors come in and speak to the spatial history group, and they speak with Richard White on the importance of that work. Then they are happy to sign the contract to make their collection available. A real thought leader here is David Rumsey, a map collector in the Bay Area who has already made his maps available. Some of our collectors are planning to sell those maps on at some point; legally, us scanning those maps with the owner’s permission is fine, even if the maps go on to be owned by someone else.

David: What’s exciting to me is that this is a glimpse of the future, and we can now think about how we take our projects and our institutions forward.


Introduction to the products on display for the day, by David F. Flanders (JISC Programme Manager for Geospatial Innovation)

As an introduction to this afternoon’s unconference side of the day, I am going to give you a quick introduction to all of the projects you’ll be seeing. Of the programmes I’ve been involved with, I think this has been a fantastically successful one. Normally I would expect maybe 3 or 4 products that will be usable and useful in the future. My role is to help you explain why we should continue to invest in spatial infrastructure.

The total investment is just short of £1 million now, and it will probably get above that once we reinvest in some of the successful projects. We need that sort of investment going in. We’ve had 20+ universities involved, including 7+ non-UK universities; 122 months of projects in total (all between 6 and 12 months); and 104+ staff funded (for some of their time). We have had about 10,000 unique hits on our websites and over 300 blog posts (from 15 to 50 posts per project). I will be giving a prize for the best single blog post and an award for the best overall blog – doing documentation this way is changing the way we can communicate our work. Many of you will still need to fill in your final sign-off survey, and we’ll know even more about how the programme has run once we have those. We have 48 products in total by my count, and what is picked up and taken forward will determine the programme’s success.

So first we have the STEEV (#e3vis) project, working on visualisations of geospatial data over time.

Next the ELOGeo (#elogeo) project. They have created teaching and learning material around geospatial. We have about five people here from that project.

GEMMA (#gemmaproject) – it’s really exciting! My 12-year-old cousin digs this one!

GeoCrimeData (#geocrimedata) is here – they have some cool data looking at dangerous areas etc, definitely check them out.

GeoSciTeach (#geosciteach) – this has huge potential to go viral. They have an epub, an app and a fantastic blog explaining their case studies and how it works.

U.Geo (#geoukda) – they’ve done great work on INSPIRE and the social sciences, they’ve really gone into the nitty gritty and what that’s involved. See also JISC GECO’s work on INSPIRE.

Halogen2 (#halogen2) – they’ve got a great product here to visualise data.

IGIBS (#igibs) – I just got a chance to play with it – a great way to see the implications of geospatial and its importance in various areas.

IIGLU (#jiscg3) – great branding, great video, great content on learning and understanding geospatial.

NatureLocator (#naturelocator) – fantastic app, with a super back end too, really bringing geospatial to the masses, not just the educational masses.

PELAGIOS (#pelagios) – you have to check this out. Being able to see

xEvents (#xevents) – PhilPapers is a massively successful repository and this is really about engaging people in the spatial and the temporal. It does a great job of identifying and gathering events for any subject discipline.

JISC GECO (#jiscgeco) – This is a special project, an umbrella project for the JISC Geo strand. They have a stand here all afternoon and you should go see them. They can advise you on

Greg, the peg, has been reading all the blogs, reading everything, and tomorrow he will be summing up all of your knowledge across the 300-odd posts. He’s been doing some great posts for us as well.

Some other projects here include WISERD, who are doing some fantastic stuff in Wales. We will also have someone here this afternoon from HistoryPin, a product from We Are What We Do, a kind of not-for-profit company working with Google to build a long-lasting product. They are a challenge for you in thinking about success and sustainability.

And last but not least, of course, we have the vote. It is intended to be each of you individually thinking about what inspires you. The prize maybe isn’t that exciting, but your vote would be great to have, especially if you can explain why a project is exciting and what it can achieve. We can feed that back to the projects to help them prove their success.

I am trying to make the case that we can move forward with some more projects in this area. That evidence is so important, and you meeting up and making some noise is so important. Please tweet and blog and push this forward in the future.

So go between all of those tables this afternoon, see them all! Think about which ones you really do want to see!

And with that David is closing off the morning session and we are getting ready to go to lunch. We will not be liveblogging this afternoon but keep an eye out for the #jiscgeo tweets and our own @jiscgeco account. We will add a new post with the highlights of the show and tell session later today so keep an eye out for that and do leave us your comments on the projects today in our dedicated JISC GEO page/Visitor Book area.


IGIBS Final Product Post

“An INSPIREing tool enabling researchers to share their geospatial data over the web”

The Open Geospatial Consortium’s Web Map Service (WMS) is a core standard underpinning many Spatial Data Infrastructures (SDI) throughout the world, including INSPIRE, the UK Location Programme and our own UK academic SDI.  The WMS Factory Tool created by the IGIBS project allows users, for the first time, to upload their data and automatically generate a fully standards-based, INSPIRE compliant WMS.  Users can control styling and view their data alongside a broad range of other data from a broad range of content providers.  The WMS Factory Tool has been created in partnership with the Welsh Government and students within UK academia, in anticipation of the revolution in the use of Geographic Information that will come about through the increasing availability of data via interoperability standards, in conjunction with the UK Location Programme and INSPIRE.
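To give a flavour of what “fully standards based” means in practice, the sketch below builds an OGC WMS GetCapabilities request in Python – the first call any client makes against a generated service. The base URL is a hypothetical placeholder, not the tool’s real endpoint.

```python
from urllib.parse import urlencode

# Hypothetical service URL for illustration only -- the real endpoint is
# assigned when the WMS Factory Tool generates a service for uploaded data.
BASE_URL = "https://example.edina.ac.uk/wms/dyfi_survey"

def capabilities_url(base_url):
    """Build a standard OGC WMS 1.3.0 GetCapabilities request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetCapabilities",
    }
    return base_url + "?" + urlencode(params)

print(capabilities_url(BASE_URL))
```

Because the generated service answers standard requests like this one, any WMS client (desktop GIS, OpenLayers, another portal) can consume the uploaded data with no bespoke integration.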

The WMS Factory Tool was developed in close cooperation with students at the University of Aberystwyth’s Institute of Geography and Earth Science in the context of their growing repository of data related to the UNESCO designated Dyfi Biosphere Reserve.  If a student is doing a project and generating data, and they need to be able, for purposes of analysis and integration, to view that data alongside data from the spectrum of Welsh public authorities establishing INSPIRE compliant services, then this tool lets them do so quickly, without the need to waste time sourcing, extracting, transforming and uploading data from a range of non-interoperable proprietary formats.

The working prototype has been developed and configured so that data is uploaded to EDINA machines.  The following video gives a flavour of how the tool works:


Note that, as an advanced feature, access can be restricted using Shibboleth (an open source Security Assertion Markup Language implementation used in the UK Access Management Federation) so that only authorised users can access the service, and so that other organisations in the federation can make more data available.

The software is easy to deploy and configure so that data may be uploaded and WMS generated at user-specified locations.  Here is a good place to start with documentation.

And here is a picture of the team that brought you this product.  More information on IGIBS can be found throughout this blog starting with the about page.

Core IGIBS Project Team at Welsh Government Offices in Cardiff on the 11th Nov, 2011

The software is a prototype at the moment, but is in a condition where it can be deployed.  EDINA commits to maintaining this software for a minimum of 3 years, i.e. until Nov 2014, though it is likely the software will have developed considerably by then.

It is likely that this software will contribute to the growing suite of open source tooling available for use with INSPIRE compliant services and encodings, most obviously as a means for users within the UK academic sector to create WMS (temporary or persistent) for use with UK Location Programme network services.

At its heart is the Minnesota MapServer WMS software: very stable, well understood and highly regarded.  The IGIBS software is available for download.  It is licensed under the modified BSD licence, meaning, in précis, that the software is made available under a permissive free software licence with minimal requirements in respect of how the software can be redistributed.

STEEV Final Product Post

This blog post provides details about the web tool developed by the STEEV project.

Problem Space:

  • There is a requirement by the UK government to reduce the country’s carbon emissions by 80% by 2050.
  • Buildings account for 45% of energy use in the UK, the equivalent of all transport and manufacturing combined (ESRC, 2009).
  • Most of the building stock which will exist in 2050 has already been built.
  • To achieve this target, massive alterations to the current building stock are required. Part of the solution would be a tool that enables planners, local authorities and government to estimate the impact of policy changes and to target interventions appropriately.

Cue the STEEV demonstrator – a stakeholder engagement tool developed to visualise spatio-temporal patterns of modelled energy use and efficiency outcomes for the period 1990-2050.

For a portable overview of the project download the STEEV postcard

Primary Users:

Students, researchers, lecturers from a wide variety of disciplines/sub-disciplines, including geography, architecture, ecology, environmental science, economics, energy engineering and management.

The tool is also aimed at a range of stakeholders such as policy makers, urban developers, climate change specialists, carbon energy analysts, town planners.

Key Product Information – motivations and mechanisms

The STEEV demonstrator was developed to complement a larger project, Retrofit 2050 – Re-Engineering the City 2020-2050: Urban Foresight and Transition Management (EPSRC EP/I002162/1) which aims, through a range of stakeholders, to get a clearer understanding as to how urban transitions can be undertaken to achieve UK and international targets to reduce carbon emissions. The Retrofit 2050 project focuses on two large urban case study areas (Manchester and Neath/Port Talbot, South Wales – the latter being the focus of the STEEV demonstrator due to data availability within the project time-frame), through modelling scenarios of carbon emissions and energy use, both now and in the future.

The demonstrator itself is a client web application that enables researchers and stakeholders to look at how the spatial and temporal distribution of energy efficiency measures may impact upon likely regional outcomes for a given future state. This takes the form of a spatio-temporal exploration and visualisation tool for building-level energy efficiency modelling outputs such as the energy rating of the building, the likely energy demand of the building and the related CO2 emissions. A finite series of modelled scenario permutations has been ‘pre-built’, thus providing a limited number of parameters that can be interactively altered in order to explore the spatio-temporal consequences of various policy measures.
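The ‘pre-built’ approach above can be sketched as a simple enumeration: every combination of the energy efficiency variable levels is one scenario to model offline. The variable names and levels below are illustrative assumptions, not the model’s real inputs.

```python
from itertools import product

# Illustrative variables and levels only -- the real STEEV model defines
# its own set of energy efficiency variables and values.
variables = {
    "grid_decarbonisation": ["none", "partial", "full"],
    "efficiency_investment": ["low", "high"],
    "occupant_behaviour": ["unchanged", "changed"],
}

# Each combination of levels is one scenario that can be modelled
# ('pre-built') ahead of time and then served to the interactive viewer.
scenarios = [dict(zip(variables, combo)) for combo in product(*variables.values())]
print(len(scenarios))  # 3 * 2 * 2 = 12 permutations
```

Keeping the permutation set finite is what makes the interactive tool responsive: the heavy modelling happens once, offline, and the viewer only switches between stored outputs.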

View the STEEV Demonstrator Website:

Note: A further workpackage to establish a small area data viewer as part of the presentation layer will also be implemented shortly. This replaces the Memento geo-Timegate component of Workpackage 3.

The user interface has two main areas of activity, namely:

  • three ‘pre-built’ policy scenarios which depict government investment in energy efficiency measures (from best to worst case scenario) and a user generated scenario created by selecting a combination of the energy efficiency variables which go to make up the ‘pre-built’ scenarios.
  • a map viewer that enables model output values (SAP ratings, Energy use, CO2 emission) for each scenario to be viewed for each decade (1990 to 2050) at Output Area level of spatial granularity.

Further information about the policy-scenarios and variable descriptions are available from the help page

Fig. 1 – The STEEV Demonstrator

STEEV tool interface

Fig. 2. – Policy Scenario 2 – Low Carbon Reference

CO2 emissions, 2010 - Low carbon reference

Fig. 2 – Policy Scenario 2 – Low Carbon Reference (i.e. the government invests in partial decarbonisation of the grid through reduced dependence on fossil fuels; large investment in energy efficiency and small-scale renewables; some change in occupant behaviour) has been selected for 2010, with CO2 emissions chosen as the model output value.

Fig. 3 – User-generated Scenario

Energy use for Custom Scenario 2020

Fig. 3 – A zoomed-in view of a user-generated scenario for Energy Use in 2020. Note: user-generated scenarios are forecast only.

Fig. 4 – Policy scenario 3 – Google Earth Time Slider

Energy efficiency data can be downloaded as Keyhole Markup Language (KML) files for use with the Google Earth Time Slider (for ‘pre-built’ scenarios only – see below) or as raw ASCII files complete with spatial reference for analysis in a Geographic Information System.
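As a sketch of consuming the raw export described above, the following Python reads a small spatially referenced CSV-style extract into records ready for a GIS or further analysis. The column names and delimiter are assumptions for illustration; the actual STEEV export layout may differ.

```python
import csv
import io

# Hypothetical sample of a spatially referenced ASCII export --
# the real file's columns and delimiter may differ.
sample = """easting,northing,co2_emissions
275100,198200,4.2
275200,198200,3.8
"""

def read_export(text):
    """Parse a spatially referenced CSV-style export into a list of records."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {
            "easting": float(row["easting"]),
            "northing": float(row["northing"]),
            "co2_emissions": float(row["co2_emissions"]),
        }
        for row in reader
    ]

records = read_export(sample)
```

Once loaded like this, the coordinate columns let each value be joined back to its Output Area or building footprint in a desktop GIS.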

Energy Use policy scenario

Fig. 4 – KML files viewed in Google Earth showing Energy Use model output values for policy scenario 3 (i.e. the government invests in decarbonisation of the grid through renewables and nuclear, plus huge investment in energy efficiency and small-scale renewables; large-scale change in occupant behaviour).

Fig. 5 – Model output for individual buildings

Model output for individual buildings

Fig. 5 – Forecasted model output values (SAP rating, Energy use, CO2 emissions, CO2 emissions based on 1990 levels) for an individual building in 2030.

Note: click on a blue dot and select the Buildings map layer.

Members of the STEEV project presented at the following events:

  • STEEV / GECO Green Energy Tech Workshop at the Edinburgh Centre on Climate Change (13 October 2011) – for further details see blog post
  • Post-event comments include:

    “STEEV provides a new simple tool to quickly visualise a series of scenarios concerning energy consumption and carbon emissions within the complexities of the urban fabric. By facilitating the visual and historical understanding of these issues in a wider area, and for its forecasting capability considering a series of energy efficiency variables, it has a great potential to assist the planning and design processes.” – Cristina Gonzalez-Longo (School of Architecture, University of Edinburgh)

    “The STEEV system’s geospatial information on energy consumption and CO2 emissions can help planners and project developers target projects and initiatives related to energy efficiency and reduction of carbon emissions. Furthermore, the forecasting tools built into STEEV enable energy and carbon emissions to be estimated through to 2050 on the basis of alternative scenarios for energy efficiency initiatives, renewable energy, etc. This facility should help to determine where the opportunities for future emissions reductions will be, and the contributions made by existing policies and plans to future (e.g. 2020 and 2050) emissions reduction targets.” – Jim Hart (Business Manager, Edinburgh Centre for Carbon Innovation)

  • The Low Carbon Research Institute 3rd Annual Conference held at the National Museum of Wales on 15-16 November 2011
  • Post-Industrial Transformations – sharing knowledge and identifying opportunities, a two-day architectural symposium held at the Welsh School of Architecture on 22-23 November 2011

The STEEV demonstrator is a JavaScript client application which uses OpenLayers as the mechanism for displaying the map data over the web. It also deploys a Web Map Service with temporal querying capabilities (WMS-T) to deliver Ordnance Survey open mapping products via the Digimap OpenStream API. The modelled energy efficiency variables are held in PostGIS (an open source spatial database extension to PostgreSQL).
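To illustrate the temporal querying that a WMS-T adds to a plain WMS, here is a minimal Python sketch that builds a GetMap request carrying a TIME parameter. The endpoint, layer name and parameter values are hypothetical; the actual service URL, layer names and supported WMS version may differ.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, time, width=512, height=512):
    """Build a WMS GetMap request URL carrying a TIME parameter (WMS-T)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:27700",  # British National Grid
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TIME": time,  # the modelled point in time to render
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = getmap_url("https://example.edina.ac.uk/steev/wms",
                 "energy_efficiency",
                 (270000, 190000, 280000, 200000),
                 "2030")
```

The client only has to vary the TIME value to step the map through the modelled decades, which is what makes a time-slider interface straightforward to drive from a WMS-T back end.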

Data – Open Database License (ODC-ODbL) — “Attribution Share-Alike for data/databases”
Code – GNU General Public License version 3.0
Blog & other website content – Creative Commons Attribution 3.0 Unported License

Table of Contents of Blog Posts:

Project Logos:

combined logos of EDINA, JISC, WSA

Project Team:

STEEV Project Team

EDINA team members (L to R: Lasma Sietinsone, George Hamilton, Stuart Macdonald, Nicola Osborne. Fiona Hemsley-Flint is currently on maternity leave.)

Simon Lannon: Project partner from Welsh School of Architecture, Cardiff University:

The end of STEEV is (almost) nigh ..

Following on from the STEEV Usability Report recommendations and user feedback, a number of requested features, functionality improvements, bug fixes and tweaks have been committed to the EDINA Redmine bugtracker with a view to implementation prior to the JISC GeoTools day. The resource required comes in at around 70 hours, which is more than double the original estimate. This is due in part to the requirement to implement ‘feature return’ functionality at the polygon level, whereby a user can click on an individual house and the features associated with it are made explicit (SAP rating, CO2 emissions, Energy Use etc.). Our GI Analyst has already facilitated this by preparing a configuration file for the STEEV WFS in order to query individual buildings; however, there are interface and MapServer development requirements remaining.

A decision will be made shortly regarding developer resource in order to implement said changes.

As preparation gets underway for the forthcoming JISC Geo Tools day (28/29 Nov.), STEEV have produced a postcard which will be distributed at the event and made available to our project partner at the Welsh School of Architecture for further outreach opportunities. Feel free to send the postcard digitally to interested colleagues.