STEEV Final Product Post

This blog post provides details about the web tool developed by the STEEV project.

Problem Space:

  • The UK government is committed to reducing the country’s carbon emissions by 80% by 2050.
  • Buildings account for 45% of energy use in the UK, the equivalent of all transport and manufacturing combined (ESRC, 2009).
  • Most of the building stock that will exist in 2050 has already been built.
  • To achieve this target, major alterations to the existing building stock are required. Part of the solution is a tool that enables planners, local authorities and government to estimate the impact of policy changes and to target interventions appropriately.

Cue the STEEV demonstrator – a stakeholder engagement tool developed to visualise spatio-temporal patterns of modelled energy use and efficiency outcomes for the period 1990-2050 – http://steevsrv.edina.ac.uk/

For a portable overview of the project, download the STEEV postcard.

Primary Users:

Students, researchers and lecturers from a wide variety of disciplines/sub-disciplines, including geography, architecture, ecology, environmental science, economics, energy engineering and management.

The tool is also aimed at a range of stakeholders such as policy makers, urban developers, climate change specialists, carbon energy analysts and town planners.

Key Product Information – motivations and mechanisms

The STEEV demonstrator was developed to complement a larger project, Retrofit 2050 – Re-Engineering the City 2020-2050: Urban Foresight and Transition Management (EPSRC EP/I002162/1) – which aims, through engagement with a range of stakeholders, to gain a clearer understanding of how urban transitions can be undertaken to achieve UK and international targets to reduce carbon emissions. The Retrofit 2050 project focuses on two large urban case study areas (Manchester and Neath/Port Talbot, South Wales – the latter being the focus of the STEEV demonstrator due to data availability within the project time-frame), modelling scenarios of carbon emissions and energy use both now and in the future.

The demonstrator itself is a client web application that enables researchers and stakeholders to look at how the spatial and temporal distribution of energy efficiency measures may affect likely regional outcomes for a given future state. It takes the form of a spatio-temporal exploration and visualisation tool for building-level energy efficiency modelling outputs, such as the energy rating of a building, its likely energy demand and the related CO2 emissions. A finite series of modelled scenario permutations has been ‘pre-built’, providing a limited number of parameters that can be interactively altered in order to explore the spatio-temporal consequences of various policy measures.
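A rough way to picture the ‘pre-built’ approach is as a lookup keyed on scenario, model output and decade. The minimal Python sketch below is purely illustrative – the scenario names, Output Area codes and values are invented, not the demonstrator’s actual data model:

```python
# Hypothetical sketch of how pre-built model outputs might be keyed and retrieved.
# Scenario names, Output Area codes and values are illustrative only.

PREBUILT_OUTPUTS = {
    # (scenario, model_output, decade) -> mapping of Output Area code to value
    ("low_carbon_reference", "co2_emissions", 2010): {"W00000001": 4.2, "W00000002": 3.8},
    ("low_carbon_reference", "energy_use", 2020): {"W00000001": 11500.0, "W00000002": 9800.0},
}

def lookup(scenario: str, model_output: str, decade: int) -> dict:
    """Return per-Output-Area values for one pre-built scenario permutation."""
    try:
        return PREBUILT_OUTPUTS[(scenario, model_output, decade)]
    except KeyError:
        raise ValueError("No pre-built permutation for that combination") from None

print(lookup("low_carbon_reference", "co2_emissions", 2010))
```

Because every permutation is pre-computed, the client only ever has to fetch and style a ready-made layer for the chosen combination, which is presumably what keeps the interaction responsive.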

View the STEEV Demonstrator website: http://steevsrv.edina.ac.uk/

Note: A further workpackage to establish a small area data viewer as part of the presentation layer will also be implemented shortly. This replaces the Memento geo-Timegate component of Workpackage 3.

The user interface has two main areas of activity, namely:

  • three ‘pre-built’ policy scenarios which depict government investment in energy efficiency measures (from best- to worst-case scenario), plus a user-generated scenario created by selecting a combination of the energy efficiency variables that make up the ‘pre-built’ scenarios.
  • a map viewer that enables model output values (SAP rating, energy use, CO2 emissions) for each scenario to be viewed for each decade (1990 to 2050) at Output Area level of spatial granularity.

Further information about the policy scenarios and variable descriptions is available from the help page.

Fig. 1 – The STEEV Demonstrator (STEEV tool interface)

Fig. 2 – Policy Scenario 2 – Low Carbon Reference

Fig. 2 – Policy Scenario 2 – Low Carbon Reference (i.e. the government invests in partial decarbonisation of the grid through reduced dependence on fossil fuels, large investment in energy efficiency and small-scale renewables, and some change in occupant behaviour) has been selected for 2010, with CO2 emissions chosen as the model output value.

Fig. 3 – User-generated Scenario


Fig. 3 – A zoomed-in view of a user-generated scenario for energy use in 2020. Note: user-generated scenarios are forecast only.

Fig. 4 – Policy Scenario 3 – Google Earth Time Slider

Energy efficiency data can be downloaded as Keyhole Markup Language (KML) files for use with the Google Earth Time Slider (for ‘pre-built’ scenarios only – see below) or as raw ASCII files complete with spatial reference for analysis in a Geographic Information System.
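To give a sense of what time-aware KML involves, the sketch below writes a minimal KML file with TimeStamp elements, which is the mechanism the Google Earth time slider keys on. The placemark names, coordinates and values are placeholders rather than the project’s actual output:

```python
# Minimal sketch: write a KML file with time-stamped placemarks so that
# Google Earth's time slider can step through decades. All values are placeholders.
placemarks = [
    {"name": "Output Area W00000001", "lon": -3.78, "lat": 51.60, "year": 2010, "energy_use": 11500},
    {"name": "Output Area W00000001", "lon": -3.78, "lat": 51.60, "year": 2020, "energy_use": 9800},
]

kml_parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
for p in placemarks:
    kml_parts.append(
        "<Placemark>"
        f"<name>{p['name']} ({p['year']})</name>"
        f"<description>Energy use: {p['energy_use']} kWh</description>"
        f"<TimeStamp><when>{p['year']}</when></TimeStamp>"
        f"<Point><coordinates>{p['lon']},{p['lat']},0</coordinates></Point>"
        "</Placemark>")
kml_parts.append("</Document></kml>")

with open("energy_use.kml", "w", encoding="utf-8") as f:
    f.write("".join(kml_parts))
```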


Fig. 4 – KML files viewed in Google Earth for energy use model output values for Policy Scenario 3 (i.e. the government invests in decarbonisation of the grid through renewables and nuclear, with huge investment in energy efficiency and small-scale renewables, and large-scale change in occupant behaviour).

Fig. 5 – Model output for individual buildings


Fig. 5 – Forecasted model output values (SAP rating, Energy use, CO2 emissions, CO2 emissions based on 1990 levels) for an individual building in 2030.

Note: click on a blue dot and select the Buildings map layer.

Engagement:
Members of the STEEV project presented at the following events:

  • STEEV / GECO Green Energy Tech Workshop at the Edinburgh Centre on Climate Change (13 October 2011) – for further details see blog post
  • Post-event comments include:

    “STEEV provides a new simple tool to quickly visualise a series of scenarios concerning energy consumption and carbon emissions within the complexities of the urban fabric. By facilitating the visual and historical understanding of these issues in a wider area, and for its forecasting capability considering a series of energy efficiency variables, it has a great potential to assist the planning and design processes.” – Cristina Gonzalez-Longo (School of Architecture, University of Edinburgh)

    “The STEEV system’s geospatial information on energy consumption and CO2 emissions can help planners and project developers target projects and initiatives related to energy efficiency and reduction of carbon emissions. Furthermore, the forecasting tools built into STEEV enables energy and carbon emissions to be estimated through to 2050 on the basis of alternative scenarios for energy efficiency initiatives, renewable energy, etc. This facility should help to determine where the opportunities for future emissions reductions will be, and the contributions made by existing policies and plans to future (e.g. 2020 and 2050) emissions reduction targets.” – Jim Hart (Business Manager, Edinburgh Centre for Carbon Innovation)

  • The Low Carbon Research Institute 3rd Annual Conference held at the National Museum of Wales on 15-16 November 2011
  • Post-Industrial Transformations – sharing knowledge and identifying opportunities, a two-day architectural symposium held at the Welsh School of Architecture on 22-23 November 2011

Technologies:
The STEEV demonstrator is a JavaScript client application which uses OpenLayers as the mechanism for displaying the map data over the web. It also deploys a Web Map Service with temporal querying capabilities (WMS-T) to deliver Ordnance Survey open mapping products via the Digimap OpenStream API. The modelled energy efficiency variables are held in PostGIS (an open source spatial database extension to PostgreSQL).
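For illustration, a WMS-T request is simply a standard WMS GetMap request with an added TIME parameter. The sketch below shows the general shape of such a request; the service URL, layer name and bounding box are placeholders, not the actual STEEV or Digimap OpenStream endpoints:

```python
# Sketch of a WMS-T GetMap request; the endpoint, layer and bounding box are
# illustrative placeholders, not the real Digimap OpenStream or STEEV services.
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, time, size=(800, 800)):
    """Build a GetMap URL; TIME is the extra dimension that makes it WMS-T."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:27700",              # British National Grid
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
        "TIME": time,                     # e.g. a decade such as "2010"
    }
    return f"{base_url}?{urlencode(params)}"

url = getmap_url("https://example.org/wms",            # placeholder endpoint
                 "co2_emissions",                      # hypothetical layer name
                 (270000, 190000, 290000, 210000),
                 "2010")
print(url)
# With a real endpoint, the image could then be fetched, e.g.:
# from urllib.request import urlopen
# with urlopen(url) as resp, open("co2_2010.png", "wb") as out:
#     out.write(resp.read())
```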

Licences:
Data – Open Database License (ODC-ODbL) – “Attribution Share-Alike for data/databases”
Code – GNU General Public License version 3.0
Blog & other website content – Creative Commons Attribution 3.0 Unported License

Table of Contents of Blog Posts:

Project Logos:

combined logos of EDINA, JISC, WSA

Project Team:

STEEV Project Team

EDINA team members (L to R: Lasma Sietinsone, George Hamilton, Stuart Macdonald, Nicola Osborne; Fiona Hemsley-Flint is currently on maternity leave).

Simon Lannon: project partner from the Welsh School of Architecture, Cardiff University.

Meeting with EDINA and DCC staff in Edinburgh

I was fortunate enough to have a meeting with some people from EDINA and the DCC in Edinburgh on Wednesday. The aim of the meeting was to get some input and advice from experts on the ideas I have for a spatial data management best practice report. So a big thank you to Martin Donnelly of the Digital Curation Centre (DCC), and to James Reid, Stuart Macdonald, Chris Higgins and Michael Koutroumpas from EDINA.

I had a long seven-hour train journey from Aberystwyth, so my apologies for the overdose of PowerPoint slides that I had time to create before the meeting. It was extremely helpful to talk to experienced and knowledgeable people about the direction of the report, which is one of our outputs from the IGIBS project. My background in environmental science leaves a few significant gaps in my knowledge and, as Chris put it, “a sanity check” on my work was well worth the time needed to attend the meeting. I even had the opportunity for an evening walk on Arthur’s Seat and a lunch hour looking around Edinburgh as a bonus.

Some of the key advice from the meeting centred around the following: INSPIRE and how it will or won’t impact on universities; insights into the not so obvious but very significant benefits of writing a data management plan and where it fits into good data management; some great pointers to other studies and sources of information that will feed into the report; the need to make the report easily accessible to its audience; and some great institutional case study examples from Australian through Californian to British universities.

Another theme that emerged from the discussion was how INSPIRE and the need for good data management can be viewed as a threat, but they are also a great opportunity for academic staff to gain easier access to the ever-increasing amounts of spatial data being created around the globe – a viewpoint that will help to make the report more appealing to time-starved researchers.

We also talked about semantics and just what you call a spatial data infrastructure (if you don’t want to use SDI). It was suggested that UK Location has moved towards “Location Information Infrastructure” as a way of making the SDI label more intelligible to the uninitiated. I found this much more enlightening and useful than the recent update from UK Location on “Data Things” and abstracted “Data Objects”, but a few hours of digestion may make that a little more understandable to my irretrievably ecologically orientated mind. It reminded me of some reading I had done about old Norse governance and how their assembly was called the “Thing” and met in the “Thingstead”. I remember thinking that they didn’t have a proper word for it so just called it the “Thing”, but I guess that just shows how language develops over time, and maybe we can look back at SDI in a few years with the benefit of a really useful label for it, whatever that may be.

As a result of the meeting I am rewriting some sections I had drafted and adding some new summary sheets for subsections of the intended audience; more importantly, I don’t feel like my original thinking was miles off the mark, just a bit under-informed and lacking some focus. So creating the rest of the report will also be made a little easier once I have digested the new material I have been pointed towards.

So thank you once more gentlemen and I look forward to meeting you again if the occasion arises.

Steve

Final blog post – Walking Through Time

Walking Through Time closing blog

This is a final, closing blog post that summarises the project’s achievements under JISC follow-on funding (Rapid Innovation – Benefits Realisation Small Project Funding).

The WTT project has taken many turns in its two-year history and has forged many links that continue to demonstrate the rich nature of both the idea behind it and the collaboration that made it happen.

As of July 2011 the iPhone app had been downloaded over 9,000 times, and it continues to attract attention with limited marketing.

The App site is here:

http://www.walkingthroughtime.co.uk

The link to the app in the Apple App Store is here:

http://itunes.apple.com/gb/app/walking-through-time-edinburgh/id381528712?mt=8

The app was launched at the end of July 2010 in time for the Edinburgh International Festivals. At this time the project team had secured an agreement with the Landmark Information Group to allow free public access to their historical maps for a period covering the festival. This made the free app very attractive because, along with three maps from the National Library of Scotland, it offered multiple historical maps of Edinburgh.

In addition to the maps, two guided tours were included that featured audio files embedded within the app:

1. Margaret Stewart, a historian at Edinburgh College of Art, provided a very personal narrative on the history of places surrounding the Royal Mile.

2. The Edinburgh World Heritage Trust recorded a selection of narratives by two Scottish actors to extend their popular House Histories trail.

The tours appear as trails linked between landmarks on the map. Each landmark is identified by a pin on the map, and touching/clicking the pin gives access to text and an audio file (where available).
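Conceptually, a trail of this kind is just an ordered list of landmarks, each with a pin location, some text and an optional audio file. The sketch below is a hypothetical representation of that structure, not the app’s actual internal format:

```python
# Hypothetical sketch of a guided-tour trail: ordered landmarks with a pin
# location, descriptive text and an optional audio file. Values are illustrative.
trail = {
    "title": "Royal Mile tour",  # illustrative title
    "landmarks": [
        {"name": "Edinburgh Castle", "lat": 55.9486, "lon": -3.1999,
         "text": "Short history of the site.", "audio": "castle.mp3"},
        {"name": "Canongate", "lat": 55.9512, "lon": -3.1770,
         "text": "Notes on the old burgh.", "audio": None},  # no audio available
    ],
}

for stop in trail["landmarks"]:
    has_audio = "with audio" if stop["audio"] else "text only"
    print(f'{stop["name"]} ({stop["lat"]}, {stop["lon"]}) - {has_audio}')
```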

Gallery of the working app:

Education and Cultural impact

The app has made a good impact on the education and cultural communities, and WTT was presented through an invited lecture and accompanying workshop at the Digital Futures of Cultural Heritage Education (DFCHE) symposium at the University of Edinburgh in March 2011. The DFCHE project was funded by the Royal Society of Edinburgh and led in collaboration by the School of Education at the University of Edinburgh, the Royal Commission on the Ancient and Historical Monuments of Scotland and the National Galleries of Scotland. DFCHE had two specific aims: 1. To begin to establish a research agenda for museum and gallery education for the digital age; 2. To inform policy and practice in the use of social media and user-generated content by the Scottish cultural heritage sector.

Business / Spin out

Following the immediate success of the app during the Edinburgh Festival 2010, the project team began conversations with Landmark Information Group about developing a UK version of the app which would allow the public to walk/drive across the UK using an 1850 map. The prospect of finding out what lies under the M1 as one drives to London was particularly exciting, and this idea, coupled with the encouraging download statistics from the festival, led to a series of conversations about a fully licensed product.

Landmark remains keen to develop a product; however, concerns over the pricing framework have meant that discussions have since stalled. In order to justify the release of UK-wide maps, the company would have to charge a significant price for the app (something in the region of £4 for a single map).

More recently, following a presentation of the app at the Scottish Technology Showcase (7th June, SECC, Glasgow), interest has turned to developing international language versions of the app for Edinburgh tourists. Since the app capitalises upon the free maps made available by the National Library of Scotland, and 60% of visitors to Edinburgh are international, developing foreign language versions may be a better way of capitalising on what the project team has achieved.

Invited Talks

The app has attracted a great deal of attention across academic communities for a number of reasons:

1. For the GIS community, the very user-centred approach and its simplicity as an iPhone app have given the team access to discussions about how GIS technologies can reach new audiences.

2. The museum community has embraced the App as a model demonstrator of novel audience engagement that connects historical data with contemporary media.

3. The IT / HCI community enjoys its critical design approach – the turn toward using old maps as opposed to adopting new cutting-edge technology.

These connections and interests have led to a range of invited talks and presentations in which WTT was discussed in the context of multi-disciplinary production and agile development:

SACHI: the St Andrews Computer Human Interaction research group

St. Andrews University

Invited research seminar: 29th March 2011

http://sachi.cs.st-andrews.ac.uk/activities/seminars/

Learning Sciences Research Institute, Nottingham University

Invited research seminar: 11th January 2011

http://portal.lsri.nottingham.ac.uk/Seminars/Lists/Events/Archived%20Events.aspx

The Royal Commission on the Ancient and Historical Monuments of Scotland (RCAHMS), Edinburgh

Invited research seminar: 8th August 2011

http://www.rcahms.gov.uk

Conference presentations

The App has also been presented as part of a series of conference presentations and papers:

The Digital Landscape Architecture conference, at Anhalt University of Applied Sciences, Germany. 26th to 30th May, 2010.

Speed, C. and Southern, J. (2010) Handscapes – Reflecting upon the Use of Locative Media to Explore Landscapes.

http://www.kolleg.loel.hs-anhalt.de/landschaftsinformatik/436.html

Also published in the conference proceedings:

http://www.abebooks.co.uk/9783879074914/Peer-Reviewed-Proceedings-Digital-Landscape-3879074917/plp

Mapping the City in Film: a Geo-historical Analysis.

An International Interdisciplinary Conference
School of Architecture / School of Politics and Communication Studies

University of Liverpool. 24th -26th February 2010.

http://www.liv.ac.uk/lsa/cityinfilm/index.html

Speed, C. (2010) Walking Through Time: Use of Locative Media to Explore Historical Maps.

The Digital Humanities 2011 conference

Stanford University Library, Stanford. 19th – 22nd June 2011.

https://dh2011.stanford.edu/?page_id=3

Co-organised Panel: Virtual Cities/Digital Histories featuring papers by:

Robert C. Allen, Natasha Smith, Pamella Lach, Richard Marciano, Chris Speed, Todd Presner, Philip Ethington, David Shepard, Chien-Yi Hou, & Christopher Johanson

Speed, C. (2011) Walking Through Time and Tales of Things.

https://dh2011.stanford.edu/wp-content/uploads/2011/05/DH2011_BookOfAbs.pdf

The App was also presented at the Scottish Technology Showcase, Scottish Exhibition and Conference Centre, Glasgow, 7th June 2011.

http://www.scottish-enterprise.com/microsites/technologyshowcase.aspx

Awards

ALISS (Association of Librarians and Information Professionals in the Social Sciences) is a not-for-profit unincorporated professional society. It is an independent group which was formed in April 2005 by the former committee of ASSIGN (Aslib Social Science Information Group and Network).

http://www.alissnet.org.uk/

The Walking Through Time article/paper published in ALISS Quarterly was nominated for the first prize (£50):

Chris Speed, “Walking Through Time”, ALISS Quarterly, Vol. 5, No. 3, April 2010, ISSN 1747-9258.

http://www.alissnet.org.uk/uploadedFiles/Aliss_Quarterly/completeproofapril2010.pdf

Network Activity

Following the conference presentation at Mapping the City in Film: a Geo-historical Analysis in Liverpool in 2010, Speed was invited to consult and become a member of an AHRC/BT funded research network.

Through a series of meetings the network established a small but critical community who offered expert inquiry into cultural opportunities for GIS and new media to engage with historical documents / material.

http://www.liv.ac.uk/lsa/cityinfilm/intro.html

Chapters

Research from the WTT project has informed two book chapters:

Mapping Cultures, published by Palgrave Books

Edited by Les Roberts

Chris Speed: Walking Through Time: Use of Locative Media to Explore Historical Maps

Due early 2012

Heritage and Social Media: Understanding and Experiencing Heritage in a Participatory Culture, published by Routledge books

Edited by Elisa Giaccardi

Chris Speed: Mobile Ouija Boards

Due early 2012

Link to code repository or API:

Source Forge Site: http://sourceforge.net/projects/walkthrutime/

Download the Xcode project here.

Project Team:

Chris Speed, c.speed@eca.ac.uk – Edinburgh College of Art

Ian Campbell, i.campbell@eca.ac.uk – Edinburgh College of Art

Karlyn Sutherland, karlyn_sutherland@hotmail.co.uk, Edinburgh College of Art

Dave Berry, Dave.Berry@ed.ac.uk – Information Systems, University of Edinburgh

Peter Pratt, Peter.Pratt@ed.ac.uk – Information Systems, University of Edinburgh

Petra Leimlehner, Petra.Leimlehner@ed.ac.uk – Information Systems, University of Edinburgh

Jeff Haywood, Jeff.Haywood@ed.ac.uk – Information Systems, University of Edinburgh

James Reid, James.Reid@ed.ac.uk – EDINA

Tim Urwin, T.Urwin@ed.ac.uk – EDINA

Project Website:
◦    http://walkingthroughtime.eca.ac.uk/

PIMS entry:
◦    https://pims.jisc.ac.uk/projects/view/1718


Final Product Post: Chalice: past places and use cases

This is our “final product post” as required by the #jiscexpo project guidelines. Image links somehow got broken; they are fixed now, so please re-view.

Chalice – Past Places

Chalice is for anyone working with historic material – be that archives of records, objects, or ideas. Everything happens somewhere. We aimed to provide a historic place-name gazetteer covering a thousand years of history, linked to attestations in old texts and maps.

Place-name scholarship is fascinating; looking at names, a scholar can describe the lay of the land and see political developments. We would like to pursue further funding to work with the English Place-Name Survey on an expert-crowdsourced service consuming the other 80+ volumes and extracting the detailed information – etymology, field-names.

Linked to other archival sources, the place-name record has the potential to reveal connections between them, and in turn feed into deeper coverage in the place-name survey.

There is a Past Places browser to help illustrate the data and provide a Linked Data view of it.
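To give a flavour of what a Linked Data view of a gazetteer entry might look like, the sketch below uses Python’s rdflib to describe one place and link it to a geonames.org resource. The namespace, vocabulary choices, sample place and identifiers are illustrative assumptions, not the actual Chalice data model:

```python
# Illustrative sketch (not the actual Chalice vocabulary): one gazetteer entry
# with a name, a note about a historic form, and a link to geonames.org.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDFS, OWL

CHALICE = Namespace("http://example.org/chalice/place/")  # placeholder namespace
GEONAMES = "http://sws.geonames.org/0000000/"             # placeholder geonames URI

g = Graph()
g.bind("rdfs", RDFS)
g.bind("owl", OWL)

place = CHALICE["nantwich"]                               # hypothetical Cheshire place
g.add((place, RDFS.label, Literal("Nantwich", lang="en")))
g.add((place, RDFS.comment, Literal("Historic attested form (illustrative only).")))
g.add((place, OWL.sameAs, URIRef(GEONAMES)))              # cross-link to another gazetteer

print(g.serialize(format="turtle"))
```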

Stuart Dunn did a series of interviews and case studies with different archival sources, making suggestions for integration. The report on our use case for the Clergy of the Church of England Database may be found here, and that on our study of the Victoria County History is here. We also had valuable discussions with the Archaeology Data Service, which were reported in a previous post.

Rather than take a classical ‘user needs’ approach, targeting groups such as historians, linguists and indeed place-name scholars, we decided to look in detail at other digital resources containing reference material. This allowed us to start considering various ways in which a digitised, linkable EPNS could be automatically related to such resources. The problems are not only the ones we anticipated, of usability and semantic crossover between the place-name variants listed in EPNS and elsewhere, but also ones of data structure, domain terminology and the relationship of secondary references across such corpora. We hope these considerations will help inform future development of place-name digitisation.

Project blog

This covers the work of the four partners in the project.

CeRch at KCL developed use cases through interviews with maintainers of different historic sources. There are blog descriptions of conversations with the Clergy of the Church of England Database, the Victoria County History and the Archaeology Data Service.

LTG did some visualisations for these use cases and, more substantially, text mined the semi-structured text of different sample volumes of the English Place Name Survey.

The extraction of corrected text from previously digitised pages was done by CDDA in Belfast. There is a blog report on the final quality of the work; however, the full resulting text is neither openly licensed nor distributed through Chalice.

EDINA took care of project management and software development. We used the opportunity to try out a Scrum-style “sprint” way of working with a larger team.

TOC of the project blog – here is an Atom feed of all the project blog posts; they are categorised by project partner.

Project tag: chaliced

Full project name: Connecting Historical Authorities with Links, Contexts and Entities

Short description: Creating and re-using a linked data historic gazetteer through text mining.

Longer description: Text mining volumes of the English Place Name Survey to produce a Linked Data historic gazetteer for areas of England, which can then be used to improve the quality of georeferencing in other archives. The gazetteer is linked to other placename sources on the Linked Data web via geonames.org and Ordnance Survey Open Data. Intensive user engagement with archive projects that can benefit from the open data gazetteer and open source text mining tools.

Key deliverables: Open source tools for text mining archives; Linked open data gazetteer, searchable through JISC’s Unlock service; studies of further integration potential.

Lead Institution: University of Edinburgh

Person responsible for documentation: Jo Walsh

Project Team: EDINA: Jo Walsh (Project Manager), Joe Vernon (Software Developer), Jackie Clark (UI design), David Richmond (Infrastructure), CDDA: Paul Ell (WP1 Coordinator), Elaine Yates (Administration), David Hardy (Technician), Karleigh Kelso (Clerical), LTG: Claire Grover (Senior Researcher), Kate Byrne (Researcher), Richard Tobin (Researcher), CeRch: Stuart Dunn (WP3 Coordinator).

Project partners and roles: Centre for Data Digitisation and Analysis, Belfast – preparing digitised text, Centre for e-Research, Kings College London – user engagement and dissemination, Language Technology Group, School of Informatics, Edinburgh – text mining research and tools.

This is the Chalice project blog and you can follow an Atom feed of blog posts (there are more to come).

The code produced during the Chalice project is free software; it is available under the GNU Affero GPL v3 license. You can get the code from our project sourceforge repository. The text mining code is available from LTG – please contact Claire Grover for a distribution…

The Linked Data created by text mining volumes of the English Place Name Survey – mostly covering Cheshire – is available under the Open Database License, a share-alike license for data from Open Data Commons.

The contents of this blog itself are available under a Creative Commons Attribution-ShareAlike 3.0 Unported license.

 


Link to technical instructional documentation

Project started: July 15th 2010
Project ended: April 30th 2011
Project budget: £68054


Chalice was supported by JISC as a project in its #jiscexpo programme. See its PIMS project management record for information about where responsibility fits in at JISC.

Talk to us about JISC 06/11

Glad to hear that Unlock has been cited in the JISC 06/11 “eContent Capital” call for proposals.

The Unlock team would be very happy to help anyone fit a beneficial use of Unlock into their project proposal. This could feature the Unlock Places place-name and feature search; and/or the Unlock Text geoparser service which extracts place-names from text and tries to find their locations.

One could use Unlock Text to create Linked Data links to geonames.org or Ordnance Survey Open Data. Or use Unlock Places to find the locations of postcodes; or find places within a given county or constituency…
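As a rough sketch of what calling Unlock Places from code might look like – note that the base URL, method name and parameter names below are assumptions for illustration only; check the Unlock API documentation for the real interface:

```python
# Hedged sketch of querying a place-name search API such as Unlock Places.
# The base URL, path and parameter names here are assumptions, not the
# documented interface; consult the Unlock API docs for the actual values.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "http://unlock.edina.ac.uk/ws"   # assumed base URL; verify before use

def search_places(name, fmt="json"):
    """Look up a place name and return the parsed response (illustrative only)."""
    query = urlencode({"name": name, "format": fmt})
    with urlopen(f"{BASE_URL}/nameSearch?{query}") as resp:   # assumed method name
        return json.loads(resp.read().decode("utf-8"))

# Example call (would only work if the assumed endpoint matches the live service):
# results = search_places("Edinburgh")
# print(results)
```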

Please drop an email to jo.walsh@ed.ac.uk, or look up metazool on Skype or Twitter, to chat about how Unlock fits with your proposal for JISC 06/11 …

Unlock in use

It would be great to hear from people about how they are using the Unlock place search services. So you’re encouraged to contact us and tell us how you’re making use of Unlock and what you want out of the service.
screenshots from Molly, Georeferencer
Here are some of the projects and services we’ve heard about that are making interesting use of Unlock in research applications.

The Molly project, based at the University of Oxford, provides an open source mobile location portal service designed for campuses. Molly uses some Cloudmade services and employs Unlock for postcode searching.

Georeferencer.org uses Unlock Places to search old maps. The service is used by the National Library of Scotland Map Library and other national libraries in Europe. More on the use of Unlock Places by georeferencer.org.

CASOS at CMU has been experimenting with the Unlock Text service to geolocate social network information.

The Open Fieldwork project has been georeferencing educational resources: “In exploring how we could dynamically position links to fieldwork OER on a map, based on the location where the fieldwork takes place, one approach might be to resolve a position from the resource description or text in the resource. The OF project tried out the EDINA Unlock service – it looks like it could be very useful.”

We had several interesting entries to 2010’s dev8d developer challenge using Unlock:

Embedded GIS-lite Reporting Widget:
Duncan Davidson, Informatics Ventures, University of Edinburgh
“Adding data tables to content management systems and spreadsheet software packages is a fairly simple process, but statistics are easier to understand when the data is visual. Our widget takes geographic data – in this instance data on Scottish councils – passes it through EDINA’s API and then produces coordinates which are mapped onto Google. The end result is an annotated map which makes the data easier to access.”

Geoprints, which also works with the Yahoo Placemaker API, by
Marcus Ramsden at Southampton University.
“Geoprints is a plugin for EPrints. You can upload a pdf, Word document or Powerpoint file, and it will extract the plain text and send it to the EDINA API. GeoPrints uses the API to pull out the locations from that data and send it to the database. Those locations will then be plotted onto a map, which is a better interface for exploring documents.”

Point data in mashups: moving away from pushpins in maps:
Aidan Slingsby, City University London
“Displaying point data as density estimation services, chi surfaces and ‘tagmaps’. Using British placenames classified by generic form and linguistic origin, accessed through the Unlock Places API.”

The dev8d programme for 2011 is being finalised at the moment and should be published soon; the event this year runs over two days, and should definitely be worth attending for developers working in, or near, education and research.

CHALICE: Our Budget

This is the last of the seven blog posts we were asked to complete as participants in a #jiscexpo project. I like the process. This is a generalised version of our project budget. More than half goes to the preparation and annotation of digitised text from scans, both manually and using named entity recognition tools.

The other half is for software development and user engagement; we hope to work closely together here. Of course we hope to over-deliver. We also have a small amount allocated for people to travel to a workshop. There’s another, independently supported JISC workshop planned to happen at EPNS on September 3rd.

Institution (Apr 2010 – Mar 2011):
  • EDINA National Datacentre, University of Edinburgh (project management, design, software development) – £21,129
  • Language Technology Group, School of Informatics, University of Edinburgh (text mining archival work, named entity recognition toolkit development) – £19,198
  • Centre for Data Digitisation and Analysis, Queen’s University Belfast (preparation of corrected digitised texts for use in archival text mining – the EPNS in a set schedule of volumes) – £15,362
  • Centre for e-Research, Kings College London (backup project management, user needs and use case gathering, interviews, dissemination) – £12,365
Amount requested from JISC: £68,054

CHALICE: Team Formation and Community Engagement

Institutional and Collective Benefits describes who, at an institutional level, is engaged with the CHALICE project. We have three work packages split across four institutions – the Centre for Data Digitisation and Analysis at Queen’s University Belfast; the Language Technology Group at the School of Informatics, and the EDINA National Datacentre, both at the University of Edinburgh; and the Centre for e-Research at Kings College London.

The Chalice team page contains more detailed biographical data about the researchers, developers, technicians and project managers involved in putting the project together.

The community engagement aspect of CHALICE will focus on gathering requirements from the academic community on how a linked data gazetteer would be most useful to historical research projects concerned with different time periods. Semi-structured interviews will be conducted with relevant projects, and the researchers involved will be invited to critically review existing gazetteer services, such as geonames, with a view to identifying how they could get the most out of such a service. This will apply the same principles, based loosely on the methodology employed by the TEXTvre project. The project will also seek to engage with providers of services and resources. CHALICE will be able to enhance such resources, but also link them together: in particular the project will collaborate with services funded by JISC to gather evidence as to how these services could make use of the gazetteer. A rapid analysis of the information gathered will be prepared, and a report published within six months of the project’s start date.

When a first iteration of the system is available, we will revisit these projects and develop brief case studies that illustrate practical instances of how the resource can be used.

The evidence base thus produced will substantially inform design of the user interface and the scoping and implementation of its functionalities.

Gathering this information will be the responsibility of project staff at CeRch.

We would love to be more specific at this point about exactly which archive projects will work with CHALICE; but a lot will depend both on the spatial focus of the gazetteer and on the investigation and outreach during the course of the project. So we have half a dozen candidates in mind right now, but the detailed conversations and investigations will have to wait some months… see the next post on the project plan describing when and how things will happen.

CHALICE: The Plan of Work

DRAFT

GANTT-like chart showing the interconnection between different work packages and participants in CHALICE – not a very high-quality scan, sorry. When there are shifts and revisions in the workplan, Jo will rub out the pencil markings and scan the chart in again, but clearer this time.

As far as software development goes, we aspire to do Scrum, though given the resources available it will be more of a ‘Scrum-but’. Depending on how many people we can get to Scrum, we may have to compress the development schedule in the centre – spike one week, deliver the next, pretty much – then have an extended maintenance and integration period with just one engineer involved.

The preparation of structured versions of digitised text with markup of extracted entities will be more of a long slog, but perhaps I can ask CDDA and LTG to write something about their methodologies.

The use case gathering and user engagement parts of the project will build on the approach already used in the TEXTvre project.


CHALICE: Open Licensing

Software

We commit to making source code produced as part of CHALICE available under a free software license – specifically, the GNU Affero General Public License Version 3. This is the license that was suggested to the Unlock service during consultation with OSS Watch, the open source advisory service for UK research.

GPL is a ShareAlike kind of license, implying that if someone adapts and extends the CHALICE code for use in a project or service, they should make their enhancements available to others. The Affero flavour of GPLv3 invokes the ShareAlike clause if the software is used over a network.

Data

We plan to use the Open Database License from Open Data Commons to publish the data structures extracted from EPNS – and from other sources where we have the freedom to do this. ODbL is a ShareAlike license for data – the OpenStreetMap project is moving to use this license, which is especially relevant to geographic factual data.

As far as we know this will be the first time ODbL has been used for a research project of this kind – if there are other examples, we would love to hear about them. We’ll seek advice from JISC Legal and from the Edinburgh Research and Innovation office legal service as to the applicability of ODbL to research data, just to be sure.