GeoForum 2020

We are very pleased to confirm a new date and venue for our next Geoforum event and invite you to join us for an interesting programme with a range of speakers and opportunities for discussion. Geoforum will be on Thursday 12th December 2019 at the Informatics Forum, University of Edinburgh, 10.30am – 4pm. Please do […]

Geoforum 2016 Summary

EDINA’s annual Geoforum conference for all its geospatial services and projects was held at the University of Edinburgh’s Institute of Geography this year. It was attended by nearly 50 delegates who came to find out what we have been up to over the past year and to see what new things they can expect […]

GeoForum 2016: Booking Now Open

We are now taking bookings for EDINA’s GeoForum 2016, with this year’s event being held at the University of Edinburgh on the 7th of September.

kim traynor [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons

Reserve your place now:

GeoForum 2016 Booking

GeoForum is a free all-day event aimed at lecturers, researchers and support staff who promote and support the use of geospatial data and services at their institution. Throughout the day there will be talks and demonstrations to inform you of current geospatial developments at EDINA and in the wider community. It is also an opportunity to give EDINA feedback on the services we provide and to discuss geospatial issues with the team.

Full details of this year’s event and the programme will appear on the website when available:

GeoForum 2016

This year we will be introducing some changes to the geospatial data services offered by EDINA to the academic community. These include new Ordnance Survey data products and updated licence agreements for most of the Digimap Collections.  We also hope to present some case studies from staff and students who have been using data from Digimap and the other geospatial services from EDINA.

The conference will be located in the University of Edinburgh’s geography department on Drummond Street.  We will also be highlighting what we have done over the summer to improve Digimap.

The conference is free to attend and runs from 10:00 till 16:15. For all the details and to book your place please visit the conference website: GeoForum 2016

Please contact us if you have any questions:

  • Email: edina@ed.ac.uk

Find out what happened at last year’s event: GeoForum 2015

GeoForum 2015: Booking Now Open

University of Greenwich Queen Anne Court

We are now taking bookings for EDINA’s GeoForum 2015, with this year’s event being held at the University of Greenwich on the 16th June.

Reserve your place now: GeoForum 2015 Booking

GeoForum is a free all-day event aimed at lecturers, researchers and support staff who promote and support the use of geospatial data and services at their institution. Throughout the day there will be talks and demonstrations to inform you of current geospatial developments at EDINA and in the wider community. It is also an opportunity to give EDINA feedback on the services we provide and to discuss geospatial issues with the team.

Full details of this year’s event and the programme will appear on the website when available:

GeoForum 2015

This year Ordnance Survey’s cartography team will be telling us about how they go about creating cartographic representations of the map data.  We’ll also be showcasing the work of students of architecture and urban design, highlighting how data from Digimap is crucial to their studies.

The conference will be located in the historic Queen Anne Court at the University of Greenwich, part of the Old Royal Naval College. In such surroundings we hope the weather will allow us to go outside for an afternoon excursion where we can show you the enhancements to our Fieldtrip mobile app while exploring the area. We will also be highlighting the many new datasets that have been added to Digimap over recent months and giving you a sneak preview of what we will be doing over the summer to improve all the geoservices offered by EDINA.

The conference is free to attend and runs from 10:00 till 16:15. For all the details and to book your place please visit the conference website: GeoForum 2015

Please contact us if you have any questions:

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302

Find out what happened at last year’s event: GeoForum 2014

Geoforum 2014: Live Blog

GeoForum 2014 (#geoforum2014) takes place at the Edinburgh University Informatics Forum from 10am until 4.15pm today. Throughout the event we will be liveblogging so, whether you are able to join us or not, we suggest you bookmark this post and take a look later this morning for notes from Peter Gibbs’ keynote. Keep an eye on the post throughout the day as it will be updated after every session. We also welcome your comments (below), whether during or after the event.

As this is a liveblog there may be occasional spelling or typing errors. We also welcome additional links, resources, or any corrections to the text if you spot them. 

Welcome – Helen Aiton, EDINA

I’m Helen Aiton, User Support Team Manager at EDINA and it is my pleasure to welcome you to the EDINA Geoforum 2014. We hope you will have a very enjoyable and interesting day. We have lovely weather today for our citizen science excursion. We also want to thank our data providers who are here today, and who you will have an opportunity to speak to during lunch and during our breaks. With that I will hand over to Tom Armitage to introduce our keynote speaker.

Tom: Our keynote speaker is Peter Gibbs, BBC Broadcast Meteorologist. We booked a weather forecaster and they brought the lovely weather with them! We’ll do that again! Peter will be talking today about flooding.

Keynote Speaker: Peter Gibbs, BBC Broadcast Meteorologist

When Tom first asked me to give this talk he mentioned that hydrology data has been added to Digimap and I thought, what would be appropriate to that? Well I thought something that shows several databases and data sources being brought together would be appropriate, so…

Let me take you back to 2007, summertime in England and Wales. There was severe rain, several people lost their lives, 55,000 homes were flooded. And the government came in for a bit of stick, so they set up an inquiry. This was completed about a year later. The Pitt Review recommended that the Environment Agency and the Met Office should work together. Previously there had been a bit of a dichotomy. At the Met Office we predicted the weather; once it had happened we washed our hands of it. And it worked the other way too… it wasn’t exactly joined up. So the recommendation was that this joint grouping should issue warnings together and do so at a lower threshold – to allow for better preparedness, which might mean false alarms but also better safety and more time to plan.

So the Flood Forecasting Centre was set up in 2009 and it’s a new set of experts basically. Meteorologists have been trained in hydrology, hydrologists have been trained in meteorology and that team works together and operates 24/7.

So, we predict the weather… but weather is a messy thing. This is difficult science. Let’s look at a shot of data from 4th February 2009 based on radar data. In weather terms a block of major storms appears here. It’s not very big… but it happens to be on a hill between Exeter and St Austell. And on that hill the rain fell on snow. That caused traffic chaos. But only because it was both intense rainfall, in that spot, at that moment, at that temperature… that complexity is the stuff weather forecasters have to deal with all the time. And we are increasingly asked not just to predict the weather but also to predict the impact, and that can be really difficult to do.

So, how do you actually do a weather forecast? Well you start with what’s happening now. Observations are key to setting up a good weather forecast. Surface observations, taken by people on the ground, are an important part of this. Coverage around the world varies… on land in Western Europe it is pretty good, Africa less so, high altitudes less so, and over the oceans there are also big gaps. So you can see already that it’s an incomplete picture.

Radar is good, provides ground truth, very useful, but tends to only be over land areas and radar is expensive. We also have a network of upper air measurements via weather balloons… but that network is even more sparse. It’s sending up a packet of electronics once or twice a day that you never get back! And we do also get observations from aircraft. Almost every commercial flight gathers wind speed, direction and temperature, and sends that back. It does cover the oceans but still more in the Northern than the Southern hemisphere. About 130,000 observations a day though, so very useful.

Weather satellites cover a great deal of the gaps now though; several geostationary satellites are the real workhorses for weather data here. The location/proximity to the equator varies, but we get great imagery that provides useful parameters on a gridded basis. You can get a profile all the way down the atmosphere – not as good as balloons – but still very good, especially for sparsely populated areas. And weather forecasters are increasingly data hungry…

Now all that data means there is an awful lot of number crunching to be done. The Met Office has been at the forefront of this, though there is more to do. But we have a “Unified Model” allowing us to do a full forecast of the globe. That’s important for forecasting. We can take that model down to smaller areas, we can nest models, and it lets us generate weather forecasts at lots of levels from that model. Now the model data has to all be gridded. The gridding is getting finer and finer as more sophisticated computer modelling lets us reach greater and greater detail. Not long ago our grid resolution was 40km. So for the UK, that’s not that many grid squares. A high resolution 4km model, which we use now, is far more accurate and far more like the mapping you’d see in your atlas.

The model we use for weather forecasting and for broadcasting is the UK model, with 1.5 x 1.5 km squares across the UK. But you come across a problem. Because weather is global, you have to have stuff happening at the edges, to feed incoming weather into your model… weird things happen if you don’t do that and it won’t be that useful. So it’s kind of blended… we have lower resolution at the edges, then more detail at the centre. But we are talking about a three dimensional atmosphere. That’s the horizontal grid area. We have 38 vertical grid levels above that. You can see how hefty the computing power needed is for this sort of forecasting – bringing in that level of modelling detail along with thermodynamic calculations.

So, that model generates a prediction of the weather… but you have to be careful in how you separate the signal from the noise – an increasing challenge as the data becomes more and more high resolution. There is always a learning process whenever we change to a new data model, and a new data resolution.

It’s not just weather that goes into this model. Other things have to be brought in. Soil type data has to be brought in. Sandy soil heats quickly in the day, cools quickly at night, transmits heat. A more clay soil will be slower to heat or cool, for instance. Soil moisture content will also make a big difference. More moist soil slows heating and provides more moisture for clouds. Vegetation type matters a lot, especially at mid latitudes where we are. And vegetation type changes hugely over the year. And that coverage hugely changes solar radiation and temperature. Sea surface temperature is very important for tropical cyclones. Sea-ice thickness matters a lot – more white ice and there will be more reflection of solar energy. Thickness changes the warmth of the ocean as well as its reflectiveness. Sea surface currents also matter – for transporting warm or cool water. Some of this data is based on observations, some on average annual variation.

There is one big enemy of weather forecasters… and that’s chaos. In theory, if you had a perfect representation of the atmosphere and a perfect model then your forecast could be perfect. But the representation and the model can never be totally perfect; there is always lots of variation. So, to deal with variation we use something called “ensemble forecasting” – making many different forecasts and comparing them, effectively.

So, let’s take a one day model – a “spaghetti chart” – this shows changes building up… and when we do that ensemble forecasting – and we compare these on a “postage stamp” print out – we can start to see the variance across the forecasts. We can talk in terms of probability… so here we could see northwesterly winds, chilly… If these all look totally different then the weather is really changeable and unpredictable, and that’s useful to know. If they are all very similar we can have a lot of confidence in the prediction. That confidence rating on the forecast is a really useful thing. So if we look at a “whisker diagram” we can again see this idea of comparison and confidence in the predictions.
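
To make the ensemble idea a bit more concrete for readers, here is a minimal Python sketch (ours, not the Met Office’s code) of how the spread across ensemble members can be turned into a simple confidence measure; the member values below are invented purely for illustration:

    # Minimal sketch: ensemble spread as a confidence measure (illustrative values).
    import numpy as np

    # One forecast value (e.g. temperature in C) per ensemble member,
    # for the same place and lead time.
    members = np.array([11.2, 10.8, 11.5, 11.1, 10.9, 11.3, 11.0, 11.4])

    mean = members.mean()                    # best-guess forecast
    spread = members.std()                   # disagreement between members
    prob_above = (members > 11.0).mean()     # fraction of members above a threshold

    # Small spread -> members agree -> high confidence in the forecast;
    # large spread -> changeable, unpredictable weather.
    print(f"forecast {mean:.1f} C, spread {spread:.2f} C, P(>11 C) ~ {prob_above:.0%}")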

So how does that forecasting translate to the work at the Flood Forecasting Centre? Well we take that forecast and put it into a hydrology model with grid to grid modelling. There are four types of flooding: fluvial flooding – rivers overflowing; pluvial flooding – rainfall so rapid that drainage can’t cope, dips in the road, “flash flooding”; coastal flooding; and groundwater flooding – which tends to be a delayed reaction, and we saw a lot of that in the winter flooding we’ve just had… where water moves through the system and causes belated effects such as underground flooding.

So, for instance, Cumbria 2009. We take the weather model. Data comes into the grid to grid model… each grid point takes the water falling out of the sky. The model has relief, river flow, soil moisture, etc. data. And so it tracks where falling water will land, where it will go next, etc… water starts to fall and flow into river systems, heading off down and out to sea… this is shown in cumecs (cubic metres per second). It’s reasonably high resolution data. That’s then converted into return period flows (QT grids). This is “above 2 yr flood”, “above 100 yr flood” etc. And from that we then get a flood guidance statement. But that’s where human intervention becomes so important, interpreting that guidance statement, and understanding impact and infrastructure.

So in terms of assessing the flood risk for Lowestoft as an example. This uses ensemble forecasting again. At 5 days out we see some uncertainty. At 4 days the spread is still wide. And then 2 days ahead all the ensemble forecasts come close together – something big is on the way. The big uncertainty there was how the flood defences would hold up. Defences had been built after the last major flood in 1953 but never tested. They did hold. Everyone wondered what all the fuss was about!

Just to finish off… what the Flood Forecasting Centre is moving towards is much more focussed assessment of output. So when my son graduated in Durham there was huge rainfall and flooding… A Victorian wall fell in that rain. The ultimate aim is to be able to run predictions that go to that sort of level – this road, this area, will be at risk. In shorter term forecasts it might be about getting an hour or so warning with radar help. But we are moving in that direction.

Q&A

Q: You suggested that 1km resolution is still a way off. What holds that back? Number crunching? Available data?

A: Largely number crunching. The Met Office just had another super computer upgrade but it becomes harder to justify the substantial expense without this sort of clear impact of data. But there is also the issue of how accurate it gets at that level of data. So we are at a sort of holding point? Some say higher and higher resolution is better, some say more ensemble predicting is better.

Q: When you showed the grid for the UK with the 4×4 cells at the outside, why not take more data from the Atlantic, as so much of our weather comes from there? And has any thought been given to variable grids – given such variable impacts?

A: Well we do have edges of edges – at the edges of the 4×4 is the global pattern. But you can only run so much computing power across that grid. All about competing factors. Have to have best possible solution. On the other question part of the ensemble effort is about what the crucial areas in a forecast are, that impact the weather downstream, and you can focus on those to get better quality forecasts for the UK. People are looking for this.

Q: I’m impressed by the increased accuracy of the weather. Is it about data accuracy, better models?

A: All of the factors I talked about: observations; modelling; data; and also the research that goes into our forecast models. And it is so much better than it was. I’ve been weather forecasting for 30 years. When I started 24 hours ahead was good, now 5 days ahead is pretty standard.

Q: How important is climate change in those flooding models?

A: Angharad is probably the person to answer that – we can talk more later – but obviously the theory is that a warmer atmosphere means more moisture in the atmosphere, more rainfall and more flooding.

And now, after a lovely break featuring caramel wafers, we move on to our next speaker.

Using EDINA Datasets in a hydrology project: Darius Bazazi, Hydrological Assessment of Natural Flood Management

I work as a spatial data analyst at GeoPlace in London. GeoPlace is owned by the Ordnance Survey and we provide definitive national address and street gazetteers (datasets) – essentially we do what EDINA does for the Higher Education sector, but for the public sector. So AddressBase features point data for every address in Great Britain – with full postal address and a unique property reference number – allowing tracking across the lifecycle of a property.

So, a quick example of how address data is used in flood management… we have zones that show high, less high and low risk. Zones like this, for instance, allow emergency services to target resources in flood events. And the property data also lets us estimate more specific risk. Why does this matter? Well, when you don’t have good data… Thames Water was fined £86 million for overestimating flood risk, for instance.

So, today I’ll be talking about hydrology and talking mainly about maps. I know we have a room full of experts here so I hope you’ll pick me up on anything I get wrong! So, I used three EDINA datasets in the project I’m talking about – Land Cover Maps 2007, Digital Terrain Model (DTM), and some OS? mapping. I really enjoyed using the EDINA Digimap and the GUI there.

The project was looking at soil moisture and to calculate a regression-based equation for PROPWET (soil moisture). Currently engineers use a constant value for catchment PROPWET (Flood Estimation Handbook) so we wanted to test and develop a regression based model. And GIS is a good tool for that because we can bring in so many variables etc.

Natural Flood Management involves using nature to mitigate flooding – currently it is difficult to analyse that, but this equation sort of feeds into it, looking at how we can vary different ground conditions in the modelling. For instance, NFM is used at Eddleston Water in the Scottish Borders, a 69 square km catchment that is part of the River Tweed catchment. So, PROPWET measures the proportion of time that soil is wet. The value is currently taken from the Flood Estimation Handbook. But that Handbook is quite old and gives quite static measures of moisture.

Looking at the river network in Scotland, and the river network monitoring stations, we can see how data is gathered and what its coverage is. Using a Halcrow analysis we can see the variation of data across Scotland. It’s pretty good, but we wanted to broaden that analysis across Scotland. So, the Land Cover Map 2007 was really interesting to work with. I obtained vector and 1km raster versions. My desktop GIS couldn’t handle the scale of the vector mapping so I had to use the raster mapping (which is based on the vector). This covered Scotland across river catchments. LCM2007 is a parcel-based classification of 23 land cover types.

So I took that raster data and generated some catchments. You can get them from the Centre for Ecology & Hydrology but we thought it might be interesting to generate our own catchments. For this we took the Digital Terrain Model and applied some transformations to produce catchment delineations. I wanted to get areas to within 10% accuracy but only got to 17%. So I decided to use the Centre for Ecology & Hydrology shapefiles instead. And then I did some geoprocessing, using the ArcGIS Python model builder to iterate zonal statistics maps, and looking at potential evapotranspiration models, Leaf Area Index (vegetation cover) and surface resistance.
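
As a rough illustration of that kind of zonal-statistics loop (the file and field names here are hypothetical, not those from the project described above), the ArcGIS Spatial Analyst Python API can be scripted along these lines:

    # Hedged sketch: mean of several rasters within each catchment polygon.
    # Paths, file names and the zone field are assumptions for illustration.
    import arcpy
    from arcpy.sa import ZonalStatisticsAsTable

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\data\propwet"

    catchments = "catchments.shp"                           # catchment boundaries (assumed)
    rasters = ["lcm2007_1km.tif", "slope.tif", "pet.tif"]   # input rasters (assumed)

    for raster in rasters:
        out_table = "zstats_" + raster.split(".")[0] + ".dbf"
        # Mean raster value per catchment, ignoring NoData cells
        ZonalStatisticsAsTable(catchments, "CATCH_ID", raster, out_table, "DATA", "MEAN")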

The result was an equation that is appropriate to Scotland but, had this not been a university MSc project, it would be nice to expand the model to cover more of the UK.

The OS have a new dedicated water product, a new Water Layer (beta) which has been created with the Environment Agency, SEPA, INSPIRE. And that will have huge potential I think.

Q&A

Q: How did you choose the size and location of the catchments? How do they choose their catchments?

A: I did use the Centre for Ecology & Hydrology catchment areas in the end. I’m not sure how they select their catchments – an interesting question though.

Digimap Support: Carol Blackwood, EDINA

I will be talking both about what we have happening at the moment and what developments are coming up.

Firstly, we have a new Digimap Registration System coming soon. Currently users come in, they need to register before accessing the data. We ask you what you want to use, who you are, and we give you a lot of legal terms, we ask for lots of information and ask you what collections you want access to, you see a summary and submit that… then you have to wait. About 24 hours but up to 48 hours. Currently we have to eyeball every single registration, manually approve it, we run a script overnight, and then you get access.

So, for services these days that’s rather below expectations… users are used to giving information up but then expect access within a few minutes. And we know some students don’t remember to register until the last minute. Their expectations are high, and for fast registration. And our new licence has enabled a few changes… so we have been working on a new registration system.

As we began to redesign the process we have also been using “user stories” – pathways that users will take as they encounter our system. We had some requirements too. We took the user stories and some of our requirements to work out our priorities… what MUST we do, what SHOULD we do, what COULD we do… ?

So, the new system will be a lot simpler. A new user will come in and log in via Shibboleth credentials. You will register with your user details. You will get an email to validate your email address. And once that is validated you will be able to activate access to collections – choosing collections, agreeing to terms, and defining the purpose you want the data for. And then you will have access. This process should take about 10 minutes, not 24 or 48 hours. This should much better meet user expectations, including those who have registered a wee bit late for their deadlines!

When we roll out the system we won’t have any change for existing users – no need to register again. We will renew activations after 12 months – a new change but you won’t need all the initial data entry again. It will be very simple, no interruptions. And we are adding a new registration for Historic Collection users – that’s so we understand who is using that service, so that we can support them better. Again it will be quick and painless.

Clearly this is a big change for us. We are anticipating it around early September 2014. To ensure a smooth transition we will provide help pages, how-to guides, videos, webinars, and anything else that would be helpful – let us know what would help. We can provide text for handbooks, for instance. Just let us know. But this will be lovely. It will be simple, easy to use, and much quicker. And it benefits us too: at peak times our help desk currently has to focus heavily on registration; they will be freed up at those peak times for other support activity.

The other part of my talk is about some recent updates. The first of these is our new Live Chat function – you can find this from our Help Centre, and you’ll be familiar with this sort of functionality from other websites, but basically it lets you ask live help questions via text chat. So come and chat with us, and let your users know they can ask us questions this way.

We have also been creating a whole new series of online videos – for specific tasks and tools – as well as a number of new service demos. Both of these sit within our Help Centre. The demos help explain the interface and how to get started – they can be really useful to promote Digimap or support users in getting started with it.

We have also been creating some information for you to take and use on your own local pages: descriptions of all the collections, logos for Digimap, sample images, textual information. This is always the definitive source of up to date information on the services – you can always come here to get text and resources for your own use in webpages, documents, etc. So please do have a look and make use of this.

And we still have our traditional in-person training courses – in Derby, Oban, Bradford, Swansea, Southampton, London, Newcastle. If you’d like to host one just let us know and we’ll be happy to work with you, to let you know our requirements etc. We have also been out doing Digimap workshops tailored for history students; Digimap and GIS workshop for earth and environmental science MSc students, etc. If there is something tailored that you would like, just let us know and we will be happy to help.

We have also been out and about running workshops: British Cartographic Society Map Curators workshop; Enhancing Field Learning; Research Data Management for geo data, etc. And our colleague Tony Mathys has been running metadata workshops and giving a paper at GEOMED 2013. If you are interested in metadata, Tony is your man, just get in touch.

We are also running an increasing number of webinars, run by our colleague Viv Carr. These are usually flexible 1-2 hour sessions, often easier to fit into a working day. We have a series of these on Digimap Collections, Roam, Historic Digimap, etc. And we also take requests, so just get in touch!

And, if you want to find out where we will be do take a look at the new Events page on our blog, keep an eye on the EDINA Events page – where training sessions and webinars are advertised – and keep an eye on GoGeo for more on Tony’s Metadata workshops.

And… a final note on what’s coming shortly… our engineers are currently working on site rep stats for all collections, as well as the next run of the annual calculation of the value of data used by your institution – and there will be more information about that.

So that was a very quick round up of what’s happening. Basically, if there is anything we can do to help just get in touch!

Q&A

Comment: Just to reiterate something you said. The tailored session on GIS for history students that you did for my masters students was just brilliant. I would recommend that. It was great for my students to see their subject from that different geospatial perspective, really useful for their understanding. And there are a lot of really good resources there for those wanting to carry on that learning.

And now time for lunch including Service demonstrations; Project demonstrations; Meet the map providers; Share your Vox Pops on Digimap.

 

During the next sessions delegates will be splitting into two groups. One (those with green dots on their badges) will start with the Citizen Science in your Pocket excursion; the other will start with the Digimap Update session. This blog will, however, be sticking with the Digimap Update session throughout the afternoon.

 

Digimap Update – Guy McGarva

I will be covering what we’ve done in the last year, what we’ve got coming up and what we will be doing in the future.

We have been listening to your feedback, through surveys, support requests, discussions – including chats at training events, events like today, etc. and these feed into our Quality Improvement Plan.

Since the last GeoForum we have launched a new collection, Environment Digimap. This is data from CEH and includes Land Cover data for 1990, 2000 and 2007. We have made improvements to all of the Roam mapping clients within Digimap. Roughly a year ago we highlighted that these were coming; they have now been implemented. There is a new interface for all Roams so that they are now cleaner and more consistent. They are all essentially the same, with minor changes for different collections. This makes updating and maintenance easier but also makes them more consistent for use and exploration. Whether coincidence or not, we have had much higher usage over the last year. We had well over 1 million screen maps generated in the last year – we must be doing something right! We have also seen general usage on the rise, and downloads too.

And we have also made significant changes to the print interface in all Roams based on feedback we’ve had from you, from users. Part of that is the layout preview – seeing the area you are printing – no more wasted time creating PDFs of the wrong area. We’ve seen PDF generation go down despite usage going up; we think that’s because users are getting the right map first time much more often. We now have variable scale printing in all Roams except Historic. You can choose the exact scale you want for your map and can print at any size from A4 to A0, whether PDF, JPEG or PNG. And you can now print overlapping maps in Ancient Roam. Some fairly significant improvements there.

We have also improved search. It doesn’t mean much visible change but we now include street names and partial postcodes in searches – providing more options for exploring the data.

And finally… we now have printable legends. You can create a printable PDF legend in all Roams except Historic. And Geology legends are now dynamic – i.e. they only show features that are on the map. We might, in future, extend that concept of dynamic legends to other areas of Digimap. The printable legends come as PDFs, so you can add them to your map, etc. So that’s what we’ve added in the Roam client.

Over the year we have been trying to migrate all of Digimap to the new Data Download clients, based on the OS Download client. We started by launching a new Geology Download – this went live in August 2013. And we have added new data: Northern Ireland Land Cover data from CEH and more geology data from BGS, including flooding and permeability. And whenever we make updates to data, or to interfaces, we post these on our blog – any big change will appear there.

One of the most significant changes has been that, for a long time, OS MasterMap data was only available in GML format… that’s how we get it from Ordnance Survey but it’s not easy to use in GIS programmes – it always required some sort of processing or conversion. For some users that was almost impossible; we had a lot of questions from CAD users about this, and generally this format issue was particularly difficult on a Mac. So, we now have a way to convert data to DWG on the fly. This has already been implemented and makes life a lot easier, particularly for CAD users, as most CAD systems will support DWG files. Feedback on this has been really good; it lets people use data right away rather than having to make their own conversions.
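
For anyone who still needs to convert MasterMap GML themselves, a common open-source route is GDAL/OGR; true DWG output normally needs commercial tooling such as FME, but a rough sketch of getting a GML tile into a CAD-readable DXF (file and layer names assumed, not part of the service) might look like this:

    # Hedged sketch: convert a MasterMap GML tile to DXF with ogr2ogr,
    # assuming GDAL/OGR is installed. File and layer names are placeholders;
    # the on-the-fly DWG service itself uses different (commercial) tooling.
    import subprocess

    subprocess.run(
        ["ogr2ogr",
         "-f", "DXF",               # output format most CAD packages can read
         "mastermap_tile.dxf",      # output file (assumed name)
         "mastermap_tile.gml",      # input GML tile (assumed name)
         "TopographicArea"],        # layer to convert (assumed layer name)
        check=True)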

That’s what has happened. We are now working on providing MasterMap as File Geodatabase – that will make it easier to load into ArcGIS and QGIS. And we are using FME for on-the-fly conversion. We won’t be doing shapefiles, as the shapefiles generated from this data are too massive to process, hence using File Geodatabase. That will be coming soon. Also Ordnance Survey have data on building heights (alpha version). This is a very new thing, an alpha product and a first release… but we want to make it available to users. We will be taking that MasterMap building heights data and providing it, initially at least, as KMLs – and you can see a demo of that in our demo area today. That’s all buildings across all of the UK, and that data should be really interesting, particularly for architects. Eventually that data will just become part of MasterMap.

And we keep on adding new basemaps in Roam. These are variants at a particular scale or a different dataset, giving you the option of different basemaps at different scales. So VectorMap District raster and vector versions will be available as alternatives to 50K. And we will be replacing OS 10K Raster with VML Raster – that’s because OS are withdrawing 10K Raster later this year. And we will hopefully also be adding 3 versions of MiniScale too.

We have new improvements coming to Marine, a fairly major update in fact. We will be getting new High Res Bathymetry data for Marine – now 30m resolution rather than 180m, so that’s a really massive difference. And we will be getting updates to the Hydrospatial and Raster Chart data. And there will be a new Data Download Client – in the same style as the other Digimap Collections.

We have had a project with the National Library of Scotland, who digitised new metadata for all the historic maps we hold. That is resulting in improved metadata for Historic data in England and Wales, and improved dates for publication and survey (Scotland already has these). And, again, we will have the new Historic Download client.

Still to come…

We are working on some updates to the Data Download basket – to make it easier and clearer to select different formats and dates, which isn’t hugely easy right now. When we get a new dataset we don’t just delete the old data… you can go back and select last year’s data or data from two years ago – you can get earlier versions of the data as well.

We have some small changes to make to the Digimap home page. Because we have rationalised our clients we don’t need so many expanding menus etc., so we can make it much more tablet friendly. And, as mentioned earlier, we will also have site rep stats for all collections – so as a site rep you will be able to see data on how Digimap is being used in/by your institution. We would love your feedback on any of these aspects, and we will continue to inform you of changes like this through the Digimap blog.

And one thing we haven’t really mentioned yet…

We have a new service coming later this year: Digimap for Colleges. This will be based on the Digimap for Schools functionality – which means simpler, somewhat stripped-down functionality, and a Roam-style mapping client. This bridges the gap between schools and university, particularly useful for those doing A levels, for instance. And it means FE institutions will be able to use Digimap without needing Shibboleth support – but FE institutions will be able to choose between Digimap for Colleges or Digimap for HE. So that will be coming pretty soon.

We are also working on other geospatial stuff. We are working with the UK Data Service Census Support – which will become “Open” shortly meaning no login will be required to download most of the data. Some data sets will still require login, but most will become open. That’s all Census data, Scottish Census data, etc.

We have also been making changes and enhancements to other geo services. Unlock has been expanded to include historic English place-names, data generated through the DEEP project, and improvements have been made to the API to allow you to search for these old place names. And GoGeo will have added language conversion support, support for the Gemini 2.2 metadata standard, updated GeoDoc resources, and also a mobile version of the site.

In addition to our services we are also working on a lot of different geo projects at the moment. COBWEB is a major EU-funded project with partners across Europe and involves citizen science in Biosphere Reserves. We have the Palimpsest and Trading Consequences text mining related projects, and the Spatial Memories project. A lot of these are based around, or include development of, mobile apps.

In terms of our longer term aims, we have been doing a lot of work to integrate geo services together more – including Unlock, FieldTrip GB, etc. We have been making more use of available data feeds – many suppliers provide WMS now and we are trying to think about how we can use that more, and most effectively. We want to make user workflows simpler and more complete, to provide solutions to common problems to do with the data life cycle, to improve the utility and value of services, and to improve our support resources and capabilities.
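
As a small aside on what consuming such a WMS feed looks like in practice, here is a sketch using the OWSLib library; the endpoint URL, layer name and bounding box are placeholders rather than a real EDINA service:

    # Hedged sketch: request a map image from a WMS endpoint with OWSLib.
    # URL, layer and bounding box are placeholders, not a real EDINA feed.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    img = wms.getmap(layers=["example:landcover"],
                     srs="EPSG:27700",                       # British National Grid
                     bbox=(300000, 600000, 320000, 620000),  # 20 km square (example)
                     size=(800, 800),
                     format="image/png")
    with open("landcover.png", "wb") as f:
        f.write(img.read())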

So, that’s where we’ve been, where we are going. What do you want to see next? We would welcome your comments, ideas, any wish list items about data, functionality or support that you or your users may want. Let us know now or just get in touch after today…

Q&A

Q: Are there any enhancements planned for the usability of Roam – zoom, pan, selection boxes? Are you moving to a better internet-based software for that or will you be sticking with what you have now?

A – Tim: We have a project to look at this. We are planning to make it tablet friendly as that is clearly needed.

A – Guy: It’s the data too. We use the OpenLayers 2 toolkit; there is an OpenLayers 3 version coming soon, so we can see what becomes possible then.

Q: You said you can select dates for data download. One significant issue in Digimap is the lack of earlier maps.

A – Tim: We have Land-Line from 1995, and MasterMap from 2006/7. We hope to make that and earlier data available. We’ve been doing a lot of work to make that happen but it is more of a licensing and usability challenge than a technical one.

Q: Will KMLs of buildings be textured?

A – Guy: No, right now we have the OS building heights as point data in a CSV and we will be extruding that data to create the buildings KML, and maybe other formats. Once building height information is out of alpha, OS are saying that it will be part of MasterMap as an attribute. In the meantime we’ll supply that data as the CSV but also as KMLs.
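
As a rough idea of what “extruding” point-plus-height data into KML involves (a sketch with made-up column names and a crude square footprint, not the actual processing behind the Digimap product), one might do something like this with the simplekml package:

    # Hedged sketch: turn a CSV of building locations and heights into
    # extruded KML blocks. Column names and the square footprint are
    # illustrative assumptions only.
    import csv
    import simplekml

    kml = simplekml.Kml()
    half = 0.0001   # roughly 10 m in degrees; crude footprint for illustration

    with open("building_heights.csv", newline="") as f:
        for row in csv.DictReader(f):   # expects lon, lat, height columns (assumed)
            lon, lat, h = float(row["lon"]), float(row["lat"]), float(row["height"])
            pol = kml.newpolygon(
                name=row.get("id", ""),
                outerboundaryis=[(lon - half, lat - half, h), (lon + half, lat - half, h),
                                 (lon + half, lat + half, h), (lon - half, lat + half, h),
                                 (lon - half, lat - half, h)])
            pol.extrude = 1                                    # drop walls to ground level
            pol.altitudemode = simplekml.AltitudeMode.relativetoground

    kml.save("buildings.kml")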

Q: Anything else happening in OpenStream?

A – Guy: Only to update the data. But we also want to look at what levels of WMS we can provide. We are certainly thinking about how that sort of data and technology can be used more widely.

Tom: Do give us feedback on anything you’d like to see added – an email at any time is fine.

Q: Are there plans for other Ordnance Survey data sets to be added?

A – Guy: We are still working with OS on the new licence. We are still waiting to see what OS can make available and what we can get it for. If there are particular datasets you would like to see added do let us know and we can look into it. But if they are not available yet, you can approach OS. And letting OS know you want some data will help encourage them to think about adding it to the data available in Digimap.

Q: I have been looking at OpenStreetMap – any plans to provide, say, Shapefile downloads from OpenStreetMap?

A – Guy: We are always looking at that type of thing. If there is demand that could be really interesting, particularly for non UK mapping.

Q: For some of the Marine stuff, data across the North Sea would be really helpful…

A – Guy: the new data we have is better in terms of extent, especially the charts.

Q: With the enhancements being made to Roam, when you can print to A0, what’s happening with Carto? It used to be that Carto and Classic were very different. You used to be able to customise the maps in Carto.

A – Guy: You can do that in Roam.

A – Tom: But you can’t merge products as easily… but many of the vector map products integrate more of the products you used to have to combine. And for things like Contours we are adding those as an overlay to make it easier to print.

A – Guy: We are trying to ensure functionality of Carto is in Roam, but easier to use. And Roam has a lot of additional functionality, particularly around Legends, that neither product would have had before.

And now we will have a break. Look out for tweets from the FieldTrip GB Excursion (#geoforum2014) and join us on the blog for the second round of Q&A from the next Digimap Update session, later this afternoon.

Q&A – Second Session

Q: Have you considered European mapping?

A – Tom: It’s something that we are being asked for more and more, particularly as we have more non geographers using Digimap. Let us know if you have people come and ask about this. Other countries have good mapping agencies – Denmark for instance.

A – Guy: We are certainly getting queries for data in Germany, France, etc. And it might not be about us providing the data but being able to let the user know where the data is, how to access it, if it’s available for free.

Comment: There is a European data set, could help.

A – Guy: This can be tricky as often it’s multiple datasets with different licences. But if you know of suitable data, let us know. When we got marine data here we did find out about other marine resources and we’ve made links to those available – we are happy to consider adding data but we can always link to data from elsewhere too. I didn’t mention ShareGeo, but that allows us to share ad hoc datasets… sometimes we create datasets and share them there as we know they are tricky to obtain from other places.

Comment: I went to OS for data for my MSc – wanted data at about five levels and the 0.5 m level.

A – Guy: If you do find data sets we don’t have then do let us know – and the more people who go to the data supplier requesting a data set, the more easily we can show demand there from the sector.

Comment: I looked through ShareGeo for some data – and found some really useful stuff there. I’d recommend looking through that but also sharing your own data there.

A – Guy: We do create a lot of the data sets there, in response to queries, when we find useful data, etc. But we would love to get ShareGeo used more, to get it out into the mainstream. Would be great to get a WMS feed generating from the data there too – Shape files are useful but…

Q: Is there any chance of renegotiating for the imagery?

A – Guy: No prospect at the moment. We may try again. It is OS imagery. But there are other sources that can be looked at. And the SPOT imagery that will become available later this year should be openly licensed.

Comment – Angharad Stone, CEH: We’d like to get the environmental Lidar and imagery data available – not full coverage but good quality. In the meantime you can use it free for academic use but you’d need to come to us (CEH) directly.

Closing Remarks – Tom Armitage

Firstly we would like to thank all of our speakers, all of the suppliers for providing great demos. Thank you for some fantastic questions from all of you and all of your continued support for Digimap. There will probably be a Geoforum next year, possibly further south, so keep an eye on the blog for more information as always. And all of today’s presentations will be available via the Digimap blog and possibly elsewhere on the EDINA website too.

And, with that, we close the event. Thanks for following along on the blog!

GeoForum 2013 LiveBlog

GeoForum 2013 takes place at the Congress Centre in London from 10am until 4.15pm tomorrow. Throughout the day we will be liveblogging so, whether you are able to join us or not, we suggest you bookmark this post and take a look late tomorrow morning for notes from Shelley Mosco’s keynote. Keep an eye on the same post throughout the day as it will be updated after every session. We also welcome your comments (below), whether during or after the event.

You can also take part in GeoForum 2013 via our Twitter hashtag, #geoforum2013, where you are welcome to comment, contribute and engage with the Digimap team and our GeoForum attendees. We will also be tweeting key updates, images and notes from the event so if you don’t already follow @EDINA_Digimap, now’s the time to do it!

Please Note: This is a live blog so please do excuse any typos, spelling issues, etc. and do let us know if you have any corrections, clarifications, or information to add – we’ll be happy to update the post accordingly. 

Welcome – Emma Diffley, EDINA

Emma is welcoming our attendees to GeoForum 2013. We hold a GeoForum regularly but not every year, so we are delighted to be holding this event this year, but the next GeoForum may not be in 12 months’ time. Since we last met we have had a busy year. And we have been watching closely the changing funding and financial landscape as well as the changing structure of Jisc. There is lots coming in the future and we’ll be showing you some of that today.

This afternoon we have an outdoor excursion, it’s a bit weather dependent, but either way there will be an interactive activity with FieldTripGB – even if it has to be indoors!

Keynote: Digimap Data and a Non-traditional Perspective – Shelley Mosco (with Robert Park and David Parfitt)

Shelley Mosco is a practising landscape architect and senior lecturer in GIS at the University of Greenwich in the School of Architecture, Design and Construction. She is also involved in research of living walls and green roofs as part of the school’s Sustainable Landscapes Research Group.

I am so happy to see you all here – I know that some of you have come a very long way to get here today – I’ve probably actually come the shortest distance! I will be presenting with my colleagues Rob and Dave, who will be showing you some of their work.

What I hope to achieve here this morning: I know there are several people here from library services, and I know some of you are keen to hear a bit more about how you can help students find out what can be done with Digimap, and how it is used in practice. And Rob and David will show you how they have been using Digimap. What I’m going to talk about first, though, is how landscape architecture uses GIS and Digimap – we are something of a non-traditional discipline – and then show you some of those specific projects, and introduce what’s yet to come. And a little bit on our new school building with nearly 4000 square metres of roof space for gardens, agroponics, etc., where we will be looking at collaboration, sponsorship, etc. for use of that roof space.

I’ve mentioned a little about famous landscape architects. Ian McHarg came from Scotland originally, and he eventually moved to America, teaching at Harvard. We think of him as the granddaddy of GIS. He looked in the 1960s at the landscape as a whole, really connecting across to other disciplines. What he did, in addition to teaching, was run his own practice. He was asked to plan a new highway to New York. He had about 200 criteria to think about. He used a thing called “SIM” analysis. This is pre-GIS, this is all manual work with maps and felt pens – colouring in shades of grey or black for inappropriate areas. After 200 or 300 odd layers/overlays – and that layering is something that he, Jack Dangermond, and Carl Steinitz are particularly known for. So these origins of GIS and overlays actually come from a group of landscape architects.

So, if we watch a short video clip of Jack Dangermond receiving his Lifetime Achievement award from ESRI, we hear his description of these roots of map overlays with Ian McHarg. And Ian McHarg’s very lively speech on those origins.

At Greenwich, we teach using Ian McHarg’s examples as well as the work of Jack Dangermond and Carl Steinitz. That’s the place our thinking comes from. And that idea of layers, of finding the right location through shading and colouring. We’ve taught GIS to landscape architects since 1995 – back then via DOS prompts. In 1996 there was a reaccreditation and it was decided that landscape architects needed GIS the same way that we need AutoCAD. Now I don’t teach the SIM approach; I come at this from the perspective of a practising landscape architect. It is about practical projects, wherever they are in the world, and they have to use GIS for that. GIS is a way to look at time, place and patterns. For me GIS is particularly powerful for looking at those patterns, for finding them. You can do that with SIM analysis but GIS allows you to do this in lots of ways quickly.
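
For readers less familiar with that overlay idea, here is a tiny sketch of a McHarg-style suitability overlay in modern raster terms; the layers and weights below are invented purely for illustration, not taken from the talk:

    # Hedged sketch of a McHarg-style overlay: weighted sum of constraint
    # layers on a common grid; higher scores mean less suitable land.
    import numpy as np

    slope_too_steep = np.array([[0, 0, 1], [0, 1, 1], [0, 0, 0]], dtype=float)
    flood_risk      = np.array([[1, 0, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
    habitat_value   = np.array([[0, 0, 0], [1, 1, 0], [1, 0, 0]], dtype=float)

    # The digital equivalent of shading acetate overlays darker where constrained
    unsuitability = 0.3 * slope_too_steep + 0.4 * flood_risk + 0.3 * habitat_value
    best_cells = unsuitability == unsuitability.min()   # the "lightest" cells
    print(unsuitability)
    print(best_cells)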

So looking at various views from Digimap here – we look at base plans, topographical analysis, 3D data (ArcScene) and data analysis. Using all these views allows us to find the best options, the best plans. We have some of the best practising GIS specialists at Greenwich, and we are trying to set up a Centre for GIS Excellence – watch this space!

So, David and Rob will show you their projects shortly but I wanted to show you some of the best student projects from the last few years.

So this is Zoe Antonald’s work, looking at creating an Oxbow habitat around the O2 arena area. She’s used a vast amount of data, including historical data, to identify the best site for the Oxbow and created a 3D model and flythrough.

James Penney’s project looks at non permeable areas and ways of creating attenuation zones to reduce flood risk.

Paul Hadley looked at the “Boris Island” airport in London. He looked at the Norman Foster design but critiqued it from a landscape architecture perspective, particularly the maintenance of the special area of wildfowl and wetlands in the proposed location. He has identified where the site boundary should be, he’s looked at contours and topography, and he’s looked at some of the historical data for surface water flow accumulation. So where the waste water accumulates, the area should be more wetland than it currently is. And he proposed a new use of space and a way to fit the airport into the environment.

Joe Perkins looked at ways to transform the Valley Gardens in Brighton and did an assessment of the site based on what it is and what it could be, based on Yann Sizeman’s work. So he has looked at why it was set up like that. He has related it to where the heaviest footfall is, and also to traffic. He has looked at the crossover of pedestrians and traffic – identifying why the green areas were not being used as much as they could be.

Jiamiao Xu looked at Vauxhall Pleasure Gardens, looking back to its past as open fields, and planning a nice open space. He used hydrology to inform a Sustainable Urban Drainage and attenuation plan as part of his master plan for that space, and looked at sightlines at present and how these could be changed by creating more landform, more water, more biodiversity. And he also looked at access to the site – looking at cycling distance and the location of river boats, wanting everything within a 5 minute walking zone.

So… what’s new?

BIM. BIM is new for landscape architects – it is a bit more established for architects. So, what is it? Building Information Modelling. One of the definitions (from the Landscape Institute) describes it as building a virtual digital information 3D model, rich in data, that can inform the decision making process and answer questions throughout the entire project lifecycle, implemented in a collaborative environment. It is about that Ian McHarg idea of collaboration and about sharing data across the full life of a project. As a landscape architect you understand, for instance, that for a tree the root system will spread and go underneath paving etc. Understanding that paving, what may happen, how that relates to other systems etc. is really exciting. From 2016 it is obligatory that all publicly funded projects must use the BIM approach. We have already had an email from the Landscape Institute to say that practices are already using BIM in private practice and want to recruit students who understand and are trained in BIM. And that is likely therefore to be crucial for accrediting degrees. The time is now for incorporating BIM.

Just to explain a bit more about BIM. You can take the same approach to, say, planning lunch. There are 38 different models for this. You have a Project, an Assembly, Materials. This is about planning with different permutations. The concept – picking your bread, your sandwich fillings, etc. – is the same for architects and landscape architects. There is mandatory information that must be provided at each stage. The Cabinet Office has created a template/package for this – COBie: Construction Operations Building information exchange. For me the tool for BIM will be GIS, of course.

Now over to one of my students: David Parfitt.

My project is based around the Wandsworth area and the area around the River Wandle post-industrialisation. There have been some projects to improve the landscape but I wanted to create a masterplan to connect and join these areas together. I used Digimap to find the location of hard features and to look for opportunities. I had only used GIS for 12 weeks but I was able to use historic Digimap data to identify a large area of marshland that had been lost over the last 150 years or so. There is one tiny area of wetland recently introduced but overall there has been a huge loss of that type of landscape over time.

I also used topographical data to look at potential improvements, using contour mapping for flood prevention for instance. And making changes to the landscape to make it easier to engage and interact with the area. There is still wildlife present in the river so I wanted to focus on the water and look at incorporating that into the design at multiple levels. Looking at increasing the space for wildlife, but also for people to use, and to allow the river to expand in times of flooding. And I was able to create a masterplan using that GIS data and AutoCAD to propose new design aspects.

And finally to another of my students: Rob Park

I have been on the landscape architecture course at Greenwich for the last year. It’s been a very steep learning curve with GIS. If you’d asked me what GIS was even 6 months ago I wouldn’t have known. But I have learned how to use it, and I have enjoyed using it, and I’ll be showing you how central MasterMap data has been to creating this design. My own design looks at redevelopment of an MOD site around the South Thames Estuary and Marshes SSSI. Indeed there are many SSSIs around the site, particularly close to that area of redevelopment. I spoke to the RSPB and they explained that one of the key SSSI issues is around habitats for the nightingale, which is endangered and does have key nest sites here. So I had two choices: suggest that no development takes place… or come up with a plan.

I used MasterMap to find out more about the area. I wanted to properly survey the site but, being a former MOD site, it’s surrounded by fences and I wasn’t allowed on to take photographs. So with that limited access I began to fall in love with GIS. I found 30 different types of habitat, and used that data to see which of those habitats is most useful to the nightingale for nesting. Red are habitats the nightingale depends on; orange and yellow are supporting habitats – so some development would be possible there if sensitive. That leaves large white areas on the map for development. And I wanted to create a sustainable community, including buffer zones around houses to mitigate their impact.

I didn’t have a clear idea of the topography but ArcScene let me get some idea of that with the elevation data, to fly through that data. I also plotted the water accumulation data to see where flooding and standing water occur. 3D is great but for masterplanning the 2D data is really useful. I used some tools in ArcMap to look at that water accumulation, and what would be needed. This is drainage not based on pipes in the ground, but based on topographical features. I began to think about how to increase wetland, particularly wet woodland, the habitat most in decline in the UK. I also used data on the degree of slope and the direction faced for planning planting and development.

So in a rapid process GIS lets me get a handle on issues pertaining to the site and to formulate a strategy, even without access to the site. And thus created a masterplan for a Zero Energy Development on the Hoo Peninsula.

Whilst I was working on the project Natural England deemed the site an SSSI. I felt sort of vindicated, as the data I used from EDINA led me to the right sort of conclusion here.

Back to Shelley

I do hope that this gave you an idea of how Digimap helps us as Landscape Architects, and how GIS is central to that.

Q&A

Q – Karl Hennermann from Manchester) I think Shelley’s comment about BIM – that’s very relevant. We hear from engineering companies how crucial it is for graduates to have BIM skills. There is a BS standard out there that includes those BIM requirements for 2016. We are struggling with how to teach this to our students; there are software products – do you have any particular recommendations?

A – Shelley) I would like to join you in trying to figure this out. We need to learn more about how to integrate with other disciplines, about how we bring this into our own course. I know the COBie sheet will be the starting point for us: using that spreadsheet, then bringing that into GIS. I’m not sure exactly how we will do this but I know that everything to do with BIM fluctuates every day – new ways to do things, new instructions. I think we will be running the whole time to get something up and running for our classes. If you do not think that this will affect you I would urge you to think again. In *any* discipline relating to buildings, the built environment or the landscape you will be working with people like us, with engineers etc. and you will need to understand BIM.

Comment – Karl) There is an AGI special interest group on BIM. How will EDINA fit into BIM here?

A – Emma) We will have to do something, but we will need to investigate further. However if you have ideas about what we can do to help, what we need to support you, then we very much welcome them.

And over to Emma for a thank you.

Open & “Free” Geo Software and Data – Tom Armitage, EDINA

Tom will be talking about various bits of open source software and free data and tools which may be of interest and of use alongside the tools and data we provide. We are aware that everyone’s budget is tight, so hopefully these free resources will be helpful. This will be an overview of what’s out there and which of these we recommend.

So, firstly, why bother with open source? You may think it’s liable to break, that it’s flakey, that it’s just for nerds… but there are key reasons you should be taking note. The quality of free tools and data has vastly improved. When you think about the data there are no restrictions, which can be particularly useful off campus, for commercial use – no need to go out and learn a new GIS system if you are using the same tools that can be used for commercial purposes – or for web publishing. Demand is changing too: industry has noticed that open source is important and they want students to have skills in programmes such as Quantum GIS. And cost wise? Well it is “free” but you have to train, to maintain, to support the use. But without that payment for a licence there is a significant cost benefit.

I will be talking about OpenSource, Free and Freemium. What do these terms mean? OpenSource is free to download, use and develop; code or raw data is available to update or augment. Free tends to be free to download and use, but code or source material is not available. Freemium is free to download and use but with added functionality or material available at some additional cost.

In terms of desktop GIS the key OpenSource systems are Quantum GIS (QGIS) – you do see this on job ads – and gvSIG. In terms of free tools, AutoCAD Map 3D is free for academic use (only); this is more of a GIS package. There is also AutoCAD Civil. Both of those systems are PC only though. In terms of Freemium you have things like the ArcGIS layer viewer – but for full GIS functionality you need to pay for the full product.

Quantum GIS is free. It’s about the only GIS that runs on a Mac. There is a huge user community with very active online forums and many people developing plugins and add-ons. Version 2 – launching soon – will see parity in functionality with ArcGIS. And it is the one industry are using and asking for. QGIS will work with any database (PostgreSQL/PostGIS, MySQL, ODBC, Esri, etc.). And Ordnance Survey are now producing styling for their own data – they now provide Styled Layer Descriptor (SLD) files and these work with OpenSource GIS software. They are not quite as good for symbology as some of the proprietary styling but they are really good.
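
(Editorial aside: if you want to try those OS-supplied SLD files in QGIS, the sketch below shows the general pattern from the QGIS Python console. The file names are placeholders, not actual OS product files.)

```python
# Run inside the QGIS Python console, where `iface` is already available
layer = iface.addVectorLayer("roads.shp", "roads", "ogr")  # placeholder shapefile path
layer.loadSldStyle("os_roads_style.sld")                   # placeholder OS-supplied SLD file
layer.triggerRepaint()                                     # redraw with the new styling
```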

Some other notable players in the open source GIS world include gvSIG – which has excellent foreign language support, and there is also a mobile version. uDig is very easy to use but limited in functionality, and it hasn’t been updated for a while. And GRASS, the original open source GIS, is incredibly powerful but command line driven. However someone has created a plugin for QGIS, which is a very powerful and usable combination.

If we think about cloud-based GIS there are two options here for data visualisation and sharing: Google Earth Engine and ArcGIS Online. ArcGIS Online allows you to create maps online, add lots of data from Esri, Microsoft and OS, upload your own data, and share your maps. In the free service all maps you create are open for anyone to view. The paid service is included in the top level CHEST deal and at this level you can keep your uploaded data private.

Google Earth Engine works very similarly. There is a little additional analysis possible via Google Fusion Tables. The premium version allows you to host this locally (which is OK for Ordnance Survey data as long as it stays offline).

Warning: Ordnance Survey do not allow their licensed data to be uploaded to cloud services, particularly MasterMap data. Even if you are creating private maps with these services.

There are also various web mapping options out there. These are for displaying your maps online rather than performing analysis, though some will allow editing and data creation. MapBox is a Freemium service built on Leaflet. There is Leaflet itself – it is easy to use and very lightweight but not very flexible: you have to use the Web Mercator projection and can only use GeoJSON or native layers. OpenLayers is very powerful and allows you to create interactive maps, but you need to know what you are doing and write the pages yourself – there is a big user community out there to engage with, and you can use data from any WMS including OpenStream; MapServer is similar. To give you a sense of just how powerful OpenLayers and MapServer are, they are the backbone of the Digimap Roam service.
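
(Editorial aside: one quick way to see how lightweight Leaflet-based mapping can be is the folium wrapper for Python – a convenience library not mentioned in the talk. The sketch below uses placeholder coordinates and labels.)

```python
import folium

# Build a Leaflet map centred on Bloomsbury (placeholder coordinates, default Web Mercator tiles)
m = folium.Map(location=[51.5215, -0.1280], zoom_start=15)
folium.Marker([51.5215, -0.1280], popup="Example point").add_to(m)
m.save("bloomsbury_map.html")  # open the resulting HTML file in any browser
```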

So moving on to data, we have several levels here as well. In terms of OpenSource data we are really talking about OpenStreetMap. In terms of Open Data there is OS OpenData, BGS OpenGeoscience, ShareGeoOpen etc. For Freemium the data includes Bing Maps etc. – where the data is good but usage is limited.

There are pros and cons to many of these data sets. OpenStreetMap is really good – even better in urban environments – and the gaps are becoming less and less important as the data improves. There is a lack of consistency, though, when compared to licensed data. But this type of data is called for more and more all the time – the point data in OpenStreetMap is particularly useful if available in your students’ area of interest.

The OS OpenData covers multiple Ordnance Survey data sets. Indeed 8 of the 14 views in Digimap Roam use open data. Data Download makes OS OpenData easier to use, and OpenStream lets you pipe it directly into your web app or GIS. It’s good data and we provide a more usable route into that data for anyone with an @…ac.uk email address.
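
(Editorial aside: for anyone wanting to pull a WMS such as OpenStream into a script rather than a desktop GIS, the general pattern with the OWSLib library looks like the sketch below. The service URL, API key, layer name and bounding box are placeholders, not the real OpenStream details – consult the service documentation for those.)

```python
from owslib.wms import WebMapService

# Placeholder endpoint and key – substitute the real OpenStream connection details
wms = WebMapService("https://example.ac.uk/openstream/wms?token=YOUR_KEY", version="1.1.1")

img = wms.getmap(
    layers=["os_open_background"],          # placeholder layer name
    srs="EPSG:27700",                       # British National Grid
    bbox=(325000, 673000, 327000, 675000),  # placeholder 2 km square
    size=(800, 800),
    format="image/png",
)
with open("backdrop.png", "wb") as f:
    f.write(img.read())
```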

The BGS have taken a different approach – smallest scale data is available for download with larger scale data in online viewers. And we layer that data on OS OpenStream background maps rather than Google, Bing or similar.

ShareGeoOpen is a collection of 214 open datasets – mainly UK based. We would really recommend contributing, or encouraging students to contribute, data there for reuse – it is particularly useful for identifying previous work and avoiding rerunning the same project multiple times.

Geospatial data isn’t, however, all about maps. Most data has an element of geospatial data or can be georeferenced – postcodes, place names, location stamps and IP addresses, for instance, all have a geospatial element. Unlock Places is a way to put a point or polygon to virtually any data that has a spatial element, with global coverage (though better in the UK). And Unlock Text lets you dig out geospatial references within text materials.
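
(Editorial aside: the general pattern for georeferencing place names against a gazetteer web service like Unlock Places is a simple HTTP query. The sketch below illustrates the idea with the requests library, but the URL, parameter names and response fields are placeholders, not the actual Unlock API – check the service documentation for the real interface.)

```python
import requests

# Placeholder URL and parameters – not the real Unlock Places endpoint
response = requests.get(
    "https://example.ac.uk/gazetteer/search",
    params={"name": "Bloomsbury", "format": "json"},
)
response.raise_for_status()

for feature in response.json().get("features", []):
    # A matched place would typically carry a name and a footprint (point or polygon)
    print(feature.get("name"), feature.get("geometry"))
```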

More information: OSGeo runs the FOSS4G conference, and supports projects such as GRASS, Quantum GIS and PostGIS. GoGeo lists most major free software resources. And EloGeo at Nottingham provides learning resources for open GIS tools.

Q&A

Q) Is there a button in Roam to press to let students switch just to open data?

A – Tom) Not quite but the more information or “i” button will indicate the data set and license conditions. We could probably improve how we indicate that.

A – Emma) We have also considered running a whole separate Roam just for open data but we would need a business model for that. If you think that would be useful or have any comments or ideas around that please do come and talk to us.

Q) EDINA is an institution that farms out a whole series of products. There are lots of others as 4G comes together, as these things converge. I’m quite new to this so I’m wondering where you see this going? Where do you see the industry going?

A – Emma) We are a Jisc funded organisation so much of what we do is aligned with their vision, and we are clear that we produce tools for academic use. So we are not as much focused on public domain/freemium products for a wider audience but actually that is a tricky question as it’s not clear exactly where everything will be converging…

A – Tom) But you will see some of that work for the current/near future in our FieldTripGB demo later.

A – Addy) ArcGIS is pushing GIS into the cloud. Ten years ago we wouldn’t have thought of Google as a key mapping provider; now they are also providing GIS in the cloud. Those players are emerging. Roam is in the cloud – which seems to be where the sector is going – but we could look to doing more processing in the cloud.

Comment – Kamie Kitmitto, Mimas) We have GRASS training materials in EloGeo as well.

EDINA GeoServices Review – Emma Diffley

At the moment you will be aware that we provide Digimap, GoGeo, geodoc, ShareGeo, Digimap for Schools – which is literally Digimap, for schools – Unlock, agcensus, and UK Data Service Census Support, the new name for the data we used to know as UK Borders.

The first bit of big news is that we will be withdrawing Digimap Carto on 31st July 2013. It was launched in 1996. We haven’t been able to keep Carto working well and, meanwhile, we have taken the best of Carto and brought it into Roam, which is much easier to use – things like overlays, being able to print to A0, being able to print at your choice of scale. You can now do these things in Roam; in terms of scale there are some limitations, but we now support most formats, or you can download the data and print it yourself at larger/other formats.

We have made some enhancements to Roam – annotations: Save, Open, Import and Export (to various formats). You now have your own maps area, making it much easier to find and access what you have already created. Basemaps are another new thing: if you are familiar with the data there are various flavours of basemap – and you can now pick between any of these that are available at a given zoom level. We have also added printing from Ancient Roam to PDF. And all Roams will now support printing (currently in Ordnance Survey Roam only) from A4 to A0.

In future for all Roams we are trying to make a cleaner interface, keeping it consistent across all Roams. There will be a convergence of look and feel across all Collections. There will be better printing controls. And there will be simpler, combined map and annotation Save/Open between Roams (e.g. accessing the same selected area). And there are additional overlay options coming soon (boundaries, contours…).

We have also made some data improvements. We have the VectorMap Local Shapefiles – we’ve converted all tiles to Shapefile, and layer files for symbolisation are in production. VectorMap Local DWG will be available, which will be good news for AutoCAD users; VectorMap Local is almost but not quite as detailed as MasterMap. All tiles have been converted to georeferenced DWG (with some styling) and will be available very soon. OS MasterMap ITN will see improved performance with reduced file sizes. OS MasterMap Topo DWG is planned here soon. The DWG releases will really be beta releases – we’ve done a lot of work to get this right but we are looking for feedback on these. We’ve made a first step on this though.

We’ve also been making some further tweaks to data. Strategi data has some potential for styling – with features not to be shown at smaller scales, e.g. tourist symbols. And with the Meridian 2 and Land-Form Panorama data we have done VML styling to make it more accessible and easy to use.

We have been harmonising Data Download so that all OS data is provided through one client, encompassing MasterMap Download and Boundary Download (those separate downloaders have therefore been withdrawn), and including all OS OpenData and Digimap licensed data. And we have the new Geology Download coming soon, with more to follow.

We have also made some Help and Support Enhancements. We have a new Resource Centre, which does not require login, with answers to questions, videos, case studies etc.

On GoGeo we have added more resources and highlighting of the “Editor’s Picks”. You can search for data among 20,000 records from data providers around the world. Workshop resources introduce the importance of metadata. GoGeo also searches ShareGeo and ShareGeoOpen for data. ShareGeo Open includes over 210 resources, all of which are open and free to use. You can share your data here and then cite the URI in publications, and use the data in projects, research and teaching contexts. That data gets used, seen and reused.

FieldTrip GB is a mobile app for capturing data – you will get to try that out later!

GeoTagger is a tool to allow you to edit the metadata for your photos – which can then be useful for research or for use in sharing sites. Usually that will be about adding location data, though occasionally you may want to remove that data. The idea is that you can tag images of your fieldtrip, for instance.
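
(Editorial aside: as an illustration of what geotagging a photo involves under the hood – not the GeoTagger tool itself – latitude and longitude are written into the image’s EXIF GPS fields. The sketch below uses the piexif library and placeholder coordinates and file names.)

```python
import piexif

def deg_to_dms_rationals(value):
    """Convert decimal degrees to the (degrees, minutes, seconds) rationals EXIF expects."""
    d = int(value)
    m = int((value - d) * 60)
    s = round((value - d - m / 60) * 3600 * 100)
    return ((d, 1), (m, 1), (s, 100))

lat, lon = 51.5215, -0.1280                     # placeholder location
exif_dict = piexif.load("fieldtrip_photo.jpg")  # placeholder file name
exif_dict["GPS"] = {
    piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
    piexif.GPSIFD.GPSLatitude: deg_to_dms_rationals(abs(lat)),
    piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
    piexif.GPSIFD.GPSLongitude: deg_to_dms_rationals(abs(lon)),
}
piexif.insert(piexif.dump(exif_dict), "fieldtrip_photo.jpg")  # writes the tags in place
```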

Cartogrammar is another project, the idea is that you upload your data and it generates a different interpretation and visualisation of your data. It’s a fun thing to play with and explore your data in different ways.

UK Borders is no longer UK Borders; it is now the UK Data Service Census Support Unit. Underneath it’s the same data and code but it looks different on top. There is a link from the UK Data Service Census Support Unit website, and all applications are now in the UK Data Service style. Functions and data remain unchanged. Training is now available via webinars. All support enquiries should go via the online enquiry page – those eventually come to us but they mean the UK Data Service are able to track those requests and provide support. Some aspects of the Census programme as a whole are still in flux, but that UK Borders data is still there for now.

So, what’s on the horizon? Well, we are still awaiting news on funding. There are still lots of things we would like to do. If there are things you would like us to do we really want to hear about them, as your recommendations carry real weight for us. And we are keen to hear your ideas on new data. We are looking to “mobilise” more. And we will be continuing to add new support materials.

Q&A

Q – Shelley) Can you say a bit more about agcensus?

A) It’s actually one of our oldest services. It provides agricultural census data – things like numbers of sheep, cows, grain etc. It’s simple data but useful. It comes from…

Q – Shelley) Could you use this for teaching animal management – how many sheep on a chalk bank in a particular area for instance?

A – Tom) It would be amount of meat harvested per square metre. Would that be appropriate? And there is a long history of this type of data, and long term data.

A – Shelley) That would be useful, particularly for comparing rural agricultural yields with urban yields for instance.

Comment – Emma) It is a subscription service. There is an institutional price or there are variations – institutional, personal, project and one-time subscriptions are available.

Q – Shelley) Can you say a bit more about bringing environmental data into Digimap?

A) We had undertaken to include land cover data in our services, but the financial landscape has changed since we took that decision. The data is ready to go but we are waiting for an official go-ahead to release it.

A – Shelley) It’s data that we need in our work and our department for sure so we’d love to see that in there.

Comment – Tom) I’d love to hear more about the printing interface – what you and your students might like to see. Should it show less? Should it show more? It’s so hard to get right. That preview needs to look like what will be printed – but it can’t be perfectly the same so what makes the most sense and is most usable?

Comment – Emma) We need area, paper size and scale to all feed in here. There are limitations between those factors. People want to see exactly the area at the same scale – but you can’t get A0 full size on a screen. You can’t show the extent and the scale on the same image. We did think about tabs for different views but that’s not ideal either. Really you cannot have a WYSIWYG interface – you will get what you request but we can’t display it in the preview appropriately.
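
(Editorial aside: the arithmetic behind that trade-off is simply that ground coverage is paper size multiplied by scale, which is why extent and scale cannot both be previewed faithfully on a screen. A quick worked sketch, using example paper sizes and an example 1:1250 scale:)

```python
# Ground area covered by a print = paper dimensions x map scale
paper_mm = {"A4": (210, 297), "A0": (841, 1189)}
scale = 1250  # e.g. a 1:1250 large-scale print

for size, (w_mm, h_mm) in paper_mm.items():
    width_m = w_mm * scale / 1000   # mm on paper -> metres on the ground
    height_m = h_mm * scale / 1000
    print(f"{size} at 1:{scale}: {width_m:.1f} m x {height_m:.1f} m on the ground")

# At 1:1250, A4 covers roughly 0.26 km x 0.37 km and A0 roughly 1.05 km x 1.49 km
```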

Comment) On most print previews you see the relationship between the preview and the page – can you do that?

A – Tom) The thing is there is no way to do that correctly unless you generate the PDF – which will take the same amount of time to generate the preview as to generate a PDF. But there’s no limit to how many PDFs you can create!

A – Emma) We have an issue with both Digimap and Digimap for Schools here – a nervousness that PDF is the same as “print”, so people are reluctant to generate PDFs. But we need to think about that.

Comment) How about just generating the PDF and showing that as the preview?

A – Tom) That would be great but presupposes the user has Adobe Acrobat

Q) Is there any way to make the connection between Roam and Data Download for OS maps seamless?

A – Emma) The way that data download works is much more like that. We are not quite there yet but getting there.

Comment) When you purchase a map it appears to work in that way – you click the picture and order the DWG.

A – Tom) We did wonder if providing DWG as a print format might be appropriate. These are things we are considering.

Q – Kamie) How do you think about things like WMS rather than data download? Instead of downloading, you access the data over the web via your GIS or tool. This is the general trend for Google, Esri, etc. It probably makes sense to move towards that, and some industries may value those skills.

Comment – Shelley) I think that would be a lot easier actually.

A – Emma) We do have OpenStream… Data providers like this format and there is a move towards streaming data rather than downloads. We are working on this stuff.

FieldTrip GB Excursion – led by Addy Pope

This will be a slightly briefer live blog section as this is a hands on session trialling FieldTrip GB across the streets of Bloomsbury. Pictures illustrating the session will follow however. 

Today we will be using FieldTrip GB to collect data and then we will come back here and visualise this data.

FieldTrip GB lets you collect georeferenced data – whether you have a data connection or whether you are somewhere remote where you will not be able to upload data until you return. FieldTrip GB works with a Dropbox account to allow you to upload and share data. The mapping used is OS OpenData, OpenStreetMap and other open source data all combined together, optimised for a 4 inch screen and with rural coverage carefully considered. There is one mapping stack to make things easy and friendly across disciplines. This was a challenge as detail in urban and rural areas varies, but we have used MapServer 6.2 masking. It works brilliantly except when you are on a boundary between areas – it works well for most people but it’s unfortunate if your mapping area is always on the boundaries of those maps.

You can select an area and download data ahead of a fieldtrip. At the moment there is a limit of 3 x 15MB downloads – enough data for most of the greater London area or the whole of the Lake District. We are open to suggestions there though.

We have also created an authoring tool – here you can create your own forms for data collection, which allow you to set up fields, mandatory or optional, and you can provide hints for filling in those forms. The idea is that the forms should minimise effort in the field. You can collect data, use scales, include images etc. There are many options. And once your data is collected you can sync it to Dropbox, use it, export it, and work on it.

So, for the complicated part. I will author a form, you will log into it, you will collect data, we will come back and visualise that data on Google Earth. It worked with the last group so fingers crossed. The last group collected pictures of bicycles – which was quite easy – what should we look at? Suggestion: Buildings and number of floors in said buildings. We will do that as a range, which will be a drop down box. And we can gather building fabric – a multi choice selector. And we will include an image capture.

So I have saved that, and it has synced to my Dropbox account. So you will log in to that account and be able to access that form… and that having been done we shall head outside…

[cue a short excursion across Bloomsbury]

So, basically, that’s everything you need to do. The files just get synced to Dropbox. Then there is a record viewer in the authoring tool. You can view the records, you can tweak the records, and you can make any changes or corrections required here. You could change the picture – to a better angle for instance – but that’s trickier; it is probably better to create a new point and delete the old one. The next thing to do is export our data. We will export as KML and then just load it into Google Earth. You can then view the data and images.
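
(Editorial aside: for anyone who wants to script the same final step – collected points out to KML for Google Earth – the sketch below shows the idea with the simplekml library. The field names and coordinates are placeholders, not FieldTrip GB’s actual export format.)

```python
import simplekml

# Placeholder records standing in for points collected in the field
records = [
    {"name": "Example building A", "floors": "10+", "fabric": "Stone", "lon": -0.1275, "lat": 51.5213},
    {"name": "Example building B", "floors": "3-5", "fabric": "Brick", "lon": -0.1290, "lat": 51.5220},
]

kml = simplekml.Kml()
for rec in records:
    point = kml.newpoint(name=rec["name"], coords=[(rec["lon"], rec["lat"])])  # (lon, lat) order
    point.description = f"Floors: {rec['floors']}, fabric: {rec['fabric']}"
kml.save("fieldtrip_records.kml")  # open this file in Google Earth
```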

So, in half an hour we have installed the app, used the app, edited the data and visualised the data. That’s not bad.

Q&A

Q) Does the date and time stamp carry with those records?

A) Yes, it is captured but we could expose that better.

Q) Is it UK only?

A) At the moment it is, but we want to make it work globally – the issue is how best to do that. We are considering overlays which would let you connect up that mapping. But we wanted it to mirror the scope of Digimap – hence FieldTrip GB.

Closing Remarks – Emma Diffley, EDINA

Thank you so much for coming along, we know it’s a long way for some of you. We had some really interesting ideas from Shelley and her colleagues – we learned the phrase “BIM-ed up”. And myself and my colleagues really appreciated the opportunity to talk to you all during lunch time, to find out what you are doing. And I know we bang on about feedback but your feedback and comments are the best evidence to support and justify our existence. We really do appreciate that feedback.

And that’s #geoforum2013 finished. Huge thanks to all who came along in person and all who have been following online. We really appreciate your time and input.


Geoforum 2013 on the 20th June: Don’t miss out!

June the 20th is fast approaching so make sure you don’t miss out on your place at Geoforum, at the Congress Centre, Great Junction Street, London:

Geoforum 2013

Highlights of this year’s event include a fascinating keynote speaker in Shelley Mosco, who will be showing us how Digimap data is used in landscape architecture. Shelley will also be explaining how our data can be used for Building Information Modelling (BIM), which is becoming a compulsory requirement in the construction industry.

We will be taking you outside to show you the new FieldtripGB app, so don’t forget your smartphone if you have one. We think that you and your students will find this a very valuable resource for the collection of data in the field.

There will be a sneak preview of the important changes that are happening to the Roam interfaces over the summer. The Roams are having some great new features added and a new, easier to use layout – don’t miss out on this chance to get a first look.

There will also be an overview of the OpenSource GIS and OpenData resources you should be looking at for teaching and research. These resources are becoming increasingly important, and requests for assistance from the EDINA help desk are on the rise.

All the details including the booking form can be found here:

http://edina.ac.uk/events/geoforum2013/

We look forward to seeing you there!
