Lessons Learned

Most of this has been covered in the previous post, but it is worth pulling out a number of key things that we have learned through the USeD project.

  1. usability can save you time and money during the development of a new application
  2. external consultants can be an effective way of buying in skills if you do not have them “in house”
  3. external consultants can be used to up-skill project staff
  4. however well you think you know your users/sector, engaging with users will always reveal something unexpected
  5. users may be using your service for something other than its primary purpose. This may be because they don’t know there is another service that would be better suited, or because your service is the best thing out there that almost does what they want
  6. personas work, even with contrived names such as Explorer Evie or Work-around Walter.  These make it easier to discuss issues and problems with the project team and relate them back to a “real” user.
  7. user testing points out the blindingly obvious which was not obvious until you started testing
  8. you can salvage something from a user test even if it seems to be going badly wrong
  9. you don’t need more than 5-6 users to test an interface; by the 4th person you are uncovering very little in the way of new issues.
  10. write up user tests immediately, important information seeps out of your mind in a short space of time
  11. usability labs need not be expensive
  12. effective documentation makes buy-in from stakeholders much easier.
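The 5-6 user rule of thumb in item 9 has a well-known statistical basis: the commonly cited Nielsen–Landauer model estimates the proportion of usability problems found by n testers as 1 − (1 − p)^n, where p is the average chance that a single tester uncovers a given problem (around 0.31 in Nielsen's data). A quick sketch of that curve (the figures are the model's published defaults, not USeD data):

```python
# Sketch of the Nielsen–Landauer model of usability-problem discovery.
# p is the average probability that one tester uncovers a given problem;
# 0.31 is the commonly cited default, not a figure measured by USeD.
def problems_found(n, p=0.31):
    """Expected fraction of usability problems found by n testers."""
    return 1 - (1 - p) ** n

# The curve flattens quickly: each extra tester adds less and less.
for n in range(1, 7):
    print(f"{n} testers: {problems_found(n):.0%} of problems found")
```

By the 5th or 6th tester the model predicts roughly 85-90% of problems have already surfaced, which matches our experience that the 4th person onwards uncovered very little that was new.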

I think I will leave it there; I may come back to this list and add a couple more items.

Success and how to measure it

So, has the USeD project been successful?  Has it achieved what it set out to? These questions are always hard to answer.  How do you measure success?  Success can be quantitative but is often a subjective metric.  To attempt to answer these questions we will have to break success down a bit.

What did we set out to achieve?

In the broadest terms, we probably wanted:

  • a more usable interface for Digimap Data Downloader
  • to develop usability skills in house
  • to promote the use of usability as a tool for effective service development

A more usable interface

Well, I think this is a big tick for success.  We believe we have managed to design an interface that fits the needs of our personas and is both easy and intuitive to use. Users we engaged with during testing stated that the test interface was much easier to use and that it was clearer what data they would get back.  Our final version seemed to allow novice users to explore data products, and they reported that it helped them learn about spatial data.

We released version 2 of the new download interface as a beta service in December.  This might seem an odd thing to do, to release an interface midway through usability testing, but we wanted some of the functionality it added to be available to the community as soon as possible.  This also allowed us to gather feedback from our users.  Below is some of the feedback we have received.

“Data Download Beta beats the old version hands down as far as I’m concerned. The rapidity with which you can select a map extent and download all of the relevant mapping data in one go is by far much better than the slow and more manual way things used to work. Top notch stuff.”  Lecturer – Northumbria University.

So, the interface is easier to use and helps novice users learn about spatial data and standard geospatial terminology.  Making the interface “learnable” was very important, as we know that users may well be using spatial data for the first time when they download data from Digimap.  Many users will return to Digimap to get more data through the course of their studies.

Develop usability skills

EDINA doesn’t have a dedicated usability expert.  We engaged with an external usability expert to work with us in the project.  One aim was that the usability expert would mentor the project team so that they could do usability studies by themselves in the future.  This worked well and the usability expert increasingly became an observer, giving feedback and advising on best practice.  It is the aim of geo-services to maintain a link to the usability expert so that we can consult him on usability issues that we do not feel confident dealing with ourselves. In fact, this has already happened. We have discussed how best to conduct usability on mobile apps that we are looking to develop.

Embed Usability in development at EDINA

Although not explicitly mentioned in the project plan, USeD hoped to embed usability as a core part of development work at EDINA.  It is not that EDINA doesn’t recognise usability as an important element of the design process, rather that time and resources are always tight and it is just “another thing” that needs done.  The USeD project has demonstrated that you can use usability to steer the development of a project, and that this can save time and money.  Knowing who you are developing for and what they want is vital if you want your end product to be useful. The personas highlighted that we knew our users pretty well; however, we also found that some of our users were making life more difficult for themselves by not using the best service for the task they wanted to complete.  This is perhaps a failing on our part as the service provider.

The USeD project has allowed us to look at how users interact with our interface and revealed some interesting issues.  I don’t think there were any huge problems with our initial interface, but small problems distracted and annoyed users, preventing them from getting the data they wanted.  Being able to demonstrate this to developers, perhaps by having the developer watch the user testing, has resulted in a better appreciation of what the user needs or wants from the interface.

The USeD project has demonstrated that usability need not be time consuming and costly. In addition, usability should complement the work of the graphic designer, making sure their strong visual branding is retained.

The fact that geo-services at EDINA is implementing usability on a new project shows that the USeD project has achieved the first step of embedding usability in product development.

Conclusions

To summarise:

  • we have a new interface that users think is much better than the old one,
  • we have developed in-house usability skills and have fostered a good relationship with a usability consultant that we can call on when needed
  • we have started to use the skills we have gained on other projects within geo-services at EDINA.

That sounds like a success.  What we will do over the coming months is to monitor the use of the old interface versus the new interface. Hopefully users will migrate to the new interface.  In addition, we will monitor feedback from users, particularly when the old interface is removed from the service.

 

Project Recap

With the USeD project drawing to a close it is a good time to recap on what we set out to achieve and how we went about it.

Overview

The USeD project aimed to improve the usability and learnability of a user interface which enabled users to download spatial data from the Digimap service. The current data downloader is a popular service but is perhaps not the most user friendly.  It was designed around the technical constraints of the software of the time (2002) and the requirement that it had to integrate with an inflexible server-side database.

Its replacement would be designed around the needs of the user but would still have to integrate with a server-side database. However, the new database is more flexible and data extraction is far simpler.

The new interface must serve all users, from experienced users who know exactly what they want to complete novices who are perhaps less confident working with spatial data.  The interface should therefore be intuitive and learnable, allowing users to explore some of the advanced functionality as they gain confidence. You can read a detailed summary on the About USeD page.

Personas

The first task was to interview some users and create a set of user personas. 20 users were interviewed and this resulted in 5 distinct personas.  The personas would be used to shape the user requirements and would be used to steer the design of the interface throughout the project.  You can meet our personas on the persona page.

Design Specification

The design specification can be divided into 2 parts: user requirements and a technical specification.  The user requirements were derived from the personas.  For each persona we had created a list of “person X wants to” and “we would like person X to” statements, which made it quite a simple task to put together an initial list of requirements.  We grouped the requirements into:

  1. a user must be able to
  2. a user should be able to
  3. a user could be able to

Grouping the requirements like this gave the engineers an idea of the importance of each requirement which made it easier to justify spending more time implementing small functions that were deemed to be a must. The user requirements documentation can be found here
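To make the grouping concrete, here is a hypothetical sketch of how a must/should/could list (essentially MoSCoW-style prioritisation) can be represented and filtered when planning development work; the requirement texts below are invented for illustration and are not the project's actual list:

```python
# Hypothetical sketch: requirements tagged with a must/should/could
# priority, so engineers can filter by importance. The requirement
# texts are illustrative only, not the real USeD requirements.
PRIORITIES = ("must", "should", "could")

requirements = [
    ("select an area of interest on a map", "must"),
    ("preview the selected data product", "should"),
    ("save a basket for a later session", "could"),
]

def by_priority(reqs, level):
    """Return the requirement texts at a given priority level."""
    assert level in PRIORITIES, f"unknown priority: {level}"
    return [text for text, priority in reqs if priority == level]

print(by_priority(requirements, "must"))
```

Filtering like this makes it easy to see at a glance which functions justify extra implementation time because they are a "must".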

The technical review focused on the software and libraries that could be used to make the new interface more attractive and interactive. The server side database had already been updated so new tech had to integrate with this.

Prototype or Full Build?

This was a key question in the project. Do we use wire-frame mockups to show different designs or do we use a fully functioning test site?  We went with the full build, as we suspected that there would be issues surrounding the strong visual map window and the expectation of what the user would receive in their order. It was felt that a wire-frame would not address these issues. Building fully functioning test sites involved far more developer time and effort, but it was certainly worth it.

Iterative User Testing

We used task-based testing to explore the usability of the new interface.  We started with an expert review from our usability consultant, which caught a number of issues that we had missed. The task-based testing engaged with real users. Each user had 45 mins to complete a number of tasks and we tried to have 6 people per session. The interface was then modified between sessions. We ran 3 sessions and saw our “problem points” migrate from the initial screen through to the ordering process. This was encouraging, as it suggested that users were able to progress further in each successive session before they ran into problems. The user testing is described in detail in a number of posts.

Project hand-over

Handover – Tableatny @ Flickr

At the end of the project we will hand over our findings to the Digimap Service team. The hand-over will be a document that outlines a number of improvements that can be made to the existing interface. Each recommendation will be ranked as High, Medium or Low.  Each recommendation will address an identified issue in the current interface and will suggest a solution which has been implemented and tested during the USeD project.  Where multiple solutions were trialled, a brief summary will be given to justify the final suggestion.

This style of documentation proved to be a very effective way of suggesting improvements to the development team.

 

Results of UI testing on Version 2

So, you think you have a good, usable product which clearly sets out what the user has to do to get what they want… and then you do some user testing.  The UI testing on Version 2 of the downloader was extremely useful; it pointed out many things that we had missed and which now seem just so obvious.  This post will outline the main points that emerged from the testing and will describe how we ran the tests themselves. But before we start, it is important to remember that the tests revealed many positive things about the interface, and users thought it was an improvement over the current system.  This post will now concentrate on the negatives, but we shouldn’t be too depressed.

Setup

We decided to run this UI testing in a different configuration from the one we intend to use for the tests with external students.  We wanted our usability expert to be able to guide us through the test so that we would conduct it using best practice.  Viv was to be the “facilitator” and Addy was the “observer”.  David was observing everything and would provide feedback between tests.

We had 5 candidates who would each run through 5 tasks during a 40–50 minute period. We left 30 minutes between each test to allow us time to get feedback from David and to discuss the tests.  As it turned out, the day was quite draining and I wouldn’t recommend trying to do more than 6 candidates in a day.  Your brain will be mush by the end of it and you might not get the most out of the final sessions.

Results

The tests went well and we improved as the day went on thanks to feedback from the usability expert, David Hamill.  It was certainly useful to have David facilitate a session so that we could observe him in action.

The participants all said that they thought the interface was easy to use and quite straightforward. However, it was clear that most users struggled with the process of:

  1. selecting an area of interest
  2. selecting data products
  3. adding these products to the basket
  4. submitting the order

As the primary role of the interface is to allow users to order data this seems to be an area that will need significant investigation before the next iteration.  Other issues that arose during the sessions include:

  • The “Search and Select An Area” panel still seemed to confuse users.  Some struggled to see that they had to actually select an area in addition to just navigating to it using the map
  • Basket Button looks busy and is not prominent enough.
  • Download limits not obvious to the user
  • Users often couldn’t recover from minor mistakes and “looked for a reset button” (technically you don’t need a reset button but the users didn’t know this so this needs addressed)
  • Preview Area in the Basket was not all that useful, the popup covered the map which showed the selection. In addition to previewing the geographical extent selected, this should also preview the data product selected.
  • Make the info buttons easier to browse through
  • Add more information to the “Use Tile Name” section, perhaps investigate how we can integrate this with the view grid function on the right of the map window.
  • Add a clear all button to the basket area.

A detailed report of the main issues that emerged during the user testing can be found in the Version 2 Testing Report (pdf).

The testing session was a success on two levels.  Viv and I learnt a great deal about conducting UI tests by having the usability expert present, and we identified some key areas of the interface that were causing users problems.  Most of these are glaringly obvious once they have been pointed out to you, but then that is the point of UI testing, I suppose!