nesta:net – a new intranet for Nesta

Nesta

I’ve been working with Nesta to put a new intranet in place. It’s been a smooth project, delivered on time and ticking all the requirements boxes.

But I've written enough blog posts showing how organisations are saving money by deploying GovIntranet. This time, I thought I'd share some of the lessons we at Agento have learned running a project as a small business.

Parliament Week website. User experience design in action.

You know that a project is going to go well when the first paragraph of your client’s brief calls for ‘focusing on key user needs and using open source platforms and technologies to deliver improved value for money.’

I wrote a case study about the new Parliament Week website on the Helpful Technology site. Please take a look there for the main details of the project. On a more personal note, I wanted to write more about what went on behind the scenes during the rebuild of this website.

Parliament Week is an annual, week-long programme of events around the UK that has run for the past few years. The team who organise the programme had to cope with an inflexible CMS for their website, on top of all the administrative tasks of running such a site: liaising with partners, managing online submissions and updating manual spreadsheets of partners and events.

Here are some snapshots from the Wayback Machine for the past 3 years:

After our first meeting with the team, our interface designer drew up some wireframes to illustrate our proposed layout and functionality for the new website. After we discussed these with the PW team, we agreed appropriate changes and built a working prototype in WordPress, which we tested with members of the public. We made changes based on the results and then applied new visual designs to the prototype site, used for a second round of user tests.

Homepage

We designed the new homepage wireframes, copying the carousel from the old site, with space for a full-width image to highlight events. The team wanted partners to be able to submit their own events on the site. With hundreds of events anticipated, expecting partners to upload a high-resolution image large enough to fill a big desktop screen was a little over-ambitious on our part. And we don't like carousels anyway. So we replaced this main section of the homepage with metro boxes, which we had used successfully on other websites. I think fixed boxes are better than a moving carousel: they show everything at once, and they don't move without cause.

After the first round of user testing, the PW team wanted to make it easier for partners to get on board, so for the second round of user tests (with the visual design in place) we replaced the space for blog posts with an area for partners. There wouldn't be many up-to-date blog posts on the site when it launched.

We also made a few usability tweaks. Some people didn't scroll; they thought the top six boxes were the whole homepage. This didn't happen in the prototype testing, but in the visual design the boxes sat in a solid band of colour that filled a desktop monitor nicely, and some people didn't realise there was more beneath. With a bit of resizing and some jaunty angles, this improved in subsequent tests.

Events

The wireframe of the events page contained a full-width, interactive Google map, which we used in the first round of user testing. It soon became clear that people had trouble scrolling: with a wheelie mouse, they got stuck while trying to scroll the page and ended up zooming the map in and out instead. We rearranged the layout for the second round of user tests.

We also needed a way of showing events that didn't last for a single day but straddled days, weeks or even months. But Parliament Week is only on for a week, I hear you say. However, some organisations have longer promotions that run for a month or two before the actual week. So we split the events listing page into two columns: single-day events on the left and longer events on the right. Single-day events are listed in date order and, as they pass, they drop off into the archive. Longer events are listed in order of those ending soonest.

The interface allows people to filter the events right down to a single day. In this case, the left-hand column shows all the events happening only on that day and the right-hand column shows events which are taking place on that day but which might have started weeks earlier or finish days later.
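As an illustrative sketch of that listing and filtering logic (the event shape, field names and date handling here are my own hypothetical assumptions, not the actual WordPress implementation), it might look something like this:

```typescript
// Hypothetical event shape – on the real site these are WordPress posts.
// Dates are assumed to be day-granular (midnight timestamps).
interface PWEvent {
  title: string;
  start: Date; // first day of the event
  end: Date;   // last day of the event
}

// Two-column listing: single-day events on the left in date order;
// longer events on the right, ordered by whichever ends soonest.
// Events whose end date has passed drop off into the archive.
function splitListing(events: PWEvent[], today: Date) {
  const upcoming = events.filter((e) => e.end.getTime() >= today.getTime());
  const singleDay = upcoming
    .filter((e) => e.start.getTime() === e.end.getTime())
    .sort((a, b) => a.start.getTime() - b.start.getTime());
  const longer = upcoming
    .filter((e) => e.start.getTime() !== e.end.getTime())
    .sort((a, b) => a.end.getTime() - b.end.getTime());
  return { singleDay, longer };
}

// Filtering down to a single day: anything active on that day qualifies,
// including longer events that started weeks earlier or finish days later.
function eventsOnDay(events: PWEvent[], day: Date) {
  const active = events.filter(
    (e) => e.start.getTime() <= day.getTime() && e.end.getTime() >= day.getTime()
  );
  return splitListing(active, day);
}
```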

And we had the same "below the fold" problem as on the homepage. After filtering events by location and getting the results, people thought that the map next to the filter criteria was all that was available. They didn't scroll down the page to see the resulting list of filtered events; the positioning of the results area felt like an end-of-page boundary. Again, a bit of resizing and some literal signposting improved this.

I thought "below the fold" had gone away. I thought we didn't have to worry about that golden line any more, because people scroll. And of course, they do. But it just shows how design can either help or hinder the user flow, depending on the cognitive cues it gives.

Event submission form

Prototype event submission form
Final event submission form

In the initial round of user testing, people needed a little more help with filling out the form. They made incorrect assumptions about the address, entering their organisation's address instead of the event venue's, and they were confused about the dates. The new form, while having more text, aims to be more helpful.

After launch

The Parliament Week 2014 website is now gathering momentum with under 2 months left until the week of events this year. We launched with a few events already in place and it has been satisfying to see new pins appearing on the events map as partners sign up and submit their events.

The online partner registration and event submission forms are streamlining the former administrative processes. The team now just have to tick a box and publish an event. They can export up-to-date spreadsheets of the data. And notification emails are handled automatically.
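Just to illustrate the kind of export involved (a hypothetical sketch, not the plugin or code the site actually uses), a spreadsheet export is little more than turning event records into CSV rows:

```typescript
// Hypothetical row shape and CSV export – the real site handles this inside WordPress.
interface EventRow {
  title: string;
  partner: string;
  date: string;
  venue: string;
}

function toCsv(rows: EventRow[]): string {
  // Quote every field and escape embedded quotes so spreadsheets open it cleanly.
  const escape = (s: string) => `"${s.replace(/"/g, '""')}"`;
  const header = ["Title", "Partner", "Date", "Venue"].map(escape).join(",");
  const lines = rows.map((r) =>
    [r.title, r.partner, r.date, r.venue].map(escape).join(",")
  );
  return [header, ...lines].join("\n");
}
```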

And the team will soon be reaping their value-for-money rewards again, as we work with them to reuse the custom WordPress theme for a similar website, saving substantial development time and applying a few lessons learned along the way. This time the website is based around tea parties commemorating 800 years since Magna Carta – 'The Great Charter of the Liberties of England.' Hurrah!

 

Mobile usability: Usability Week day 1

Just got back from day 1 of Usability Week in Edinburgh. Thumbs up for today's session on designing websites for mobile usability, by Amy Schade, who got us through 188 slides and kept it fascinating all the way.

I've been working in the field of usability and user experience for over 10 years now, but I don't have much professional usability experience when it comes to mobile. Today's session was informative and useful. I've come away brimming with ideas and also feeling a little chuffed about our freshly released version of the website, which has been designed with mobile users in mind. I was a little concerned about whether we had made the right choices on some points of the mobile design, but after today I feel that we've made a very good start in getting the website into the mobile arena.

Homepage hub

Responsive design

When designing a mobile website, the choices are to do nothing and let your website look exactly the same, only smaller, on a mobile phone; to go for a full-on, separate mobile version of the site; or to use responsive design, which allows pages to reformat themselves depending on the device used to view them. Of course, there are apps and web apps – but we're talking websites here. We opted for responsive design so that we can repurpose the same website on desktop, tablet and smartphone devices. This is convenient for us, with limited resources for maintaining several different versions of a website. But on the downside, it does mean that some content remains in full-website format when it could be cut down for mobile reading.
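Responsive design is normally handled with CSS media queries, but as a rough TypeScript sketch of the same breakpoint idea (the 640px breakpoint and the class names are made-up assumptions, not what our theme actually uses):

```typescript
// The same page content is reformatted per device by toggling layout classes
// at a width breakpoint – a crude stand-in for what CSS media queries do natively.
const mobileQuery = window.matchMedia("(max-width: 640px)");

function applyLayout(isMobile: boolean): void {
  document.body.classList.toggle("layout-mobile", isMobile);
  document.body.classList.toggle("layout-desktop", !isMobile);
}

applyLayout(mobileQuery.matches); // set the layout on page load
mobileQuery.addEventListener("change", (e) => applyLayout(e.matches)); // and on resize or rotation
```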

From the start of the website redesign project we wanted to retain the ability to view the full website on a mobile phone, in addition to the responsive version. So it's good to know that this is recommended practice for mobile sites. But we have positioned the link at the top of the page, whereas the recommendation is to put it at the bottom – which makes sense, as it's not one of the most important things fighting for space at the top.

Navigation

There are three popular ways to approach navigation on the homepage of mobile sites: a menu button that expands, a horizontal navigation bar or a homepage hub; or a combination of these. We went for the homepage hub approach, which lists the main site options on the homepage and suits sites with deeper structures, varied topics of content and task-oriented use. It means that you have to return to the homepage to start looking at another topic or task, which is fine for our users, who typically use discrete areas of the site.

Menu link skips to navigation at bottom of page

I was keen to find out whether our treatment of regular navigation throughout the rest of the site had been done well. We use a "Menu" link at the top of every page which skips to the menu navigation at the bottom of the page, and a "Top" link sits with that navigation block to return you to the top. This technique was first seen on accessible websites, letting users jump around the page between navigation and textual content, and it's nice to see it used again, showing once more that designing for accessibility makes for helpful navigation for all types of users. And according to NN/g this is an appropriate approach because it promotes content at the top of the mobile page while still allowing people to navigate quickly through the site.

User testing

There was also a session on user testing of mobile websites which gave me some good tips such as how to go about recording testing sessions, what equipment to use and how to avoid pitfalls like recording someone using a mobile phone only to find that all you’ve recorded is the glare from an overhead light on the surface of the phone.

Mobile-optimised prison finder

Future development

There is more that we can do to the mobile version of the website in terms of flattening navigation structures and minimising clicks (or taps). There may even be a case for developing web apps for some of the areas most used by mobile visitors – things like contacts, job search, find a prison, maps and localised content provision.

I’m very much looking forward to the rest of the week when I’m attending a few more tutorials on mobile and touch, plus some more psychological learnings.

A/B tests on user group signup adverts

To recruit people to our user group we ran a display advert on the website. I created an A/B test using 4 different adverts. One of them was a runaway winner.

The aim of the advert was to encourage people who use the website to sign up and become part of the user group. I tested the baseline against three variants on other popular pages throughout the website. The baseline advert ran all the time on the homepage, but the homepage is not our most visited page, with just 3% of total site visits compared to our jobs page at 11%. So the A/B testing ran on the A-Z, contacts, forms and jobs pages, as well as several other popular content areas of the site.

Here are the contenders:

Baseline
Variety 1
Variety 2
Variety 3

Baseline [blue] The baseline advert was a text box with simple content and a clear call to action.

Variety 1 [yellow] The first variety of the advert was a graphic showing the website homepage and touting the user group. Wording in the advert hit on the senses (take part, view, hear) and the text was centred.

Variety 2 [orange] The second variety was similar to the first with the same wording but used a photo of a test tube. The wording was left-justified.

Variety 3 [green] The third variety was text-based, carefully-designed to appear like a badly-written personals ad clipped from a newspaper.

After running the test for a few days I removed one of the poor-performing varieties. After 9 days, I removed the remaining poor-performing advert and ran a follow-up test with the baseline advert and the likely winner. The total test took 12 days.

A/B testing - round 1

The results were conclusive, almost from the start of the test. The personals ad outperformed the other adverts with a conversion rate of 1.33%. According to Google Website Optimizer, this winning advert had a 99.9% chance of outperforming all the others. The baseline was the closest contender with a 0.48% conversion rate.
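For context on what a figure like that 99.9% means: Website Optimizer reports its own "chance to beat" statistic, but a plain two-proportion z-test is one simple way to sanity-check a comparison like this. The visitor and conversion counts below are hypothetical placeholders chosen purely to mirror the rates above, not the real figures from this test:

```typescript
// Compare two advert variants on conversion rate using a two-proportion z-test.
interface Variant {
  shown: number;       // how many times the advert was displayed
  conversions: number; // how many sign-ups it produced
}

const rate = (v: Variant): number => v.conversions / v.shown;

function zScore(a: Variant, b: Variant): number {
  const pooled = (a.conversions + b.conversions) / (a.shown + b.shown);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / a.shown + 1 / b.shown));
  return (rate(b) - rate(a)) / se;
}

// Hypothetical counts only, picked to match the 0.48% and 1.33% conversion rates.
const baseline = { shown: 10000, conversions: 48 };   // 0.48%
const personals = { shown: 10000, conversions: 133 }; // 1.33%
console.log(zScore(baseline, personals).toFixed(1));
// z is roughly 6.3 here; anything above about 3.1 corresponds to >99.9% one-sided confidence.
```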

This is another test that shows how plain text beats graphic content and how users are often blind to snazzy, photographic advertising. On this occasion, both text-based adverts outperformed the photographic adverts. One could still argue that the winning advert is a graphic. But it only depicts words and characters, with a dirty grey background.

Budget usability testing can be fun

Where I work, we don't have usability labs, eye-tracking equipment or even webcams or screen-capture software for testing information architecture designs with people. Resorting to budget user testing techniques can still provide valuable insights, which in turn lead to recommendations for improvement.

I've had two days of user testing this week, both in-house and out on-site. I've met people from the police force, probation officers, lawyers, barristers and more: a good mixed bag of users from different sectors. It's also nice to do testing where people usually work.

The in-house testing consisted of half hour sessions throughout the day in what seemed like a production line at times, with people queuing outside the testing room door waiting for their slot.

I like to spend time at the beginning of a testing session getting to know the person who is doing the tests: finding out about their job, which websites they use and the types of information or online services that they need. I never script my testing sessions and always try to make the tasks applicable to the person sitting in front of the screen.

It’s satisfying when people complete a task. It is also satisfying to see people making the same mistakes or getting stuck in the same place during tests. Problems on the site become apparent, which means that we can iron them out before going live.

During the in-house testing, with half an hour for each participant, it got to the stage where people did not want to leave! They were actually having fun trying to complete the tasks I was setting, and even though I make it very clear at the start of a session that I am not testing the person, they still go into "game-mode", enjoying completing the tasks and the occasional bit of role playing. Most people wanted more. Some were disappointed to have to finish.

The most tasks that a participant managed to tackle in one session was 16. Tasks typically involve trying to find information on the site, answering questions using the site or performing a function such as ordering a publication. While participants are completing tasks and moving through the navigation, in addition to watching what they do, I have to write it all up. One participant, remarking on my notes, asked me if I was counting the number of clicks. I replied that no, I was more interested in whether they could complete the task and how efficiently they could do it.

Over the years of user testing sessions I have developed a system of shorthand notes that I use. I’ve included some snaps of the notes from the testing session this week which I have highlighted with colours to demonstrate the different symbols.

User-testing markup

Ticks (green) represent where the participants are confident that they have reached the end of the task (I also include a cross if they have not in fact reached the end of the task). Depending on the speed of the participant, I generally try to write longhand names of menus and navigation elements, indented to show hierarchy.

Crosses (red) indicate where participants reach a dead end or give up searching down a particular route. A cross followed by a hook (yellow) means that they hit the back button. I also use smileys and sad faces (blue) to indicate sentiment.

I use a square block system (orange) to indicate points in the task where participants take a long time searching or scanning a page. The length of time is represented by the number of sides to the square, with the inside filled with two diagonals when exceptionally long times are noted. I also note any observations made by the participants and encourage them to be verbal in their pursuit of task completion, so that I can get a better idea of what they are thinking.

This method of writing up actions and indicating lengths of time, success, failure, sentiment and back-tracking means that after the sessions are over, I can mark up each set of tests and get scores for total attempts at a task, correctly completed tasks, passes and back-tracking. This helps me to arrive at an overall measurement of effectiveness (how many tasks were completed correctly) and efficiency (how many tasks were completed correctly first time). I also get a clear picture of the areas which are not working and why.
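The arithmetic behind those two scores is simple. As a sketch, with a hypothetical task record transcribed from the marked-up notes (my actual tallying is done on paper):

```typescript
// One record per task attempt, transcribed from the shorthand notes.
interface TaskAttempt {
  completed: boolean;          // a green tick: the participant genuinely finished the task
  completedFirstTime: boolean; // finished with no dead ends or back-tracking along the way
}

// Effectiveness: the share of tasks completed correctly.
// Efficiency: the share of tasks completed correctly first time.
function scoreSession(attempts: TaskAttempt[]) {
  const total = attempts.length;
  if (total === 0) return { effectiveness: 0, efficiency: 0 };
  const effective = attempts.filter((a) => a.completed).length;
  const efficient = attempts.filter((a) => a.completedFirstTime).length;
  return {
    effectiveness: effective / total,
    efficiency: efficient / total,
  };
}
```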

Testing this week has thrown up several key areas which need to change in order for the site to become more effective. I had already changed some areas after day one and noted improvements on the second day. At the same time as the face-to-face user testing, I've also been running some online tests designed to provide further analytical information that will help decide on the final main navigation menu names.